The European Commission wants to turn digital communication apps such as WhatsApp, iMessage, Instagram, TikTok and X into mass surveillance tools, so that the digital communications of all EU citizens, including their live conversations, photos and videos, can be automatically scanned for criminal offences, even those of citizens who are not suspected of any crime. This proposal for a CSAM regulation has been unanimously condemned by hundreds of academics, by privacy regulators, and even by internal legal experts at the Council of the European Union itself, both for its gross violation of privacy rights and because the technology meant to implement it is fundamentally flawed: artificial intelligence cannot accurately detect criminal activity and would falsely report millions of innocent citizens as suspects. Criminals, meanwhile, can easily circumvent the system by deleting these apps from their phones or disappearing onto the dark web. Ineffective as it is, European Commissioner for Home Affairs Ylva Johansson’s proposal would nevertheless mean the end of private digital conversations for everyone and is, for obvious reasons, incompatible with the right to privacy, freedom of communication and the presumption of innocence. On Thursday 14 September, it emerged that there is insufficient support for her proposal in the Council of the European Union.
Worryingly, a day later, on 15 September, the Commissioner launched a paid advertising campaign on X (formerly Twitter) targeting the Netherlands, Sweden, Belgium, Finland, Slovenia, Portugal and the Czech Republic: according to leaked minutes of the 14 September meeting, these are the countries that did not want to vote for the current proposal. The campaign, which has been viewed more than four million times, uses shocking images of young girls alongside sinister-looking men and ominous music, and engages in a form of emotional blackmail by suggesting that opponents of the proposed legislation do not want to protect children. Equally misleading is its claim that a majority of Europeans supports the proposed legislation, based on a survey that highlighted only the benefits, not the drawbacks, of the proposal. On the contrary, surveys by the research firms YouGov and Novus, which did highlight the drawbacks, found virtually no support for the proposal among the European population.
To sway European public opinion, however, the European Commission went even further. X’s Transparency Report shows that the European Commission also used ‘microtargeting’ to ensure that the ads did not appear to people who care about privacy (people interested in Julian Assange) or to eurosceptics (people interested in ‘nexit’, ‘brexit’ and ‘spanexit’, or in Viktor Orbán, Nigel Farage or the German political party AfD). For unclear reasons, people interested in Christianity were also excluded. After excluding critical political and religious groups, X’s algorithm was set to find people in the remaining population who were indeed interested in the ad message, resulting in an uncritical echo chamber. This microtargeting on political and religious beliefs violates X’s advertising policy, the Digital Services Act – which the Commission itself has to oversee – and the General Data Protection Regulation.
If there is insufficient support for proposed legislation, the only proper democratic response is to withdraw it or, as Germany suggested, to amend it so that it does gain sufficient support, in this case by not introducing the unconstitutional “telescreen” that seems to come straight out of Orwell’s Nineteen Eighty-Four. It is not to do what the Commission has now attempted: put pressure on doubting member states by trying to bend the views of their citizens to its will, with tactics eerily reminiscent of the disinformation campaigns around the US elections and Brexit, such as manipulative advertising, misleading statistics and microtargeting based on religion and belief.
By setting aside European values in its quest to push through a controversial piece of legislation, the Commission is not only doing a disservice to the citizens it represents, but also endangering the foundations of the Union itself. Therefore, the European Commission should take the ad campaigns offline and keep them offline, and refrain from future attempts to bend European public opinion to its will with manipulative disinformation campaigns through illegal ads on social media.
Danny Mekić is a jurist and technologist. He is a PhD candidate at eLaw, Center for Law and Digital Technologies, at Leiden University.
This article appeared in De Volkskrant on 13 October 2023.
Update: after this opinion article appeared in de Volkskrant, X/Twitter censored my account on their platform. My account @DannyMekic and my tweets can no longer be found via the X search engine. I have effectively been wiped out by Elon Musk and his team, but I do not know why, how or on whose behalf they did this, or whether it was their own initiative. Although access to my account has since been restored, I will continue to investigate the cause and the actors involved.
Update 2: After my publication, Dr Vera Wilde, a methodologist, wrote an extensive article on the survey results used in the ad campaign. She writes: “The survey is invalid because the instrument was biased. The researchers responsible violated widely accepted professional and ethical standards for conducting survey research, including the relevant code of conduct. And the programme proponents then misrepresented the survey results, as well. This survey and its representation constitute a misinformation campaign. The survey misinformed participants, and the misrepresentation of its results misinforms readers further.”
Links to the relevant X Transparency reports
Update 14/11: X/Twitter seems to have removed the generated transparency reports. Mirror files can be found on GitHub.
- Belgium: https://ton.twitter.com/ads-repository/ads-repository/1711840471926919539.csv
- Czech Republic: https://ton.twitter.com/ads-repository/ads-repository/1711837907810504751.csv
- Finland: https://ton.twitter.com/ads-repository/ads-repository/1711837769683619880.csv
- Netherlands: https://ton.twitter.com/ads-repository/ads-repository/1711840731994665240.csv
- Portugal: https://ton.twitter.com/ads-repository/ads-repository/1711840606450848098.csv
- Sweden: https://ton.twitter.com/ads-repository/ads-repository/1712470444392264139.csv
- Slovenia: https://ton.twitter.com/ads-repository/ads-repository/1712471208632188998.csv
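For readers who want to inspect the mirrored CSV exports themselves, the following is a minimal sketch in Python. Note that the column names used here (`targeting_type`, `targeting_value`, `included`) are hypothetical, since the original files have been taken down and the real schema of X’s ads-repository exports may differ; the inline sample only illustrates the kind of exclusion targeting described above.

```python
import csv
import io

# Hypothetical sample mimicking an X ads transparency CSV.
# The real exports may use different column names and values.
SAMPLE = """ad_id,targeting_type,targeting_value,included
1711840471926919539,keyword,Julian Assange,excluded
1711840471926919539,keyword,brexit,excluded
1711840471926919539,keyword,nexit,excluded
1711840471926919539,location,Belgium,included
"""

def excluded_values(csv_text):
    """Return the targeting values that were marked as exclusions."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["targeting_value"] for row in reader
            if row["included"] == "excluded"]

print(excluded_values(SAMPLE))  # the keywords excluded from the campaign
```

With a real (or mirrored) export, one would read the downloaded file instead of the inline sample and adjust the column names to the actual header row.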
Links to the relevant posts
- https://docs.google.com/document/d/13Aeex72MtFBjKhExRTooVMWN9TC-pbH-5LEaAbMF91Y/mobilebasic and https://edri.org/our-work/open-letter-hundreds-of-scientists-warn-against-eus-proposed-csa-regulation/
- https://netzpolitik.org/2023/juristisches-gutachten-chatkontrolle-ist-grundrechtswidrig-und-wird-scheitern/ and https://www.theguardian.com/world/2023/may/08/eu-lawyers-plan-to-scan-private-messages-child-abuse-may-be-unlawful-chat-controls-regulation
- See, for example, https://ton.twitter.com/ads-repository/ads-repository/1709243560636166341.csv
- https://business.twitter.com/en/help/ads-policies/ads-content-policies/political-content.html and https://business.twitter.com/en/help/ads-policies/campaign-considerations/targeting-of-sensitive-categories.html
- Article 26(3) DSA.