ChatControl: Vested Interests
To grasp how the EU is using noble intentions as a smokescreen for striking at the heart of fundamental freedoms, we need to go back to the origins of the proposed regulation. And it ain't pretty.
Under the pretext of combating child sexual abuse and online exploitation, the European Union is pushing forward a regulation widely known as “ChatControl.” Cumbersome to implement — with deep divisions among EU institutions and, above all, between member states — the proposed CSAM (Child Sexual Abuse Material) regulation poses serious threats to fundamental rights, freedom of expression, and privacy. Worse still, it appears largely ineffective against organised crime.
The regulation can be easily circumvented by moving to private servers — as the draft only targets public digital service providers or platforms operating at scale — or by relying on encrypted messaging services on offshore servers. The EU cannot compel a Russian or Venezuelan server to scan European users' messages.
Ethically, legally, and technologically, the proposal is deeply flawed.
In an open letter published on 10 September, over 600 scientists from 35 countries issued a stark warning. Despite several revisions since its 2022 debut — including amendments from the European Parliament — they denounced a proposal that “would create unprecedented capabilities for surveillance, control, and censorship, and carries an inherent risk of mission creep and abuse by less democratic regimes.”
“Ensuring the current security and confidentiality of digital communications and systems has required decades of concerted effort by researchers, industry, and policymakers. There is no doubt that this proposal fundamentally undermines the security and privacy protections that are essential to safeguarding the digital society,” the letter states.
The threat to privacy, confidentiality, and free expression is very real. The regulation would apply to all forms of communication — public, private, and end-to-end encrypted — whether via messaging apps like WhatsApp, Signal, and Telegram, platforms like YouTube, or social media networks.
Although the stated aim is to protect minors, nothing in the text guarantees the mechanism won't later be repurposed to monitor other types of so-called “illegal” content. What qualifies as “illegal”? The Digital Services Act (DSA), on whose framework the CSAM regulation builds, gives a clue: hate speech, incitement to violence, terrorism — but also “threats to public security,” a vague category that could easily encompass “misinformation”, “disinformation”, or political dissent.
Even if the project overcomes opposition from smartphone manufacturers and app developers — many of whom reject the idea of installing software to scan all communications — nothing prevents authorities from expanding the scope later. Once the technological tools are in place, the risk of mission creep becomes real: censorship, manipulation, even political repression.
And there is precedent. The 2006 Data Retention Directive, originally introduced to fight terrorism, was used by some member states to pursue lesser crimes such as fraud — before it was struck down in 2014.
So how did we get here? The regulation originated with the European Commission and is now being examined by both the European Parliament and the Council of the EU. It was championed by then-Home Affairs Commissioner Ylva Johansson, amid lobbying efforts and blatant conflicts of interest.
The Commission worked closely with Europol, the European law enforcement agency. Meeting minutes obtained by the Balkan Investigative Reporting Network reveal that Europol pushed for “unfiltered access” to data obtained through scanning, aiming to train AI algorithms for use in broader criminal investigations. These revelations triggered an inquiry by the European Ombudsman in 2024 [1], among other ongoing investigations tied to the CSAM proposal.
But the links and potential conflicts of interest don’t end there. Johansson also maintained close ties with NGOs such as Thorn and the WeProtect Global Alliance, as well as corporations like Microsoft. Many of these organisations are not just silent on expanded surveillance powers — they actively advocate for them. Their influence is a growing concern: members of Johansson’s cabinet, including Antonio Labrador Jimenez, sit on WeProtect’s board. Both Thorn and WeProtect have received direct or indirect funding from the Commission and enjoyed “unusual” access to decision-making meetings.
What decisions were made? At what level? And with what degree of independent expertise? “These detection technologies remain unproven,” warn the scientists. “Despite serious doubts about their effectiveness [2], there has been no public discussion, analysis, or evaluation to justify the Commission’s approach [3]. This lack of transparency blocks informed, democratic debate and endangers the digital security of society in Europe and beyond.”
With just one month left before a crucial vote — member states must take a position by 14 October — the political tide may be shifting. France has led the charge, backing measures that critics say amount to mass surveillance. But Germany, Luxembourg, and Slovakia have recently joined the growing bloc of opponents, bringing the total to six countries.
For the regulation to advance, the Council of the EU must reach a qualified majority: support from at least 15 member states representing at least 65% of the EU population. To date, 15 countries support the proposal, including France, Spain, and Italy. Six have come out against it, while another six remain undecided.
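To make the vote-counting rule concrete, here is a minimal sketch of that double-majority test in Python; the function name and the population shares are illustrative placeholders, not actual figures for any member state.

```python
# Minimal sketch of the Council's qualified-majority ("double majority") test:
# at least 15 of the 27 member states, together representing at least 65%
# of the EU population. All names and shares below are placeholders.

def meets_qualified_majority(supporters: dict[str, float]) -> bool:
    """supporters maps each supporting member state to its share of the EU population (0.0-1.0)."""
    enough_states = len(supporters) >= 15                  # 55% of 27 states, rounded up
    enough_population = sum(supporters.values()) >= 0.65   # 65% of the EU population
    return enough_states and enough_population

# Illustrative coalition: 15 states covering only 60% of the population
# clears the state threshold but fails the population threshold.
coalition = {f"state_{i}": 0.04 for i in range(15)}        # 15 x 0.04 = 0.60
print(meets_qualified_majority(coalition))                 # False
```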
[1] These investigations led to findings of maladministration, with recommendations aimed at improving transparency and conflict-of-interest procedures. While the European Ombudsman — Emily O'Reilly until February 2025, succeeded by Teresa Anjinho — has no binding powers, her recommendations can prompt institutional reforms.
"The rate of false positives is significant. In Ireland, only 20% of the reports received in 2020 by the National Center for Missing & Exploited Children were actually related to child abuse. In Germany, 34% of reports concerning child sexual abuse material were false positives, according to the Bundeskriminalamt, the Federal Criminal Police Office.
[3] The Commission denies any bias, claiming the CSAM proposal is “technologically neutral” and focused solely on child protection.