THE EUROPEAN UNION WANTS TO IMPOSE THE “CHAT CONTROL” PLAN.

SAY NO TO: The surveillance of your private messages
Online petition

What is “Chat Control”?

Introduced in 2022, this EU draft regulation could, as early as 2025, impose an unprecedented surveillance system on 450 million Europeans. Framed as a fight against child sexual abuse material, it would require each member state to deploy tools able to search our private communications, ushering in an era of permanent control.

Basis: proposal COM(2022) 209.

The method: total surveillance

Apps and device makers would be forced to embed systems that scan in real time all our messages, photos and videos. No more distinction between suspects and ordinary citizens: everyone would fall under the microscope of automated control.

Mass, mandatory detection.

“Client-side scanning”: built-in spyware

This technique spies on your communications even before they are sent, including on WhatsApp, Signal, Telegram or Proton Mail. An algorithm compares your content against global databases: a single false positive could be enough to flag you to the authorities. It’s the end of true end-to-end encryption.

Analysis before encryption / sending.

An assault on your privacy

With “Chat Control”, no message is truly private. It opens the door to self-censorship, algorithmic errors and breaches of professional secrecy. Doctors, lawyers, journalists—no one will be able to guarantee confidentiality.

Privacy & freedoms at risk.

A jackpot for tech giants

Microsoft, Thorn and others are already eyeing this colossal market for automated surveillance. Europeans’ data could end up hosted on U.S. clouds, undermining our digital sovereignty in favor of powerful lobbies.

Private technologies & dependency.

A dangerous precedent

Imposing surveillance of all conversations creates a universal censorship tool. Today in the name of protecting children, tomorrow to track any opinion deemed inconvenient. Safeguards? Absent or fragile.

Towards limitless surveillance.

“Chat Control”: Questions & Answers

What exactly is “Chat Control”?
Definition

“Chat Control” is a draft EU regulation (proposed by the Commission on 11 May 2022) that would set a legal framework to detect, report and remove online child sexual abuse material (CSAM). In its most controversial form, it would require messaging apps and online services to automatically scan what users send — texts, images, videos — including conversations meant to remain private and end-to-end encrypted. Technically, it shifts surveillance to the source: on your device itself, before encryption protects your messages (“client-side scanning”, or CSS). Data-protection authorities and the Council’s Legal Service have flagged this approach as problematic because it amounts to generalized searches without prior individualized suspicion.

How does client-side scanning (CSS) work — and why is it risky?
Technical

In practice, the messaging app embeds a module that computes a “fingerprint” (perceptual hash) for each image or analyzes each message before sending, then compares the result against reference databases (CSAM hash catalogs, pattern lists, ML models). If the algorithm “believes” it matches prohibited content, it creates a report and triggers an alert chain to a designated center or the authorities. The issue is both principled and practical: in principle, you’re scanned constantly without a connection to an investigation; in practice, these systems are neither perfect nor neutral. Research shows structural vulnerabilities: attackers can mass-produce false positives, poison hash sets, or hide a “secondary objective” (e.g., targeted face recognition) inside a hashing scheme ostensibly limited to CSAM. Errors do happen and have human consequences (account closures, unwarranted investigations). We saw this as early as 2022: Google reported fathers who had taken medical photos of their children at a doctor’s request; accounts were closed and suspicion lingered despite cases being dismissed.
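The hash-and-compare pipeline described above can be sketched in a few lines. This is a deliberately toy example using a simple “average hash” over an 8×8 grayscale thumbnail; real systems rely on proprietary perceptual hashes (such as PhotoDNA), but the structure is the same: fingerprint the content, compare it against a reference set, and flag on a distance threshold — and that threshold is exactly where the false-positive trade-off lives.

```python
# Toy illustration of hash-based matching (NOT any real CSAM system):
# fingerprint an image, compare it against a reference set, and flag it
# when the Hamming distance falls under a tunable threshold.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit int:
    each bit records whether that pixel is above the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(candidate, references, threshold=10):
    """Flag the candidate if it is 'close enough' to any reference hash.
    Raising the threshold catches more altered copies -- and more innocents."""
    return any(hamming(candidate, ref) <= threshold for ref in references)

# A slightly brightened copy of an image still matches its original:
original = [[(r * 31 + c * 7) % 256 for c in range(8)] for r in range(8)]
brightened = [[min(p + 10, 255) for p in row] for row in original]
print(matches(average_hash(brightened), [average_hash(original)]))  # prints: True
```

The same robustness that lets the system recognize a resized or recompressed copy is what allows visually unrelated images to collide near the threshold — the structural weakness researchers exploit to mass-produce false positives.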

What are the threats to fundamental rights?
Rights & freedoms

The core problem concerns the secrecy of correspondence and data protection. The EU Charter of Fundamental Rights (Arts. 7 and 8) and the ECHR (Art. 8) protect privacy and the integrity of communications. The CJEU has repeatedly struck down generalized, indiscriminate surveillance mechanisms — even for legitimate aims — because they are not targeted or proportionate (Digital Rights Ireland, Tele2 Sverige). In short: you cannot screen an entire population “just in case”. The EDPB and EDPS have explicitly warned that the proposal, as drafted, would effectively sweep up almost all communications, chilling free expression (journalists, NGOs, protected professions).

Is it compatible with EU law?
EU legal framework

Several strong signals suggest it is not, if it mandates indiscriminate detection. The Council’s Legal Service has underlined the impact on fundamental rights of scanning inter-personal content without targeting. Parliament (via the LIBE committee) voted in November 2023 to reject mass scanning and explicitly protect end-to-end encryption. That doesn’t mean scanning is impossible, but it must be strictly targeted, proportionate, and subject to robust judicial oversight. Any solution that circumvents encryption by systematic client-side scanning collides with those principles.

What is a European regulation?
Institutions

A regulation is an EU legislative act (Art. 288 TFEU) that is general in scope, binding in all its elements, and directly applicable in every Member State. Unlike a directive, it does not need national transposition to take effect: it applies as-is, at the same time and in the same way everywhere. The EU uses it when it wants a uniform rule for the whole market and all citizens.

Who votes on a European regulation?
Procedure

The Commission proposes the text. It is then co-decided by the European Parliament (elected MEPs) and the Council of the EU (ministers of the Member States) under the ordinary legislative procedure. In practice this means successive readings, amendments and, often, “trilogues” between Commission–Council–Parliament. If Parliament and Council don’t adopt the same text, it doesn’t pass. The final outcome is therefore a political compromise between governments and elected representatives.

Where does the legislative process stand today?
Timeline

After the 2022 proposal, the file hit several political roadblocks in 2024. Since July 2025, under the Danish Council presidency, it has picked up speed again: specialist media report a political aim to wrap up a deal in the autumn, with a date mentioned for a vote around 14 October 2025 and positions hardening in September. Nothing is settled: Parliament’s position protects E2EE, and in the Council the balance shifts with national coalitions; some capitals have already formed “blocking minorities” in the past. The calendar is accelerating, and technical trade-offs (opt-in scanning? limit to known images? exclude grooming?) remain contentious.

EU countries’ positions (summer 2025)
Member States

The balance of power in the Council has shifted since July 2025. Under the Danish presidency, a new text — largely inspired by the 2024 Belgian and Hungarian compromises — puts back on the table mandatory scanning (including before encryption) and a risk categorization by service. An internal read-out of the “law enforcement” working party meeting of 11 July 2025, leaked by netzpolitik.org, sums it up: a majority of countries “can live” with the Danish proposal, a blocking minority remains, and a few key undecideds (Germany, France, Belgium, Finland, Luxembourg, Estonia, Czechia, etc.) carry a lot of weight.

Visually, the July 2025 map published by MEP Patrick Breyer shows at a glance which governments are “supportive”, “opposed/neutral”, or “undecided”. It’s an activist snapshot but useful to locate your country.

Countries explicitly “supportive” on 11 July (or positive about the Danish text): Italy, Spain, Hungary, Latvia, Lithuania, Cyprus, Croatia, Sweden (government support, parliamentary green light required), Denmark (author of the text). France said it was ready to “support in principle” the text, notably the renewal of the transitional regime, which currently places it among the supporters. Portugal is “very positive” but keeps reservations about the encryption part.
“Opposed” countries (or formally critical on key points): Poland (rejects mandatory scanning and the inclusion of E2EE, flags cybersecurity risks and the invalidity of forced “consent”), Austria (bound by a firm parliamentary stance against CSS and against weakening encryption), Netherlands (strong concerns over “detection orders” and CSS). Slovenia and Luxembourg express serious doubts about proportionality and client-side scanning.
“Undecided / under review” (or internally split): Germany (new coalition still arbitrating, historically opposed to E2EE scanning), Belgium (“difficult” domestic politics on encryption after its 2024 compromise), Estonia (internal split between security and data protection on CSS), Finland (finds the text “rather problematic”, under review), Czechia (election season, position pending), Ireland (welcomes cybersecurity safeguards but remains cautious), Slovakia (positive on the ambition but open to counter-arguments), Romania (not cited in that specific read-out). These nuances matter: the Council Legal Service still considers CSS contrary to fundamental rights, which weighs on the fence-sitters.

Bottom line: adoption hinges on a handful of capitals (Paris, Berlin, Brussels, Helsinki, Luxembourg, Prague, Dublin). And even among the “supportive”, several national parliaments could impose safeguards. For a broad overview, see the July 2025 map (indicative).

Positions of Big Tech and encrypted messengers
Stakeholders

Apple. In 2022 it dropped its CSAM detection plan for iCloud Photos (on-device scanning), a move welcomed by encryption advocates. Since then Apple has emphasized strengthening E2EE (Advanced Data Protection). This remains incompatible with generalized CSS; the company focuses on systemic security rather than AI scanning on devices.

Meta / WhatsApp. Consistent public stance: no backdoors and no imposed scanning that weakens E2EE. Will Cathcart has reiterated the app won’t “break” encryption — similar messages during UK/EU debates. Tech media note that client-side scanning undermines WhatsApp’s confidentiality claims.

Google. Defends E2EE (notably on Messages/RCS) and, more broadly, server-side hashing where lawful and proportionate, without damaging universal E2EE. European analyses recall the limits of technologies like PhotoDNA (useful for known content, not “unknown” nor grooming), which underpins arguments against generalized CSS.

Microsoft. Long-time promoter of PhotoDNA (detection via hashes of already-known content, not CSS). In the EU it argues for proportionate rules and provenance/traceability tools, not real-time scanning of all encrypted chats. PhotoDNA’s scope is not equivalent to CSS for “unknowns”.

Signal. Hard line: would rather exit a market than weaken E2EE. Meredith Whittaker has said Signal will not integrate mandatory CSS or any “exception” to E2EE.

Threema. Firmly opposes the “Chat Control” proposal, warning of system-wide insecurity introduced by any CSS.

Element / Matrix. Warns against a return of mass scanning and the damage to the open-source ecosystem (self-hosted servers, federation), where imposing client/server scanners would be unrealistic and dangerous.

Proton (Mail/Drive/Pass/VPN). Welcomes Parliament’s 2023 stance excluding E2EE from detection and deems CSS technically and legally untenable.

Telegram. No clear EU-level position on CSS; its encryption isn’t enabled by default for all chats, making it a hybrid case in the debate.

Overall trend: services that are truly E2EE reject mandatory CSS. Among “Big Tech”, the public line is to protect E2EE and avoid a generalized client-side scanning mechanism.

What concrete consequences for you, tomorrow?
Impacts

If generalized client-side scanning were introduced, every device would become a permanent inspection post. Your messages and photos would be sifted before encryption, and errors could trigger reports — account closures, blocking of essential services (mail, cloud), even police investigations. Documented cases show how such errors can wreck a digital life within hours. Beyond that, weakening the E2EE ecosystem opens attack surfaces: more code, more access points, more risk. For protected professions (lawyers, doctors, journalists), it directly undermines professional secrecy; for victims of violence or activists, it removes safe spaces. At industry scale, trust in European digital services would suffer if the EU became the first democratic region to impose such mass scanning.

Can France “ban” Chat Control if the EU adopts it?
EU law

No, not unilaterally. If adopted, a regulation overrides any conflicting national rule (primacy of EU law) and applies directly. A State cannot carve out a national exception without facing infringement proceedings by the Commission and, ultimately, a CJEU ruling. Before adoption, however, France can weigh in at the Council, build coalitions to amend or block; after adoption, it can challenge before the CJEU (annulment action), or, if the text leaves room, implement maximum safeguards within the permitted flexibility (strong judicial oversight, public audits, exclusion of CSS, etc.). But a frontal national ban on an obligation laid down by a regulation would run counter to EU law.

Is professional secrecy protected?
Secrecy & ePrivacy

Yes. Under European law the secrecy of communications and the confidentiality of attorney-client exchanges are at the core of the right to privacy (Charter, ECHR Art. 8). The ECtHR affords “enhanced” protection to legal professional privilege: without confidentiality, the right to defense and investigative journalism are illusory. Likewise, ePrivacy requires confidentiality of electronic communications. Client-side scanning inserts a mechanism that inspects messages before encryption, mechanically weakening that confidentiality — including for protected professions. The EDPB/EDPS have warned that imposing detection tech on messengers may lead to near-generalized sweeping of communications with worrying error rates. In other words, even with on-paper “exemptions”, a generalized on-device scanner is hard to reconcile with professional secrecy as protected in Europe.

“I’ve got nothing to hide” — why oppose it?
Arguments

Because a fundamental right doesn’t depend on what you have to hide, but on the power the State or companies can wield over your life. Pre-emptive surveillance of everyone dilutes the presumption of innocence, chills speech, and multiplies errors. Detection technologies also “find” innocents: false positives trigger reports, account closures, investigations — with very real harm. Even for policing effectiveness, flooding services with dubious alerts diverts resources from genuine cases. European data-protection authorities note that error rates for detecting “new” content remain concerning. And confidentiality also protects others: victims of abuse, activists, medical and legal professionals, whistleblowers, journalists. Weaken one, weaken all.

“Protecting children”: a legitimate goal — but is this effective?
Effectiveness

Everyone shares the goal of protecting children. The question is real-world effectiveness and societal cost. Technical analyses indicate that CSS dramatically widens the net, generates heavy false positives, and can be bypassed by determined offenders (closed networks, tools outside the EU, adversarial ML). Each false positive consumes investigative time and delays genuine cases. More targeted approaches exist and work: more human and judicial resources, targeted infiltration and surveillance, focused international cooperation, stronger action at the source, education, and prioritizing truly actionable reports. In short, favor effective, proportionate tools over indiscriminate population-wide scanning.
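The false-positive arithmetic behind that argument is worth making explicit. A quick Bayes’-rule sketch — the prevalence and error rates below are illustrative assumptions, not official figures — shows how the rarity of the targeted content turns even an accurate detector into a flood of false alarms:

```python
# Base-rate sketch with illustrative numbers (assumptions, not official
# figures): what fraction of automated reports would actually be genuine?

def precision(prevalence, tpr, fpr):
    """Share of flagged items that are true positives (Bayes' rule).
    prevalence: fraction of scanned items that are illicit
    tpr: true-positive rate (detector sensitivity)
    fpr: false-positive rate on innocent items"""
    true_alarms = prevalence * tpr
    false_alarms = (1 - prevalence) * fpr
    return true_alarms / (true_alarms + false_alarms)

# Assume 1 message in a million is illicit, the detector catches 99% of
# them, and misfires on just 0.1% of innocent messages:
p = precision(prevalence=1e-6, tpr=0.99, fpr=1e-3)
print(f"{p:.4%} of reports would be genuine")  # prints: 0.0989% of reports would be genuine
```

Under these assumed numbers, roughly 999 of every 1,000 reports would concern innocent users; at the scale of billions of daily messages, even a far lower false-positive rate still produces alert volumes that swamp investigators.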

What’s the link with the “ProtectEU” strategy and access to encrypted data?
Context

In June 2025, the Commission published a “roadmap” to ensure “lawful and effective access to data” for law enforcement. It includes a technology roadmap on encryption from 2026 and funding for decryption capabilities by 2030 (e.g., Europol). This strategic backdrop keeps political pressure on end-to-end encryption and helps explain the push to pass a text like CSAR in the short term. It’s crucial to remember strong encryption also protects hospitals, businesses, administrations and citizens against crime and espionage. Weakening it in the name of “lawful access” creates holes usable by malicious actors too.

Who controls these algorithms, and what safeguards exist?
Governance

At this stage, development and operation would largely rely on private actors: big firms (Apple, Meta, Microsoft, Google) and specialist companies (e.g., PhotoDNA/MS). Three major concerns follow. First, transparency: models, thresholds and training data are trade secrets, hence hard to audit. Second, dependency: the EU would outsource a sovereign function (detection) to mostly extra-EU providers, with unilateral updates and imported biases. Third, function creep: once the infrastructure exists, expanding the scope (terrorism, “disinformation”, fraud…) is tempting unless strict, enforceable safeguards are in place. Apple’s 2021 episode illustrates the risks: it announced an on-device CSAM scan, then backed down after backlash from cryptographers and regulators, precisely due to potential drift and the difficulty of guaranteeing a tightly bounded use.

Why do some States oppose (or stall) it?
National positions

Governments and data-protection authorities — notably in Germany, the Netherlands and elsewhere — have raised fundamental objections: incompatibility with fundamental rights, technical insecurity, operational ineffectiveness (too many alerts and false positives). Germany’s Justice Minister, for instance, has said “chat control” has no place in a rule-of-law state, and the Dutch parliament publicly slowed earlier versions. The landscape is fluid, though: some countries that stalled in 2024 now support compromises re-introducing client-side scanning. Hence the importance of tracking your government’s stance in real time.

Positions of French MEPs
French MEPs

Parliament’s reference position remains the one adopted in November 2023 (LIBE + negotiation mandate): no mass surveillance, exclusion of E2EE from detection, targeted measures based on reasonable suspicion. This line was supported by a broad majority across groups (Greens/EFA, The Left, S&D and much of Renew). It contradicts generalized CSS.

After the 2024 elections, the 2024–2029 legislature renewed the French delegations (RN/Patriots for Europe, LR/EPP, Renaissance–MoDem–Horizons/Renew, PS–Place Publique/S&D, LFI/The Left, Ecologists/Greens). To date, most public positions follow group lines:
Greens/EFA and The Left (LFI): outright opposition to CSS and any weakening of E2EE; they backed the 2023 LIBE line.
S&D (PS/Place Publique) and Renew (Renaissance/Modem/Horizons): broadly aligned with LIBE 2023 (protect E2EE, targeted approach), though some MEPs may be “open” to very narrow measures on known content under judicial control.
EPP (Les Républicains): split between “security” advocates (in favor of detection obligations) and “rule-of-law/technical” concerns (proportionality and encryption security).
Patriots for Europe / ID lineage (RN, allies): heterogeneous depending on security/privacy files; no single French line in favor of mandatory CSS at time of writing.
ECR: generally supportive of stronger investigative tools, but not at the price of a generalized E2EE backdoor (internal nuances).

How can I raise awareness around me?
Take action

Start by explaining the mechanism without jargon: “They want to install software on our phones that reads our messages before they’re encrypted, to compare what we send against image databases and automated models. If the machine is wrong, it can report us by mistake.” Then the issue: “This isn’t a debate for or against protecting children; it’s about the method: scanning everyone all the time is disproportionate and ineffective. Offenders adapt; meanwhile, everyone’s security and privacy are weakened.” Finally the solution: “Fund people and courts, target investigations, cooperate across borders, take content down at the source, educate and prevent — without breaking encryption.” For a 30-second pitch:
“‘Chat Control’ is a scanner on our phones that reads messages before sending. It can be wrong and flag innocents. It breaks the secrecy of exchanges (doctors, lawyers, journalists) and creates holes for attackers. Let’s protect children with effective, targeted means — not by surveilling 450 million Europeans.”
In longer conversations, use concrete analogies: “Would you accept an agent opening all your letters ‘just in case’?” Remind people that, in Europe, confidentiality of communications and professional secrecy are protected rights — and that data-protection authorities have warned about errors and drift. Offer a clear action: sign the petition, write to your MEPs, discuss it at work and at school, and favor messengers that publicly oppose mandatory CSS.

Which countries oppose the project?

The European Union draft regulation “Chat Control”, initiated by Swedish European Commissioner Ylva Johansson and actively supported by the Danish Presidency of the Council of the European Union, is advancing rapidly.

France, Spain and Italy are among the countries that strongly support this text, while other Member States remain undecided or openly oppose it.

This proposal would establish massive and permanent surveillance of private communications across Europe.

It is urgent to alert our governments and Members of the European Parliament, and to make it clear that we reject such a scheme, which threatens our fundamental freedoms.

  • Opposed/neutral
  • Undecided
  • In favor

#stopchatcontrol Join the campaign


#stopchatcontrol New episode - The fight continues 🚨
A new Danish compromise on #ChatControl has just been approved. Its Article 4 surreptitiously reintroduces generalized scanning of communications, even encrypted ones. It also imposes an obligation to identify yourself in order to open a…

[ 🇪🇺 EUROPE ]

🔸️The European Union has dropped the most controversial measure of its draft law against child sexual abuse, dubbed “Chat Control”.

This provision would have scanned private conversations, including on encrypted messengers such as WhatsApp…


Latest articles on Chat Control

EU: opposition to Chat Control grows, support remains strong

clubic.com

As a Council decision approaches, the “Chat Control” proposal continues to divide. It would scan conversations, including encrypted ones, to combat child sexual abuse material. Several states remain in favor, but opposition is widening. Messaging services and cryptography experts warn of risks to privacy and security.

These are the EU countries abandoning privacy to scan your encrypted messages

clubic.com

Several governments want to impose automatic scanning of all messages, even encrypted ones. France, Germany, Spain and Poland are among the supporters. Critics denounce a disproportionate system with false positives. The debate pits child protection against the secrecy of correspondence.

Signal isn’t leaving Europe… for now: what you need to know about Chat Control

frandroid.com

Signal confirms it will remain in Europe as long as the regulation is not adopted, while stressing that “client-side scanning” would be incompatible with end-to-end encryption. The article takes stock of the text’s status, the positions of Member States, and the possible consequences for WhatsApp, Signal or Telegram.

EU: Chat Control — does Europe want to scan all your private messages?

korben.info

The Danish presidency puts the project back at the center of the EU agenda. The idea: install a “sniffer” on the device to analyze each message before encryption. The author denounces a weakening of encryption and potential mass surveillance. NGOs, cryptographers and messaging vendors are speaking out.

Encrypted messages under control? The EU could soon force the scanning of private conversations

cryptoast.fr

Under the Danish presidency, ministers are to decide on possible “client-side scanning”. The floated compromise provides for on-device analysis before encryption. Council lawyers and rights advocates see it as disproportionate. The impacts on communications security are deemed major.

Fighting child abuse: the dangerous plan to surveil ALL private communications the EU intends to deploy

atlantico.fr

A critical analysis of a scheme that would subject private messaging to systematic filtering. Behind the stated goal lie risks to freedom of expression and the confidentiality of communications. End-to-end encryption would be weakened, setting a worrying precedent for fundamental rights. The debate remains intense within the Union.

Chat Control explained in video

Chat Control, European Union

Share the petition on your social media networks!
