Chat Control: What it is, where it is, and how it can affect encryption

Last update: 14th October 2025
  • There are two distinct frameworks: 1.0 (voluntary and already in place) and 2.0 (a proposal that would mandate detection and has sparked debate over encryption).
  • Chat Control 2.0 has not been approved: the Council must first adopt a position, after which trilogues and a final vote will follow.
  • Pre-encryption scanning poses technical and legal risks; legal experts warn of incompatibilities with fundamental rights.


The idea that someone could read your private messages isn't a pleasant one, especially when we're talking about conversations protected with end-to-end encryption. Yet, European institutions are debating a regulatory package that could force messaging and email services to scan content for child sexual abuse material. This umbrella of measures is popularly known as Chat Control, and actually encompasses two distinct frameworks with very different trajectories and effects.

To understand the mess, it helps to separate the present from the future. On one hand, there's Chat Control 1.0, already in place and voluntary for platforms; on the other, the Chat Control 2.0 proposal, which could make scanning mandatory through detection orders. The second text is still working its way through the legislative process and could still change, but noise erupted on social media after viral videos claimed it had been approved or was imminent, creating confusion about what has actually been decided.

What exactly is Chat Control and why are there two versions?

The Chat Control moniker refers to two European initiatives aimed at combating the dissemination of child sexual abuse material on digital services. The first, Regulation 2021/1232 (dubbed Chat Control 1.0), passed in 2021, allows messaging and email providers to voluntarily detect such content, without requiring them to do so or giving them any legal basis beyond a temporary derogation from certain ePrivacy restrictions.

The second is proposal 2022/0155, known as Chat Control 2.0 or the Regulation on preventing and combating child sexual abuse. It would impose detection and removal duties on platforms, with detection orders issued by competent authorities where there is a significant risk of misuse. This is the crux of the debate: how such an obligation can be reconciled with end-to-end encryption and the fundamental rights of European citizens.


Chat Control 1.0 vs. Chat Control 2.0: Key Differences

Chat Control 1.0 allows providers to review communications for child sexual abuse material, but on a voluntary basis and with safeguards. Article 3 requires the least intrusive technologies possible and techniques that do not reconstruct the substance of conversations, but rather look for patterns or signals associated with abuse. In April 2024, the European Parliament approved extending this derogation until April 3, 2026, to allow time for a stable long-term framework to be agreed.

The Chat Control 2.0 proposal takes a more ambitious turn: in certain cases, and following an order from an independent judicial or administrative authority, platforms would have to detect, report, and remove CSAM. According to the technical interpretation shared by industry associations, this could involve analyzing content before it is encrypted on the user's device, a technique known as client-side scanning. This conflicts with the core promise of end-to-end encryption.
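
To make the architecture concrete, the following sketch shows where client-side scanning would sit relative to encryption. It is purely illustrative: the function names are invented, real systems would use perceptual fingerprints rather than exact SHA-256 digests, and key handling is deliberately elided.

```python
import hashlib
from typing import Callable, Optional

# Hypothetical database of fingerprints of known illegal material.
# Real deployments would use perceptual hashes, not cryptographic digests;
# SHA-256 appears here only to keep the sketch self-contained.
KNOWN_FINGERPRINTS: set[str] = {"fingerprint-1", "fingerprint-2"}

def scan_before_encrypt(attachment: bytes) -> Optional[str]:
    """Runs on the user's device, on plaintext, before any encryption."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest if digest in KNOWN_FINGERPRINTS else None

def send_message(attachment: bytes,
                 encrypt: Callable[[bytes], bytes],
                 report: Callable[[str], None]) -> bytes:
    match = scan_before_encrypt(attachment)  # 1. local scan (the contested step)
    if match is not None:
        report(match)                        # 2. the flag leaves the device
    return encrypt(attachment)               # 3. end-to-end encryption as usual
```

The criticism is visible in the control flow itself: the scan runs on plaintext, so the confidentiality guarantee of step 3 no longer covers everything the device does with the message.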

There's an important caveat: both initiatives focus detection on providers, not on European institutions monitoring conversations. According to the industry, no EU institution would have access to private messages under these frameworks; when a platform detects signs of CSAM, it must notify law enforcement and judicial authorities or entities with a public interest in child protection.

Status of the procedure: what has been decided and what has not

The Chat Control 2.0 proposal is not in force. As of early October 2025, the text awaits the completion of its first reading in the Council; Parliament has already adopted a position that seeks to balance protecting children with avoiding generalized surveillance. The Council's position would be followed by trilogues and a final vote in Parliament's plenary session before official publication, if the text is approved.

In this context, messages claiming the Council would activate the rule on October 14, 2025 have been widely amplified. The Danish presidency has prioritized the file, and a Justice and Home Affairs Council was scheduled for October 13 and 14, but this step does not by itself trigger the application of the rule. European sources have clarified that it will not "enter into force" on that date and that the timeline for adoption and entry into force remains unpredictable.


Tallies of support and rejection among member states have also gone viral. Spain, Portugal, France, Italy, Croatia, Hungary, Bulgaria, Cyprus, Malta, Ireland, Denmark, Sweden, Latvia, and Lithuania have shown a willingness to move forward, while Austria, Belgium, the Czech Republic, Finland, Germany, Luxembourg, the Netherlands, Poland, and Slovakia have been highly critical of any formula that weakens encryption. Some countries remain undecided on certain votes (for example, Greece, Romania, Estonia, and Slovenia), and the German position is considered particularly decisive.

How detection would work: orders, scope and limits

The backbone of the proposal is the detection order system. An order could only be imposed on a service when there is a significant risk of its misuse for child sexual abuse material or grooming. It would be issued by a competent judicial authority or an independent administrative authority, with a limited duration and proportionate oversight.

To minimize impacts, the Commission has stressed the need for state-of-the-art technologies that are incapable of extracting any information beyond what is strictly necessary, with dedicated human oversight to filter false positives. Furthermore, the draft treats the detection of grooming as the most intrusive measure, because it requires analyzing text, so imposing it would need particularly strong justification. Under some states' proposals, it would apply only when one of the participants is a minor.

A particularly sensitive issue is the handling of services with end-to-end encryption. Operationally, pre-encryption detection would be performed on the user's device, comparing images, videos, or, in earlier versions, text against fingerprint databases or detection models. Many experts equate this local inspection with creating a backdoor, as it opens an additional channel through which content that should be encrypted can be analyzed and flagged before it leaves the device.
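
To illustrate the fingerprint-matching idea, here is a minimal average-hash sketch in Python, assuming the Pillow imaging library is installed. Production systems rely on far more robust perceptual hashes and curated databases; the hash function and threshold below are purely didactic.

```python
from PIL import Image  # assumes Pillow: pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to 8x8 grayscale, threshold on the mean.
    Visually similar images yield similar bit patterns, unlike SHA-256."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits on which two hashes differ."""
    return bin(a ^ b).count("1")

def matches_database(candidate: int, database: set[int], threshold: int = 5) -> bool:
    """Flag the image if it is 'close enough' to any known fingerprint."""
    return any(hamming(candidate, known) <= threshold for known in database)
```

The `threshold` parameter is exactly where the policy debate bites: loosen it and the system catches re-encoded copies of known material but also flags innocent look-alikes; tighten it and trivial edits slip through undetected.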

Who reviews, who reports and what role does the new European center play?

One of the most common misconceptions on social media is that "the European Union will read all messages." This isn't the case. In both frameworks, it is the providers who perform the detection with technical tools. If they identify illegal content, they must refer it to the competent authorities or accredited organizations for processing.

The Commission proposal also provides for the creation of an EU Centre on Child Sexual Abuse. Its purpose is to centralize the reception of reports, improve cooperation between platforms, civil society, and authorities, and consolidate transparency and auditing of the tools and processes used to find and remove content. The aim is to avoid duplication, improve response times, and reduce the risk of leaks when exchanging sensitive information.

From an industry perspective, if Chat Control 2.0 were adopted, alerts could go not only to police and prosecutors but also to the Coordinating Authorities designated by Member States to oversee implementation, although that detail remains open in the negotiations. What the sector does emphasize is that the EU would never access private chats on its own.

Privacy, encryption, and technical risks: the most serious criticisms

Digital rights organizations, security researchers, and legal scholars have criticized the core of the proposal. They argue that client-side scanning undermines the essence of end-to-end encryption and raises the risk of data breaches: if a mechanism to examine content before encryption is built in, it can be exploited by malicious actors, from cybercriminals to hostile governments.

An assessment commissioned by the European Parliament was particularly forceful: there is currently no technology capable of detecting known and new CSAM at scale without generating high error rates that would affect all traffic on a platform. The study also points to collateral effects, such as the risk of classifying images shared consensually by adolescents as illegal.

The empirical evidence does not inspire optimism. In Ireland, for example, only 852 of 4,192 reports (20.3%) turned out to be exploitative material, while 471 (11.2%) were false positives, figures that illustrate the difficult balance between accuracy, reporting volume, and human verification resources.
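
As a quick sanity check on those figures, a few lines of Python reproduce the percentages (the counts come from the Irish example above; the rest is plain arithmetic):

```python
reports = 4192          # total reports in the Irish example
confirmed = 852         # confirmed exploitative material
false_positives = 471   # reports flagged in error

print(f"confirmed rate:      {confirmed / reports:.1%}")        # 20.3%
print(f"false-positive rate: {false_positives / reports:.1%}")  # 11.2%
# Roughly one report in nine was a false alarm, and each one still
# consumed human review time before it could be discarded.
```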

On the legal front, the Council's Legal Service questioned the proposal's compatibility with fundamental rights such as privacy and data protection, recalling the EU Court of Justice's case law against generalized data retention. The European Data Protection Supervisor and the European Data Protection Board warned in a joint opinion of the risk of drifting toward de facto generalized and indiscriminate scanning of the content of virtually all electronic communications.


Court rulings also weigh heavily. In February 2024, the European Court of Human Rights held, in an unrelated case, that a requirement to weaken end-to-end encryption cannot be considered necessary in a democratic society. This precedent strengthens the hand of those calling for the final rule to explicitly protect encryption.

Who supports it, who opposes it, and how the politics is moving

The cause of protecting children has strong supporters, from child protection advocacy groups to Commission departments and MEPs of different political stripes, who see a mandatory framework as the answer to the inadequacy of voluntary action. They point to gaps caused by the variability of corporate policies, dependence on reports from the United States, and the lack of an agile European channel for reporting and investigation.

The opposition, meanwhile, draws a red line around encryption. In the European Parliament, Greens, social liberals, conservatives, and pirate parties have raised their voices in a curious cross-party convergence. Their recurring argument is that protecting children does not require mass surveillance and that less intrusive tools exist: strengthening police resources, improving cross-border cooperation, pursuing illicit markets, demanding greater security by design from platforms, and investing in education and prevention.

The map in the Council is divided. Spain is in the bloc pushing the file forward and has argued that authorities should be able to access data in the fight against abuse, while Poland, the Netherlands, Finland, and Austria remain critical. France has wavered, Belgium and Estonia have doubts about implementation, and Germany has gone from opposing to appearing undecided after changes in government. Denmark, upon assuming the rotating presidency in July 2025, reactivated negotiations and made the regulation a high priority.

In public opinion, surveys such as YouGov's report clear majorities against generalized scanning. This social climate has translated into waves of pressure on MEPs, especially during 2023 and 2024, when the Council postponed several votes for lack of agreement.

Controversies: lobbying, political communication and the role of certain organizations

The process has been dogged by controversy. Transnational journalistic investigations have examined the role of the US organization Thorn, backed by public figures, which actively advocates for the deployment of detection tools and markets artificial intelligence software to identify CSAM. Its engagement with the Commission and the commissioner responsible for the file has raised concerns about conflicts of interest and pressure from commercial lobbies.

The Commission's communication strategy has also drawn criticism, including microtargeted campaigns and language perceived as ambiguous, avoiding terms like "mass scanning" in favor of formulas about targeted detection and proportionality. Digital rights organizations say they requested formal meetings that did not always receive a response, which has fueled a narrative of opacity.

In the wake of the debate, similar initiatives have emerged elsewhere. In the United States, for example, campaigners have denounced political pushes against privacy protections under the umbrella of combating abuse, and in the United Kingdom there have been temptations to demand backdoors. Civil society voices warn that the EU risks appearing hypocritical if it promotes at home measures it criticizes abroad.

What changes for the average user: practical effects and the legal framework in Spain

If the proposal were to go ahead in its toughest version, messaging, email, or even chat services integrated into games could be forced to install automatic detection systems for images, videos, or links. In end-to-end encrypted services, this would involve analyzing certain content before it is encrypted on your device, resulting in a larger attack surface and confidentiality problems if something goes wrong.

In Spain, Article 18 of the Constitution guarantees the confidentiality of communications except under judicial authorization, and platforms have a duty to cooperate when a warrant is issued. With end-to-end encryption, however, platforms today cannot decrypt chats even under a court order, because they do not hold the users' keys. A regime that requires detection before encryption would change both the technical and legal architecture, with constitutional implications that are far from trivial.
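
As a minimal sketch of why a provider cannot comply with a decryption order under end-to-end encryption, consider this example built on Python's `cryptography` package. The flow is illustrative; real messengers layer protocols such as the Signal double ratchet on top of this basic key-agreement idea.

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair on their own device; private keys never leave it.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Only the public keys ever transit the provider's servers.
shared_alice = alice_priv.exchange(bob_priv.public_key())
shared_bob = bob_priv.exchange(alice_priv.public_key())
assert shared_alice == shared_bob  # both ends derive the same secret

key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"chat-demo").derive(shared_alice)

nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"hola", None)
# The provider relays only `nonce` and `ciphertext`. Holding neither private
# key, it has nothing meaningful to hand over, court order or not.
```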

Beyond the technical risks, there are concerns about chilling effects on the exercise of rights such as freedom of expression or political activism. If users perceive that every image or audio message can be preemptively analyzed, self-censorship can increase, along with reluctance to share even legitimate content.


Myths and realities of the viral video circulating on social media

In November 2024 and again in 2025, videos circulated claiming that Chat Control "had already passed all phases" and would "go into effect on October 14." That message is incorrect: the 2.0 proposal is still in the legislative pipeline, there are stages left to complete, and although the Council is deliberating and establishing a position, that does not mean immediate implementation.

Another recurring claim is that "the EU will have access to all private messages" and that only X and Telegram would refuse. In reality, the texts put the burden of detection on the platforms, and any pre-encryption scans would be subject to specific detection orders. Although several companies have expressed their reluctance to compromise encryption, there is no closed or definitive list of which services would or would not accept a potential European mandate.

Where does all this come from: a brief timeline to get oriented

The Commission submitted its proposal in May 2022, following years of sharp growth in CSAM reports to entities such as the US National Center for Missing and Exploited Children. In 2023, the European Parliament's Civil Liberties Committee approved a position that explicitly protects encryption and eliminates indiscriminate scanning, and during 2024 the Council postponed the vote on several occasions for lack of a majority.

In 2024, the Commission worked on revised versions, and leaked drafts circulated that attempted to narrow the scope (for example, by focusing detection on multimedia and links rather than text and audio). The arrival of the Danish Presidency of the Council in July 2025 reactivated the file and set the ambition of accelerating a political decision in the fall, which has brought the debate back into the headlines and onto the public agenda with intensity.

Proportionality measures and proposed safeguards

In an attempt to pass the necessity and proportionality test, the Council's drafts insist on limiting detection orders in time and scope, requiring objective risk assessments, preferring the least intrusive technologies available, and mandating audits and human oversight of flagged cases to reduce errors and bias.

It is also proposed that orders not exceed limited periods (for example, up to twelve months), that they be subject to review, and that their impact on rights be documented. Avoiding general monitoring obligations is the declared objective, and the aim is to stay within the realm of specific, justified orders.

Relationship with other digital standards and debates

This proposal does not exist in a vacuum. It coexists with the Digital Services Act, with data and privacy regulations, and with national frameworks that seek to balance security and fundamental rights. Consistency between these pieces of legislation will be key, so that no contradictions or grey areas arise that open the door to abuse or legal uncertainty.

  • The Digital Services Act and its due diligence obligations
  • The UK Online Safety Act 2023 and parallel debates in other countries

Finally, it is important to remember that even where the will to protect children exists, not everything can be solved with technology. Prevention, digital education, stronger investigative capacity, and cross-border cooperation are critical elements that experts recommend reinforcing alongside any detection framework.

Overall, Chat Control is a minefield where a just cause coexists with very serious risks to everyone's privacy and security. Version 1.0 allows voluntary scanning with safeguards and has been extended until 2026; version 2.0, by contrast, is an evolving proposal that introduces detection orders with the potential to affect encryption, and it faces widespread opposition from legal experts, technologists, and privacy organizations. Between shifting political agendas, social pressure, and technological concerns, the final decision will shape the next decade of our communications, so it is worth following the process closely and demanding that any step taken respect the core of our rights.
