Denmark, which currently holds the Presidency of the Council of the European Union, has stepped back from advancing the so-called “chat control” proposal. The plan would have required technology companies to scan private messages for child sexual abuse material. After growing resistance from several member states and privacy advocates, Denmark now says it will no longer push the measure forward.
The proposal, known formally as the Regulation to Prevent and Combat Child Sexual Abuse, was introduced by the European Commission in 2022. It aimed to make content scanning mandatory for messaging apps, cloud storage services, and other communication platforms. Under the draft law, even encrypted services would have been required to use automated systems to identify illegal content in private messages. The measure was intended to strengthen child protection online, but it quickly became one of the most controversial digital initiatives ever discussed within the European Union.
From the start, critics said the plan was incompatible with the right to privacy. Germany, Austria, Poland, and the Netherlands were among the countries that refused to support it, warning that the proposal could undermine end-to-end encryption and open a path to generalised surveillance. Privacy advocates also said the technology needed for message scanning would create new vulnerabilities in digital infrastructure. Once a backdoor exists, they argued, it can be exploited not only by law enforcement but also by cybercriminals or hostile governments.
Several prominent voices in the privacy community raised concerns about the implications of mandatory scanning. The Signal Foundation, which operates the encrypted messaging app Signal, stated that it would leave the European market rather than compromise its encryption standards. European Digital Rights (EDRi) called the plan an unprecedented intrusion into private life, while other civil society organisations, such as Access Now and the Electronic Frontier Foundation, said it would erode public confidence in secure communication tools that are essential for journalists, activists, and ordinary users.
Beyond the privacy debate, experts questioned the accuracy and reliability of the technology itself. Detection systems based on artificial intelligence can misidentify legitimate content, potentially triggering false reports and unwarranted investigations. Opponents argued that this would create legal and reputational risks for users while flooding authorities with inaccurate data. They also noted that the proposal did not clearly define how evidence collected from scanning would be handled or stored, raising concerns about the misuse of personal information.
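The scale problem critics describe can be illustrated with a simple base-rate calculation. The Python sketch below uses entirely hypothetical figures for message volume, prevalence of illegal material, and classifier error rates; none of these numbers come from the proposal or from any deployed system. It merely shows why, when billions of messages are scanned and the material being searched for is rare, even a seemingly low false-positive rate can produce far more false alerts than genuine ones.

```python
# Illustrative base-rate estimate for automated scanning at scale.
# All figures are assumptions chosen for illustration only.

messages_scanned = 10_000_000_000   # assumed daily message volume across platforms
prevalence = 1e-7                   # assumed share of messages containing illegal material
false_positive_rate = 0.001         # assumed 0.1% false-positive rate of the classifier
true_positive_rate = 0.95           # assumed 95% detection rate of the classifier

illegal_messages = messages_scanned * prevalence
legal_messages = messages_scanned - illegal_messages

true_alerts = illegal_messages * true_positive_rate
false_alerts = legal_messages * false_positive_rate

print(f"True alerts:  {true_alerts:,.0f}")
print(f"False alerts: {false_alerts:,.0f}")
print(f"Share of alerts that are correct: {true_alerts / (true_alerts + false_alerts):.4%}")
```

Under these illustrative assumptions, fewer than one in a thousand alerts would point to genuine material, which is the kind of outcome opponents have in mind when they warn that mandatory scanning could overwhelm authorities with inaccurate reports.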
Danish Justice Minister Peter Hummelgaard confirmed in late October that the government had reconsidered its position. While Denmark remains committed to fighting child sexual abuse online, he said the Presidency now recognises that a mandatory scanning system cannot move forward without a stronger consensus among EU member states. Instead, Denmark will focus on extending the current voluntary system, which allows companies to detect and report harmful material without a legal obligation to do so. The temporary framework, introduced in 2021, will remain in force until April 2026.
For Denmark, the decision to slow down the legislative process reflects a practical response to political and technical reality. Without broad support in the Council, advancing the proposal risked deepening divisions among member states and drawing criticism from civil society at a time when digital trust is already fragile.
The Danish shift does not end the discussion about how to address online child exploitation. Instead, it delays a decision on whether the European Union will eventually make scanning mandatory or continue to rely on voluntary cooperation from technology companies. Once the existing framework expires in 2026, EU institutions will have to agree on a new approach or risk losing a key legal mechanism that currently allows platforms to report abuse material.
Industry representatives have urged policymakers to pursue alternatives that do not weaken encryption or endanger privacy. These include improving collaboration between companies and law enforcement, creating stronger reporting tools for users, and investing in prevention and education programs. Privacy experts argue that any new legislation must include judicial oversight and technical safeguards to ensure investigations remain targeted rather than indiscriminate.
For now, Denmark’s decision is viewed as a pause rather than a reversal. The issue is expected to return to the EU agenda under future presidencies, especially as governments face continued pressure to act against online child exploitation. The outcome will likely depend on whether a compromise can be found that both protects children and preserves the confidentiality of digital communication.
Denmark’s retreat highlights how difficult it is for lawmakers to regulate online safety without undermining encryption and data protection. The debate has become a test case for Europe’s approach to digital rights, showing that even well-intentioned policies can clash with the fundamental principles of privacy and security. As negotiations continue, the European Union faces the ongoing challenge of protecting its citizens online while upholding the trust that secure communication depends on.
