Dutch regulators and law enforcement agencies are calling for a broader ban on AI tools that generate nude images, arguing that current European Union rules do not go far enough, according to a joint statement.
Authorities, including the Netherlands Authority for Consumers and Markets, the Dutch Data Protection Authority, and the Dutch National Police, have responded to recent European legislation that prohibits applications designed to “undress” individuals or generate child sexual abuse material. The measures were approved as part of updates to the EU’s AI regulatory framework.
The European Parliament backed the ban on so-called nudify applications, which use artificial intelligence to create non-consensual explicit images. The ban is expected to take effect across the EU once the legislation completes the approval process with member states.
Dutch authorities stated that the current framework still allows certain exceptions. These include cases where the images are generated using artificial characters or where the individual depicted is claimed to have consented. Regulators argue that these exceptions limit the effectiveness of the ban and leave gaps in enforcement.
In a joint statement, the agencies said that action under the existing rules mainly targets individuals who create or distribute explicit material, rather than the tools used to generate it. They stated that this approach does not address the underlying issue of how such content is produced and shared.
The authorities are seeking discussions with government ministries to examine how a more comprehensive ban could be implemented at the national level. Such a ban would remove the current exemptions and restrict all forms of nudify tools, regardless of how the images are generated or whether consent is claimed.
Regulators also stated that existing laws may allow action against developers in certain cases, particularly where content involves child sexual abuse material. However, they said a broader prohibition would provide clearer legal grounds for enforcement against the tools themselves.
The agencies indicated that, until further changes are introduced, they will continue to process individual reports and use current legal frameworks to address incidents involving such applications. They also noted the importance of informing younger users that creating or sharing such content can be punishable under existing laws.