The United Kingdom is advancing measures that could hold senior technology executives personally liable if their platforms fail to remove certain types of harmful online pornography, according to reporting and official statements.
The proposals form part of broader updates to online safety legislation, including amendments linked to the Online Safety Act 2023, which imposes a duty on platforms to address illegal and harmful material.
Under the proposed changes, executives at technology companies could face criminal penalties, including potential prison sentences, if their platforms do not act on non-consensual intimate images or similar content within required timeframes.
The measures focus in part on material such as “revenge porn” and AI-generated explicit images shared without consent. Under the proposals, platforms would be required to remove such content promptly after it is reported, shifting responsibility toward service providers and their leadership.
The initiative follows recommendations from the Independent Pornography Review, which examined risks associated with online adult content. Baroness Gabby Bertin, who led the review, stated that individuals appearing in pornographic material may face risks including exploitation and coercion, and that platforms should ensure participants are adults and have provided consent.
Additional legislative efforts in the same policy area include proposals to criminalise certain categories of pornographic content and to expand enforcement powers. Recent parliamentary activity has targeted content deemed harmful, including depictions that lawmakers said could normalise abusive behaviour.
The enforcement framework builds on existing requirements under the Online Safety Act, which allows regulators to impose fines of up to £18 million or 10% of global turnover, whichever is greater, on companies that fail to meet their obligations.
Regulatory oversight is carried out by Ofcom, which has the authority to investigate compliance and enforce penalties. The regulator has previously opened investigations into platforms over concerns related to explicit and non-consensual imagery.
The proposals remain subject to legislative processes and further refinement, including defining enforcement thresholds and the scope of executive liability.