South Korean authorities have arrested an IT contractor accused of stealing more than 221,000 personal photos from schools, hospitals, and government-related institutions and using the images to create deepfake pornography and other illegal sexual content.

According to investigators, the suspect worked as an outsourced IT contractor with access to systems belonging to multiple organizations. Police allege he secretly copied large volumes of stored photos and personal data over several years while performing maintenance and technical support work.

Authorities say the stolen images were later used to generate non-consensual deepfake pornography targeting women and minors. Investigators reportedly uncovered approximately 405 GB of illegal material during the case, including manipulated explicit images, hidden camera recordings, and child sexual abuse content.

The investigation began after the suspect accidentally left behind a USB storage device at a school where he had been working. Staff members reportedly discovered suspicious files on the device and alerted authorities, leading police to launch a broader forensic investigation into the contractor’s activities.

Police later searched the suspect’s residence and electronic devices, where they allegedly found additional stolen data and AI-generated sexual content. Authorities believe the operation may have been active for years before being uncovered.

Investigators said the stolen photos originated from a wide range of institutions, including educational facilities, medical organizations, and public-sector systems where the contractor had administrative or maintenance-related access. The scale of the breach has raised concerns about internal security controls and third-party contractor oversight within South Korean institutions.

Officials have not publicly disclosed the total number of victims potentially affected by the incident. However, given the volume of images involved, authorities believe thousands of individuals may have had personal photos exposed or manipulated without their knowledge.

The case has intensified public concern in South Korea over the growing use of generative AI tools to create realistic non-consensual sexual content. Advances in deepfake software have made it increasingly easy to generate explicit fake imagery using ordinary photos pulled from school records, social media accounts, or internal databases.

Authorities are continuing to analyze the seized files to identify victims and determine whether any of the material was distributed through online communities or encrypted messaging platforms.

The suspect now faces multiple charges related to privacy violations, illegal pornography production, and possession of child sexual abuse material. South Korean investigators say additional charges may follow as the forensic investigation continues.
