Apple and Google are under scrutiny after reports found that their app stores continue to host applications capable of generating altered images that depict individuals without clothing, despite existing platform policies prohibiting such content.
An investigation by the Tech Transparency Project identified dozens of such apps across both platforms. The report found 55 apps on Google Play and 47 on Apple’s App Store that can digitally alter images of women to produce partially or fully nude results.
According to the findings, these apps have collectively been downloaded more than 705 million times worldwide and generated approximately $117 million in revenue. The report states that both companies receive a share of this revenue through app store fees.
The investigation also found that many of the apps appeared to conflict with the companies’ own rules. Google’s policies prohibit apps that depict or simulate nudity or objectify individuals, while Apple’s guidelines restrict apps that produce offensive or explicit material. Despite these policies, the apps remained available for download and, in some cases, carried age ratings that made them accessible to younger users.
The report also cited app store search and recommendation systems as contributing to the apps’ visibility: searches for terms such as “nudify” returned promoted apps, and autocomplete suggestions directed users toward these tools.
Testing conducted as part of the investigation showed that several apps were able to generate altered images using basic prompts and free features. In some cases, applications produced content depicting individuals in various states of undress without restrictions.
Separate reporting indicates that the issue extends beyond individual apps and reflects broader challenges in moderating AI-generated content. The use of generative tools to create non-consensual altered images has increased with the availability of automated image editing technologies.
Following the publication of the findings, both companies took steps to remove some of the identified apps. Apple removed a number of applications, while Google suspended several others, although some remained available after initial enforcement actions.
The report states that the presence of these apps highlights gaps in app review processes and enforcement systems, particularly as AI-based tools evolve. It also notes that some apps were available on both platforms simultaneously, indicating overlapping distribution channels.
Both companies have previously stated that their policies aim to ensure user safety and restrict harmful content, and that they continue to review applications submitted to their platforms.