The State of West Virginia filed a lawsuit against Apple Inc. in February, accusing the technology company of failing to detect and report child sexual abuse material (CSAM) stored and shared through its iCloud storage service. The suit, brought in Mason County Circuit Court, seeks statutory and punitive damages along with orders requiring Apple to adopt more effective safeguards.
West Virginia’s attorney general, JB McCuskey, said in a statement that Apple knowingly allowed its iCloud platform to be used to store and distribute CSAM. The lawsuit alleges the company repeatedly declined to deploy available detection technologies and did not take meaningful action to stop the distribution of illegal material. The complaint asserts that Apple’s approach prioritised user privacy over child safety and violated federal and state legal obligations.
West Virginia’s complaint cites internal communications in which an Apple employee allegedly described the company’s platform as a major conduit for the distribution of child pornography. The state also cited figures from the National Center for Missing and Exploited Children (NCMEC) showing that Apple reported far fewer cases of CSAM to authorities than some other major technology companies in recent years.
Federal law requires companies based in the United States to report detected CSAM to NCMEC. West Virginia’s complaint said Apple filed 267 such reports in 2023, compared with millions reported by other major technology firms, and that Apple did not implement industry-standard tools that could have detected more instances of abusive material.
Apple previously considered implementing a proprietary CSAM detection system but abandoned the plan after public criticism and concerns from civil liberties groups about potential misuse of scanning technologies. The company instead emphasised other safety features it deploys across its products, including tools intended to warn or protect users when sensitive content is detected in communications.
In its response to the lawsuit, Apple denied the allegations that it facilitated the spread of abusive material and reaffirmed its commitment to user safety and privacy. The company highlighted existing features aimed at protecting children, including built-in parental controls and content intervention tools.
The West Virginia lawsuit is one of several legal challenges Apple has faced over its handling of CSAM, including prior litigation brought by victims of exploitation, and it follows broader public criticism of the company’s approach. The current case argues that Apple’s decisions around detection and reporting tools amount to unlawful conduct and seeks changes to how the company’s systems are designed in the future.
