A US federal judge has ruled that Meta Platforms Inc., the technology company that operates social networks including Facebook and Instagram, must face a lawsuit filed by the state of New Mexico over claims that its platforms contributed to child sexual exploitation. The decision clears the way for a trial on allegations that Meta failed to take sufficient steps to protect minors from harmful content and interactions.
The case, brought in 2024 by New Mexico’s attorney general under federal and state law, asserts that Meta’s platforms allowed users to share and access child sexual exploitation material and to engage in predatory behaviour. The lawsuit contends that the company’s content moderation systems and safety protections were inadequate to prevent such material from appearing and circulating on its services. Meta moved to dismiss the case before trial, but the judge rejected those motions in a ruling issued in early February 2026.
Meta has argued in court filings that the allegations lack a legal basis and that its efforts to combat harmful content are ongoing, including investment in safety technology and review teams. The company says its policies and systems are designed to identify and remove prohibited material and to block accounts that violate its standards. Meta has also pointed to partnerships with law enforcement and child protection organisations to report and address criminal activity on its platforms.
The state’s complaint cites specific examples of posts and accounts that were allegedly accessible on the platforms, though court records do not detail all of the evidence that will be presented at trial. New Mexico’s attorney general said the lawsuit is intended to hold Meta accountable for harms suffered by minors and to prompt stronger protective measures. Meta responded that it will vigorously defend its practices in court and that the claims must be evaluated carefully under the applicable legal standards.
The judge’s order allows the case to proceed to discovery, during which both sides will exchange information relevant to the allegations and prepare for trial. No trial date has been publicly announced. The ruling marks a significant step in a broader set of legal challenges facing social media and technology companies over content moderation, user safety, and protections for minors. Observers say the outcome of the proceedings could influence how courts interpret platform responsibility for harmful material.