Five British families have filed a wrongful death lawsuit against the social media platform TikTok, owned by the Chinese technology company ByteDance, in the Superior Court of the State of Delaware. The parents allege that the platform’s content recommendation systems pushed dangerous “prank and challenge” videos to their children, contributing to their deaths after they attempted self-strangulation challenges. The case is scheduled for an initial hearing on a motion to dismiss.

The lawsuit was brought on behalf of the estates of five children, aged between 11 and 14, who died in 2022. The families argue that TikTok’s algorithms “selected and pushed” harmful material to the children’s feeds, including content related to the so-called blackout challenge, an internet stunt that involves intentional choking to induce unconsciousness and has been linked in media reports to several child fatalities.

The complaint asserts that the platform’s recommendation systems “targeted” minors and then “flooded” them with dangerous videos on the For You page, TikTok’s personalised content feed. The families say this exposure contributed to the children’s decisions to attempt self-strangulation. They have sought access to internal platform data that could show what content each child saw before their death, but allege that much of this data has since been deleted under TikTok’s data retention practices. TikTok has said it banned the blackout challenge from the platform in 2020 and has found no evidence of it trending.

TikTok argues that it prohibits content that promotes or encourages dangerous behaviour and that it removes the majority of violations before they are reported. The company is seeking to dismiss the lawsuit on jurisdictional and other legal grounds. If the motion to dismiss is denied, the case will proceed to a stage known as discovery, where the court can compel TikTok to disclose internal records and relevant account information. The families say they are not primarily seeking financial compensation but want clarity on what their children saw online and to hold the company accountable for its content policies.

The parents’ legal action highlights ongoing debates about the responsibilities of social media companies in moderating harmful content that may reach minors. In the United Kingdom, some of the parents have also campaigned for changes in law that would require preservation of a deceased child’s online data, a proposal referred to by supporters as “Jools’ law,” named after one of the children who died. They argue that automatic preservation of digital records could help provide answers and prevent similar cases in the future.

The lawsuit follows other cases in the United States in which courts have considered whether platform recommendation systems fall within legal protections that shield online services from liability for third-party content. A 2024 decision by the United States Court of Appeals for the Third Circuit held that algorithmic recommendations could be treated as the platform’s own activity rather than merely third-party content, affecting how immunity under US law applies.
