Denmark has agreed to ban social media for children under 15, with a limited exception for those aged 13 and older whose parents grant consent. Prime Minister Mette Frederiksen announced the measure, which will be subject to legislation in the national parliament. The government cited growing concerns over youth mental health, screen time, and exposure to harmful content as the rationale for taking this unprecedented step.
Frederiksen said that mobile phones and social networks are “stealing our children’s time” and pointed to data indicating high rates of anxiety, depression, and concentration difficulties among young people in Denmark. Never before, she argued, have so many children suffered from these ailments, and on screens they “see things no child or young person should see”.
Under the proposed rules, children under 15 would not be permitted to sign up for social media platforms, though the draft leaves room for those aged 13 to 15 to join if their parents give explicit consent. However, officials have not yet published a definitive list of the platforms covered by the ban, nor a timetable for adopting the legislation.
Challenges, enforcement, and wider implications for Europe
The policy raises several practical questions for parents, service providers, and regulators. Chief among the unknowns is how age verification will be enforced effectively, given that many children already bypass stated age limits by using a parent’s credentials. According to Danish research, 94 percent of seventh graders reported having created a social media profile before turning 13.
Critics have argued that outright bans may simply drive young users to less-regulated platforms or push them to borrow credentials from older peers or parents. The real fix, they suggest, is requiring social media companies to implement robust age verification rather than imposing blanket restrictions.
Meanwhile, the government pointed to precedents elsewhere. Australia introduced restrictions in 2024 for children under 16, and other European states are reportedly exploring similar moves: Greece, Italy, Spain, France, and Norway have all signalled intentions to tighten youth access to social media or apply age-based rules.
From a regulatory perspective, the Danish action could set a benchmark within the European Union for how online child protection is enforced. If Denmark’s legislation passes, it may trigger a domino effect across the bloc, as member states react to pressure from advocacy groups and public data on screen usage and youth wellbeing.
For families and educators, the proposal emphasises the need to adopt safe digital habits and to understand how children interact online. Experts recommend setting clear household rules for screen time, discussing online content risks with young people, and supporting healthy offline activities. Schools and parenting organisations say digital literacy must include privacy, emotional resilience, and critical thinking about social media.
On the industry side, social media companies may face new obligations under Danish law and potentially across Europe. The regulation could require platforms to verify age, restrict access, refine content filters for younger users, and report usage data to regulators. Given the global reach of major platforms, compliance may involve significant technical, legal, and policy adjustments.
As the bill moves through parliament, Danish regulators will need to establish enforcement mechanisms. Questions remain about how to monitor compliance, deal with cross-border platforms, and protect younger users without unduly restricting freedom of expression. The debate now centres on how to balance child protection with access to digital resources that offer social and educational value.