Australia has begun enforcing legislation that bars people under 16 from holding accounts on major social media platforms. The law took effect on 10 December 2025 and applies to services including TikTok, Instagram, Snapchat, YouTube, X, Facebook, Threads, Reddit, Twitch and Kick. Under the new rules, platforms must block existing accounts held by users under 16 and prevent new registrations from that age group. The restriction applies nationwide and allows no exceptions based on parental consent. The government stated that the measure is intended to reduce exposure to harmful content and to address concerns about the impact of social media use on children.

The legislation introduces significant penalties for companies that fail to comply. Platforms can face fines of up to AUD 49.5 million if they do not take what the law defines as reasonable steps to identify underage users and remove or block access. Government officials said the responsibility for enforcement lies with the platforms because they have access to user data that can be used to verify age. The rules are structured to ensure that social media services prioritise child safety when managing account creation and access.

The government has said that the primary aim of the ban is to reduce online harms that disproportionately affect younger users, including cyberbullying, exposure to violent or age-inappropriate material and contact from unknown adults. Officials have also cited research linking intensive social media use to declining well-being among children. Advocacy groups supporting the measure argue that delaying access to social media may reduce pressure related to appearance standards, popularity metrics and constant comparison with peers. They say that children benefit from more time to develop offline social skills before entering online environments that present complex emotional and social challenges.

Some community groups have raised concerns about possible unintended consequences. Critics argue that the ban may exacerbate isolation for young people in regional and remote communities, many of whom rely on social media to maintain connections with their peers. Some teachers and youth workers have reported that online messaging platforms serve as important communication tools in areas where face-to-face contact is limited. Others have cautioned that the restriction may push underage users toward less regulated platforms that lack safety controls. These concerns have prompted discussion about how to balance child protection with digital inclusion.

Enforcement poses practical difficulties because platforms must rely on age verification systems to identify underage users. Some companies have introduced automated tools that estimate a user's age by analysing profile behaviour or uploaded images. Privacy experts said these tools raise questions about accuracy and about the handling of biometric and behavioural data. They also noted that determined minors may still circumvent the checks by supplying false information. Government officials said enforcement will evolve over time and will depend on cooperation between industry and regulators.

The ban has attracted international attention as other countries review their own youth safety policies. Researchers and digital safety organisations plan to monitor the effects of the Australian rules to assess whether they reduce documented harms while preserving access to digital resources that many families consider essential. The government has stated that further adjustments may follow the initial implementation period and consultation with schools, parents and industry representatives.
