Australia’s online safety regulator, the eSafety Commissioner, is preparing to enforce a pioneering rule that requires social media platforms to restrict account access for users under 16. The new regulation takes effect on 10 December 2025 and introduces substantial compliance obligations for companies that provide social media services in Australia.
Under the new framework, platforms defined as “age-restricted social media platforms” must take “reasonable steps” to prevent Australians under the minimum age from creating or maintaining accounts.
The services initially identified include Facebook, Instagram, and Threads (all owned by Meta), as well as Snapchat, TikTok, X (formerly Twitter), YouTube, Kick, and Reddit.
The law’s scope is defined by criteria designed to capture services with interactive social features. To qualify as an age-restricted platform, a service must enable social interaction between two or more end-users, allow users to link to or interact with other users, and provide functionality for users to post content.
The regulator emphasises that the new rule is not a direct ban on children under 16 using social media. Instead, it places the responsibility on platforms to apply age assurance measures. No penalties will be imposed on under-16s or their parents for using such platforms. Rather, platforms may face civil penalties of up to A$49.5 million if they fail to comply.
Minister for Communications Anika Wells and eSafety Commissioner Julie Inman Grant argue the policy aims to give children “valuable time to learn and grow, free of the powerful, unseen forces of harmful and deceptive design features such as opaque algorithms and endless scroll”.
In preparation for the regulation’s commencement, eSafety has launched a public awareness campaign and published resources for parents, carers, and educators. These include classroom materials, guidance documents, and a dedicated hub for the “Social Media Minimum Age” initiative.
While the commencement date has been set, some specifics of implementation remain under discussion. One notable issue concerns age verification and assurance technology. The regulator has stated it would be unreasonable to require platforms to re-verify every user’s age, and instead expects evidence of proportionate steps to detect under-16 account holders.
Gaming platforms and messaging services are exempt from the age-restricted classification, provided they do not meet the key criteria described above: online games and services focused solely on messaging are therefore treated differently under the rules.
Experts note that although the law takes effect in December, many children under 16 already access social media. A 2024 survey by the eSafety Commissioner found that 80 per cent of children under 13 had bypassed existing age restrictions by using family members’ accounts or false credentials.
Platforms will be required to demonstrate their compliance in the years following rollout. The Minister must initiate an independent review of the law’s effectiveness within two years after it comes into force.
Australia is moving to a new paradigm in online safety for young people by formally shifting the responsibility for managing underage access onto platforms. Platforms and regulators now face the task of balancing child-protection goals with user rights and platform functionality.