Australian authorities are assessing reports that social media platforms have not fully enforced the country’s ban on accounts for users under 16. The law, which took effect in December 2025, requires major platforms to block new under-16 registrations and deactivate existing accounts belonging to users below the legal threshold. The measure applies to services including Instagram, TikTok, Snapchat, and YouTube.
Many teenagers reportedly continue to access social media despite the ban. Interviews with young users indicated that some were able to create new accounts by entering false birth dates, while others retained existing accounts that the platforms had not removed. Researchers said the ease of bypassing age restrictions suggests that compliance remains inconsistent across services.
Officials said the law aims to reduce exposure to harmful content, limit contact with unknown adults, and mitigate the risks associated with cyberbullying. Regulators expect platforms to implement robust age verification and to demonstrate that their systems can identify underage users. They said the government will request evidence of compliance and may consider further action if platforms fail to meet their legal obligations.
Youth groups and digital rights organisations have raised concerns about the practical effects of the ban. They said many teenagers rely on social media for communication, education, and community connection, and may move to less regulated platforms if mainstream services enforce the restrictions more strictly. Analysts warned that migration to unregulated spaces could reduce safety rather than improve it.
Some experts in child safety said stronger enforcement is necessary to uphold the intent of the law. They noted that current verification methods largely depend on self-declared age and do not prevent minors from creating new accounts. Others said a blanket ban does not address the varied needs of young people and suggested that improved content moderation, parental tools, and digital literacy measures may offer more effective support.
Platforms have responded unevenly. Some companies said they are implementing the required systems and removing under-16 accounts. Others said they are reviewing the technical and legal implications while continuing to engage with regulators. Industry representatives said accurate age verification remains one of the most difficult challenges for global platforms because available methods can introduce privacy and security issues.
Authorities said they will continue to monitor compliance and will assess whether additional rules or enforcement mechanisms are needed. They said the government intends to balance its safety objectives against technological constraints and the need to protect user privacy.
