Australia has begun enforcing legislation that requires social media platforms to block users under 16 or face major financial penalties. The law took effect on 10 December 2025 and applies to widely used services, including Facebook, Instagram, TikTok, Snapchat, YouTube, X, Reddit, Threads, Twitch, and Kick. Regulators said platforms must take reasonable steps to identify existing accounts held by underage users and prevent new registrations by those below the minimum age. The rule does not allow exceptions based on parental consent and places full responsibility on the platforms.

The penalties for non-compliance are significant. Companies that fail to enforce the age restriction may face fines of up to AUD 49.5 million. Officials said the size of the fines reflects the seriousness with which underage access is treated and is intended to make enforcement a priority for social media firms. They stated that earlier voluntary measures did not deliver consistent outcomes and that a legal penalty was needed to secure industry-wide adherence. Regulators also noted that young users and their families will not be penalised, because the law targets corporate compliance rather than individual behaviour.

Government representatives said the legislation was introduced in response to rising concern about the effects of social media on young people. They highlighted risks including exposure to harmful content, cyberbullying, and pressure associated with engagement metrics. Supporters of the measure argue that strong enforcement mechanisms are necessary because the potential harms occur at scale. They added that the requirement to block underage users must be backed by substantial fines to guarantee meaningful action from platforms with large global user bases.

Implementation of the law requires platforms to introduce or improve age verification methods. Regulators said that these may include age estimation tools or other checks. They acknowledged that no system is entirely accurate and that some minors may attempt to circumvent verification by providing false details. Privacy specialists said that age checks must respect data protection requirements and avoid unnecessary collection of sensitive information. Authorities responded that enforcement will focus on whether platforms take reasonable and proportionate action consistent with the law.

Industry responses have been mixed. Some companies have said they will comply, while others noted that maintaining accuracy at scale presents technical difficulties. Analysts suggested that the threat of large fines is likely to accelerate the development of age assurance technology and more rigorous reporting systems. Officials said platform compliance will be monitored and that additional guidance may follow once early enforcement data is reviewed.

Authorities described the measure as a major shift in national online safety policy. They said that the penalties are designed to make non-compliance financially untenable and to ensure that the protection of minors is treated as a core business requirement for social media firms.
