UK regulator pushes tech firms to strengthen age checks for children

The United Kingdom’s data protection regulator has called on technology companies to strengthen age-verification systems so that children cannot access online services by selecting an “I’m 13” option during registration.
The request was issued by the Information Commissioner’s Office, the UK authority responsible for enforcing data protection and privacy laws. The regulator said many online platforms rely on self-declared ages when users create accounts, a method that children can easily bypass.

According to the regulator, services that set a minimum age for users must implement more reliable “age assurance” systems that verify whether someone is old enough to use a platform. These systems can include a range of methods, such as identity checks, facial age estimation technology, or other tools designed to estimate a user’s age without relying solely on a declaration.

The warning applies to social media platforms, video sharing services, and other online products likely to be accessed by children. The regulator said companies should use technology that enforces their minimum age requirements rather than allowing users to simply confirm they meet the age threshold.

Many large online services set their minimum user age at 13. However, research cited by regulators shows that younger children frequently access these platforms despite the age limits. One study found that about 72% of children aged eight to 12 use services that technically require users to be at least 13 years old.

The regulator said that reliance on self-declaration does not provide sufficient protection for children’s data and privacy. Platforms are expected to evaluate available age assurance technologies and introduce measures capable of preventing underage users from registering or accessing services meant for older audiences.

The guidance forms part of the UK’s broader framework for online safety and child protection. The country’s Online Safety Act requires companies operating digital services to reduce risks to children and ensure that platforms are designed with younger users’ safety in mind.

Under existing UK rules, online services likely to be accessed by children must also comply with the Age Appropriate Design Code, which sets standards for how companies handle children’s personal data and privacy settings. The code requires platforms to prioritize the safety and best interests of younger users when designing digital services.

The regulator said it has begun engaging directly with companies that operate higher-risk services and expects them to review their age assurance systems. Further regulatory action may follow if platforms fail to implement stronger protections for children accessing online services.