The European Parliament has approved a resolution calling for a minimum age of 16 for access to social media, video-sharing platforms and AI-based services across the European Union. The measure is not legally binding but represents the strongest statement yet from lawmakers who believe that minors should not be allowed unrestricted access to online platforms. The resolution passed with 483 votes in favour, 92 against and 86 abstentions. It recommends a complete ban for children under 13, with 13-to-16-year-olds permitted access only with parental consent. Supporters argue that this approach would create consistent rules for all member states and ensure that major platforms are held to a common standard.
Lawmakers who backed the proposal cited studies suggesting that many young people in Europe display patterns of problematic smartphone use, including compulsive checking, reduced concentration and increased anxiety. They argue that these behaviours may be linked to extended engagement with social media. Concerns about content exposure, direct messaging and algorithmic promotion of harmful material have also become recurring themes in parliamentary debates. These issues have motivated legislators to call for stronger limits on platform features that they believe are engineered to maximize engagement rather than safeguard younger users.

The resolution urges regulators to introduce strict controls on design practices that may contribute to addictive behaviours. It identifies features such as infinite scroll, continuous auto-play and reward loops, as well as recommendation algorithms tuned to maximise viewing time. It also highlights influencer marketing aimed at minors and advertising practices that rely on behavioural profiling. In the Parliament's view, these elements should face tighter scrutiny under existing frameworks such as the Digital Services Act. Lawmakers argue that platforms that fail to meet new obligations should face penalties or further restrictions.

A major part of the proposal focuses on age assurance. The Parliament recommends that platforms adopt accurate and privacy-respecting systems capable of determining user age without collecting excessive personal information. Options may include digital identity checks or other verification tools, although lawmakers stress that any system must uphold privacy rights. They emphasise that age assurance systems need to avoid expanding data collection beyond what is necessary. These requirements reflect a growing belief within the EU that platforms should be responsible for verifying user age, rather than relying solely on self-reporting.

Several member states have already discussed or introduced age-related restrictions. Denmark has considered rules that would raise the minimum age for using social media. Other governments have signalled that they are willing to adopt stronger controls. The EU-level proposal aims to harmonise this activity to prevent uneven standards across the bloc. Lawmakers argue that consistent rules would reduce confusion for users and provide a coherent framework for enforcement.

Critics warn that the proposal may create unintended consequences. Civil liberties groups argue that restricting access for users under 16 could limit freedom of expression and participation in online communities. They caution that younger users might turn to unregulated or less safe platforms if they are blocked from mainstream services. Privacy advocates also object to the potential requirement for identity checks, raising concerns about biometric verification, document submission and the long-term security of sensitive data. They argue that these risks could outweigh the intended benefits.

Supporters counter that stronger age controls will protect young users. They claim that many platforms currently lack effective safeguards and that consistent requirements will push the industry toward safer defaults. They argue that the resolution encourages improvements in age assurance technology, reduces reliance on engagement-driven algorithms and promotes design choices intended to protect minors.

Because the resolution is not binding, the next step requires the European Commission to decide whether to draft legislation. If the Commission proceeds, any legal framework would need to pass both the Parliament and the Council. This process would include consultations with national governments, civil society organisations and major technology companies. Observers expect significant debate if legislation is introduced, as questions about privacy, data collection and enforcement remain unresolved.

The broader global context shows similar patterns. Several countries are developing or reviewing laws aimed at restricting youth access to online platforms. These laws vary widely but often include requirements for age verification, limits on data collection and tighter controls on content exposure. Analysts believe the EU’s position may shape international discussions and could influence how platforms design their services for younger users. They note that large technology companies may be compelled to adopt age assurance tools across multiple jurisdictions if the EU mandates them.
