Governments around the world are intensifying efforts to protect children from risks that social media can pose. Since 2023, several US states have passed legislation requiring a minimum age for account creation, banning “addictive feeds” or limiting the time children can spend on social media. Australia set a national precedent in late 2024 by establishing 16 as the minimum age for social media use. Similar proposals are now under consideration in Ireland, New Zealand, Singapore and Spain, while France’s President Macron recently raised the possibility of a referendum on the issue.
Why governments are pushing for new social media age limits
While social media can offer children opportunities for connection and creativity, it can also present serious risks. Child sexual offenders often exploit these platforms to access children. Forthcoming OECD research identifies 50 services that are heavily misused to facilitate child sexual exploitation and abuse (CSEA), including several mainstream social media platforms.
But social media also presents other risks to children. For example, OECD analysis shows that problematic social media use – marked by preoccupation, escapism and conflict – particularly affects girls.
Momentum behind legislative action also reflects a growing recognition that many platforms have not been designed with children’s safety or well-being in mind and have failed to effectively enforce their own minimum age requirements.
The legal landscape for age assurance is fragmented
Many platforms have long based age limits on privacy and data protection law, with 13 being the most common threshold. However, a forthcoming OECD report examining the legal landscape for age assurance across member countries reveals that age limits vary not only between jurisdictions, but also across legal disciplines – such as privacy, safety and consumer protection. This patchwork of standards creates confusion for global platforms and leads to inconsistent protections for children.
In addition to regulatory fragmentation, a forthcoming benchmarking study of 50 services popular among children finds that, while many platforms set a minimum age, these limits are often vague or legally complex, or can be overridden by parental consent. Terms such as the “minimum age at which a person may use the service in their country” or the “legal age to enter into a binding contract” can confuse users or hinder transparency.
Moreover, platforms frequently shift the responsibility for setting age-appropriate safeguards onto parents. Controls related to contact, data use or content such as advertising are often not enabled by default, can be difficult to manage and are easily disabled.
Current enforcement of age limits on social media remains a challenge
Despite the common use of 13 as an age limit, social media use among younger children remains widespread: in 2023, nearly 40% of US children aged 8-12 used social media, as did 63% of children aged 8-11 in the United Kingdom. A Canadian study found that 86% of children under 13 held accounts on at least one platform that prohibits users below that age.
The failure to verify a user’s real age has serious consequences. Children can create accounts with falsified birthdates, undermining age-tiered safeguards. For example, a child who signs up as a 13-year-old at age 8 could be treated as an 18-year-old by the time they are 13, gaining access to potentially harmful features such as direct messaging or livestreaming. A 2022 United Kingdom study found that around one-third of children aged 8-17 have an adult (18+) user profile on at least one social media platform, with 23% of these children aged 8-12.
Moreover, age-tiered safeguards only work if the service knows a user’s real age. However, only two of the 50 services in the OECD benchmarking study routinely assure age upon account creation. Most still rely on self-declaration or only assure age in specific cases – such as when suspicious activity is detected or for access to certain features. Some platforms do not assure age at all.

Strengthening age assurance through collaboration is key
Governments are increasingly considering legislation to set minimum age requirements for children’s access to social media. This reflects concern that defaulting to the age limits set in privacy and data protection law does not fully account for what makes a service safe and age-appropriate.
While a one-size-fits-all solution may not exist, there is a clear need for a more consistent and coherent approach that considers both safety and children’s developmental needs. As countries move ahead with their own measures, the risk of fragmented standards grows – creating more complexity for platforms and uneven protections for children. Any new age limits must be grounded in robust, evidence-based assessments of both risks and benefits.
Implementing effective age assurance is equally urgent and complex. It requires technical innovation, regulatory clarity and strong collaboration between the public and the private sector. While technical solutions are rapidly advancing, concerns remain about platforms’ ability to assure age reliably and at scale. Governments, in turn, must provide clear, co-ordinated frameworks that uphold children’s rights and safety while balancing overlapping regulatory mandates – such as those overseeing online safety and data protection.
To support all relevant actors, the OECD plans to conduct research on age assurance technologies and age appropriateness for digital services used by children. The OECD will also bring together policymakers and key stakeholders to promote a more consistent, evidence-based approach – one that focuses on safety, respects children’s rights, and reflects the realities of their evolving lives in the digital world.