
British authorities are turning up the heat on tech giants, calling for more rigorous age verification measures to keep children under 13 off social media platforms.
Media regulator Ofcom and the Information Commissioner’s Office (ICO) have jointly reached out to seven major platforms (Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X), urging them to implement stronger age checks comparable to those currently used for adult-only services.
The message from regulators is clear: platforms must do far more to ensure younger children aren’t slipping through the cracks and accessing services they’re not meant to use.
Ofcom chief executive Dame Melanie Dawes didn’t mince words, stating that these services are currently “failing to put children’s safety at the heart of their products.”
The Scale of the Problem
While most social media platforms set their minimum age at 13, Ofcom’s research reveals a striking disconnect—approximately 86% of children aged 10 to 12 already have their own social media profiles. This suggests current age verification methods aren’t doing enough to prevent underage sign-ups.
The regulators want companies to adopt “highly-effective age checks”, the kind currently required by law only for platforms hosting adult content such as pornography. Extending these measures to social media sites would go beyond what the law demands, but would mark a significant step forward.
The ICO’s concerns center on data protection, with chief executive Paul Arnold emphasizing that platforms processing personal data from children under their stated minimum age are likely doing so without a valid legal basis.
Government Backing
Technology Secretary Liz Kendall threw her weight behind the regulators’ efforts, making it clear that no platform would receive a “free pass” when children’s safety is at stake. She stressed that Ofcom has her full support in holding these companies accountable, adding pointedly, “No company should need a court order to act responsibly to protect children.”
How Platforms Are Responding
The tech companies have pushed back, defending their existing safeguards while outlining additional measures.
YouTube’s parent company Google expressed surprise at Ofcom’s approach, urging regulators to focus instead on “high risk services that are failing to comply” rather than taking what it sees as a blanket approach.
Meta, which owns both Facebook and Instagram, highlighted measures already in place, including AI-powered age detection based on user activity and facial age estimation technology. The company also suggested that shifting age verification responsibilities to app stores would streamline the process for parents and teens.
Snapchat noted it’s currently testing age verification tools, while TikTok pointed to its “enhanced technologies” for detecting and removing underage accounts. TikTok also claimed to be the only major platform transparently publishing removal numbers, reporting over 90 million suspected under-13 accounts taken down between October 2024 and September 2025.
Roblox emphasized its additional protections for younger users and noted it had released 140 new safety features over the past year, including mandatory age checks for accessing chat functions. A spokesperson expressed willingness to demonstrate these efforts in ongoing discussions with Ofcom.
The regulators’ push signals a growing determination to make online spaces safer for children, even if it means compelling tech giants to go beyond their current voluntary measures.