Britain’s media and privacy regulators have told major social media platforms to do more to keep children off their services, saying the companies are failing to enforce their own minimum-age rules.
The United Kingdom is currently weighing stricter restrictions on children’s access to social media, including a possible ban on under-16s joining such platforms, similar to measures introduced in Australia.
Regulators Ofcom and the Information Commissioner’s Office (ICO) said they are increasingly concerned about algorithm-driven feeds that expose children to harmful or addictive content.
“These online services are household names, but they’re failing to put children’s safety at the heart of their products,” said Ofcom Chief Executive Melanie Dawes.
“That must now change quickly, or Ofcom will act,” she warned.
As part of the latest phase of implementing the Online Safety Act, Ofcom has instructed platforms including Facebook and Instagram, both owned by Meta, as well as Roblox, Snapchat, ByteDance’s TikTok and Alphabet’s YouTube, to outline by April 30 how they plan to strengthen age-verification systems, limit contact from strangers, improve the safety of content feeds and stop testing new products on minors.
Meanwhile, the ICO issued an open letter to the same companies, urging them to adopt modern age-verification technologies to prevent children under 13 from accessing services not designed for them.
“There’s now modern technology at your fingertips, so there is no excuse,” said Paul Arnold, Chief Executive of the ICO.
Under the Online Safety Act, Ofcom has the authority to fine companies up to 10 percent of their qualifying global revenue for non-compliance. The ICO can impose fines of up to 4 percent of a company’s global annual turnover.
Last month, the privacy regulator fined Reddit nearly £14.5 million for failing to implement effective age-verification measures and for unlawfully processing children’s data.
Source: CNA.