Ofcom has warned social media companies they will be punished if they do not take significant further action to tackle the problem of children pretending to be adults online.
A recently released survey, carried out by the UK's media regulator, indicates that 22% of young people aged 8 to 17 lie about being 18 or over on social media apps.
This is despite the Online Safety Act (OSA) requiring platforms to strengthen age verification, a responsibility that will come into force in 2025.
Ofcom told the BBC its “alarming” findings showed tech companies had a long way to go to comply with this new legal standard – and said they would face enforcement action if they did not.
The regulator said the fact that children can pretend to be adults increases their risk of being exposed to harmful content.
“Platforms need to do a lot more to know the ages of their child users online,” Ian Macrae, director of market intelligence at Ofcom, told the BBC.
He added that 2025 was a “huge year” in which there should be a “real step change in online safety”.
He said Ofcom would “take action” if companies failed to comply with the OSA, pointing out that the legislation allows companies to be fined up to 10% of their global turnover.
“So easy to lie”
Myley says she hasn't encountered any real age verification on social media
A number of tech companies have recently announced measures to make social media safer for young people, such as Instagram launching “teen accounts”.
However, when BBC News spoke to a group of teenagers from Rosshall Academy, Glasgow, all of them said they had given adult ages when setting up their social media accounts.
“It’s so easy to lie about your age,” 15-year-old Myley said.
“I put my actual birthday – like the day and month – but when I get to the year, I go back ten years,” she added.
“There is no verification, they don’t ask for ID, they don’t ask for anything,” added Haniya, another 15-year-old student.
BBC News was also not challenged when it created accounts, using newly created email addresses, on a number of major platforms.
An age of over 18 was accepted without any proof being requested.
Ofcom says this will have to change in the coming months.
“Self-reporting a child’s age is clearly completely insufficient,” Mr Macrae said.
Age assurance
The public is deeply concerned about children being exposed to harmful content online, in part because of the high-profile deaths of teenagers Molly Russell and Brianna Ghey.
This led the previous government to pass the OSA which, from July 2025, will require social media platforms to implement what Ofcom calls “highly effective age assurance”.
The regulator has not specified what technology should be used to strengthen the verification process, but said it was testing several systems in its own labs and would have “more to say” in the new year.
The BBC contacted the platforms most popular with children and young people in the UK for their response.
“Every day we remove thousands of suspected underage accounts,” TikTok said in a statement.
“We are exploring how new machine learning technologies can enhance these efforts and are co-leading an initiative to develop industry-wide age assurance approaches that prioritize safety and respect the rights of young people,” it added.
Snapchat and Meta – owner of WhatsApp, Instagram and Facebook – declined to comment.
X, formerly Twitter, did not respond to the BBC's request for comment.
The government has already come under pressure to strengthen the Online Safety Act, with some saying it does not go far enough.
The Australian government is considering banning social media for under-16s – a move Technology Secretary Peter Kyle has previously said he is prepared to emulate.