Britain’s internet safety laws are “very patchy” and “unsatisfactory”, Technology Secretary Peter Kyle has said, following calls from campaigners for tougher rules.
On Saturday, Ian Russell, the father of Molly Russell, who took her own life aged 14 after seeing harmful content online, said the UK was “going backwards” on the issue.
In a letter to the Prime Minister, Mr Russell argued that the Online Safety Act, which aims to force tech giants to take more responsibility for the content of their sites, needed to be fixed and said that a “duty of care” should be imposed on businesses.
Speaking to the BBC’s Laura Kuenssberg, Kyle expressed his “frustration” with the law, which was passed by the previous Conservative government in 2023.
The Conservative government had initially included plans in the legislation to force social media companies to remove certain “legal but harmful” content, such as posts promoting eating disorders.
However, the proposal sparked a backlash from critics, including current Conservative leader Kemi Badenoch, who feared it could lead to censorship.
In July 2022, Badenoch, who was not then a minister, said the bill was “not fit to become law”, adding: “We should not legislate to hurt feelings”.
Another Conservative MP, David Davis, said it risked being “the biggest accidental restriction of free speech in modern history”.
The plan was scrapped for adult social media users; instead, companies were required to give users more control to filter out content they did not want to see. The law still requires companies to protect children from legal but harmful content.
Kyle said the section on legal but harmful content had been removed from the bill, adding: “So I have inherited a landscape where we have a very uneven and unsatisfactory legislative settlement.”
He did not commit to making changes to the current legislation but said he was “very open-minded” on the subject.
He also said the law contained “very good powers” which he was using to “confidently” tackle new safety issues, and that in the coming months ministers would get the powers needed to ensure online platforms provide age-appropriate content.
Companies that do not comply with the law would face “very severe” sanctions, he added.
Following the interview, a Whitehall source told the BBC that the government was not considering repealing the Online Safety Act or introducing a second law, but was instead working within the confines of what ministers regard as its limitations.
Ministers were not ruling out further legislative measures but wanted to “be agile and quick” to keep up with rapidly evolving trends, the source said.
In his letter, Ian Russell argued that “worrying” changes in the tech industry are putting increased pressure on the government to act.
He said Mark Zuckerberg, the boss of Meta, owner of Facebook and Instagram, and Elon Musk, owner of social media site X, were “at the forefront of a global recalibration of the industry.”
He accused Zuckerberg of moving away from safety towards a “laissez-faire, anything-goes model” and of “returning to the harmful content that Molly was exposed to.”
Earlier this week, Zuckerberg said Meta would get rid of fact-checkers and instead adopt a system of “community notes” – already introduced by rival platform X – in which users add context to posts they judge to be false.
This marks a change from Meta’s previous approach, introduced in 2016, where third-party moderators would fact-check posts on Facebook and Instagram that appeared false or misleading.
Content flagged as inaccurate would be moved lower in users’ feeds and accompanied by labels offering viewers more information on the topic.
Defending the new system, Zuckerberg said moderators were “too politically biased” and it was “time to get back to our roots around free speech.”
The move comes as Meta seeks to improve relations with new US President Donald Trump, who has previously accused the company of censoring right-wing voices.
Zuckerberg said the change – which only applies in the US – would mean content moderators would “detect fewer bad things”, but would also reduce the number of “innocent” posts removed.
Responding to Russell’s criticism, a Meta spokesperson told the BBC that there was “no change to the way we deal with content that encourages suicide, self-harm and eating disorders” and said the company would “continue to use our automated systems to seek out this high-severity content”.
Asked about the change, Kyle said the announcement was “an American statement for users of American services”, adding: “There is one thing that has not changed and that is the law of this country.”
“If you come to do business in this country, you follow the law, and the law says that illegal content must be removed,” he said.
Online Safety Act rules, due to come into force later this year, require social media companies to show they are removing illegal content, such as child sexual abuse material, content that encourages violence and posts that encourage or assist suicide.
The law also states that companies must protect children from harmful content, including pornography, content that encourages self-harm, bullying, and content that encourages dangerous stunts.
Platforms will need to adopt “age-assurance technologies” to prevent children from viewing harmful content.
The law also requires companies to take action against illegal state-sponsored disinformation. If their services are likely to be used by children, they must also take steps to protect users from misinformation.