The Government has insisted it will take tough action against technology companies which do not adhere to its planned online harms regulations.
Home Secretary Priti Patel said Ofcom, in its proposed new role as internet regulator, “will have teeth” to ensure social media companies follow its new duty of care rules.
Writing in the Daily Telegraph, Ms Patel said she would soon be announcing the “tough” enforcement tools at the disposal of Ofcom, which form part of the Government’s plans to regulate social media platforms.
“With nearly 80% of young people saying they have experienced a harm while browsing online, regulating this space is not just logical but necessary,” she said.
“Ofcom will have teeth when it comes to holding tech companies to account.”
On Wednesday, the Government published its first response to a consultation on its online harms White Paper, released last year, and set out its plans to empower Ofcom to hold internet companies to account if they fail to protect users from harmful material.
The White Paper proposed stricter regulation for online companies, including large fines and personal liability for executives of firms found to be in breach of a statutory duty of care.
However, the Government did not set out potential enforcement powers in its first response, and will instead publish those details in its full response, due in the spring.
Tory MP Julian Knight, chairman-elect of the Commons Digital, Culture, Media and Sport (DCMS) Committee, said the punishments must be robust and include the threat of prison sentences.
“The regulator must take a muscular approach and be able to enforce change through sanctions that bite,” he said.
“That means more than a hefty fine – it means having the clout to disrupt the activities of businesses that fail to comply and, ultimately, the threat of a prison sentence for breaking the law.”
However, Digital Minister Matt Warman played down the prospect of criminal sanctions, describing them on Wednesday as “an extreme option”, though he confirmed that the scope of the watchdog’s powers could include heavy fines and the naming and shaming of individual executives.
Concerns about the impact of social media on vulnerable people have been heightened by cases such as that of 14-year-old schoolgirl Molly Russell, who took her own life in 2017 and was found to have viewed harmful content online.
Molly’s father, Ian, who now campaigns for online safety, spoke of the urgent need for tighter regulation in the foreword to a Royal College of Psychiatrists report, in which he described the “wrecking ball of suicide” that “smashed brutally” into his family, blaming “pushy algorithms”.
He said of his daughter’s social media accounts: “Among the usual schoolfriends, pop groups and celebrities followed by 14-year-olds, we found bleak depressive material, graphic self-harm content and suicide-encouraging memes.
“I have no doubt that social media helped kill my daughter.”
While the decision to appoint a social media regulator has been widely praised by children’s and victims’ charities, privacy campaigners have warned that the move is a threat to free speech.
Jim Killock, executive director of the Open Rights Group, said: “Private companies would be deciding what is legal or illegal, and would always remove more than they need to, rather than less.
“Instead, the Government should seek to ensure that companies have sufficient independent scrutiny of their actions. This is known as co-regulation, and could be supervised by Ofcom.”
The trade body representing many social media companies – the Internet Association – also said it wanted further debate with the Government on the proposals.
The group, which represents internet firms including Amazon, Facebook, Google, Microsoft and Twitter, said several “issues of concern” remained, including the potential punishments for not removing content which could be considered harmful but is not illegal.