Everything you need to know about Government plans to tackle online harm
A White Paper setting out measures to tackle online harms is due to be published on Monday.
Plans to make tech giants and social networks more accountable for harmful content online are set to be announced by the Government next week, in a bid to make the UK one of the safest places in the world to be online.
The move has been met with a mixed reaction: many have welcomed tighter rules, while others have expressed concerns about internet freedoms.
Here is everything you need to know about the long-awaited White Paper so far and what to expect:
– Why is the Government cracking down on online content?
The Government wants to tackle a whole range of online harms – including cyberbullying, revenge porn and hate speech – as well as access to content it believes damages people’s mental health, such as images of suicide, and the distribution of material relating to terrorism.
It has come to the conclusion that self-regulation is no longer working and therefore wants to introduce new legally-binding measures that make tech companies hosting the content responsible for blocking it or removing it swiftly.
“Vital discussions at G7 around online terror content – especially after horrors of Christchurch. I made clear our upcoming Online Harms White Paper will ensure social media firms take more responsibility. Much more global action needed in this area #G7France,” Home Secretary Sajid Javid tweeted on April 4, 2019.
The urgency to act has been highlighted by a number of cases, such as that of teenager Molly Russell, who was found to have viewed content linked to self-harm and suicide on Instagram before taking her own life in 2017.
More recently, material relating to terrorism has also been a concern, following the mosque attack in New Zealand which was livestreamed on Facebook.
– What is the White Paper likely to include?
According to leaked plans obtained by the Guardian, the White Paper could include ways to make social media bosses personally liable for harmful content on their platforms.
It is claimed that ministers will push to legislate for a new “duty of care” to be policed by an independent regulator, likely to be Ofcom initially, which will have the power to impose substantial fines.
Social media companies could be forced to produce transparency reports every year, detailing the amount of harmful content found on their platforms and how they addressed the issue.
It may also include rules requiring co-operation with police on any illegal harms identified.
– When will it be published?
The Government is planning to release the White Paper on online harm on April 8.
– Will this mean a new regulator?
According to reports, Ofcom may take on the task initially, but a new body could follow in the future.
– What do supporters say?
Supporters of tighter rules include Ian Russell, the father of teenager Molly Russell.
“Up until now they have chosen their own course,” Mr Russell said last month.
“Governments have allowed social media platforms to be self-regulated, but remember this really is a matter of life and death and it’s getting worse.”
“Now is the time for the UK Government to bring effective internet regulation, with strong sanctions as back-up.”
Children’s charity the NSPCC has also backed changes, with head of online child safety Andy Burrows saying: “Unless we have regulation that is capable of protecting children in the way we know is necessary, then we will see further tragedies with children coming to harm.”
– What do opponents say?
Opponents of plans so far are concerned that the measures could amount to censorship.
Freedom of expression organisation Article 19 said: “While social media companies could do more to address incitement to violence, child sexual exploitation and other problematic content on their platforms, the Government must not create an environment that encourages the censorship of legitimate expression.
“Article 19 strongly opposes any ‘duty of care’ being imposed on internet platforms. We believe a duty of care would inevitably require them to proactively monitor their networks and take a restrictive approach to content removal. Such actions could violate individuals’ rights to freedom of expression and privacy.”
– What do tech companies make of the prospect of regulation?
During a visit to Ireland earlier this week, Facebook boss Mark Zuckerberg said he would work with governments to establish new policies in a bid to regulate social media.
The firm’s founder and chief executive reportedly told the politicians that issues around child protection and age verification are of “huge concern” to him.
“Either way we’re going to have responsibility for making sure that we can police harmful content and get it off our services,” Mr Zuckerberg told RTE News.