Tough web laws plan hailed by Belfast woman who experienced dark side of social media
A Belfast woman recovering from self-harm and an eating disorder has welcomed Government plans to introduce "world first" internet safety laws designed to make the UK the safest place in the world to be online.
A Government White Paper on online harm proposes strict new rules to make firms take responsibility for their users and their safety, as well as the content which appears on their services.
Published jointly by the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, it suggests punishing social media companies with large fines or blocking their sites from being accessed.
Under the oversight of an independent regulator, internet companies that break these rules could even see senior management held personally liable for the failings.
A newly introduced duty of care will require firms to take more responsibility for the safety of users and to tackle more actively the harm caused by content or activity on their platforms.
The regulator will have the power to issue "substantial fines, block access to sites and potentially impose liability on individual members of senior management".
Lucy Grainger (21) said that while the proposals are welcome, there is always "room for improvement" when it comes to helping to keep users safe from potentially dangerous images or posts.
Lucy witnessed her father die from a heart attack when she was three and, after trauma in her childhood, she developed bulimia and started self-harming.
By the age of 13 she had attempted suicide and her life was spiralling out of control.
She has previously spoken of how social media can be a "double-edged sword", admitting that while social media helped her to find a support network in her recovery, she has to be careful about its negative side, such as bullying and the feelings of anxiety it can cause.
Despite her challenges, Lucy managed to complete her GCSEs, but she spent the following few years in and out of psychiatric wards and sleeping on friends' sofas.
Tragically, during this time Lucy lost several friends to suicide, while she herself made more than 100 suicide attempts.
Now an outspoken mental health advocate, she said social media firms have a duty of care to their users but believes that many of them are failing at this.
"I fail to see how it is possible for all this content to be removed, as it is next to impossible to review anything and everything that is put out there," she said. "I think as well as this ban, there should be more education in schools and colleges and for parents to teach them quite how dangerous social media can be.
"It is also vital that users take more responsibility, although at the same time, they should have freedom to post what they want within reason and the law.
"Bringing in these rules might make social media firms more vigilant."
"For these rules to come in it would be necessary for regulations and guidelines on social media to be more specific, so that it is clearer on what users can and cannot post," Lucy added.
Her stance has been echoed by Home Secretary Sajid Javid, who insisted yesterday that tech firms have a "moral duty" to protect the young people they "profit from".
"Despite our repeated calls to action, harmful and illegal content - including child abuse and terrorism - is still too readily available online," he said.
"That is why we are forcing these firms to clean up their act once and for all."
A 12-week consultation will now take place before ministers publish the draft legislation.
The proposed measures are a response to concerns over the growth of violent content, the encouragement of suicide, the spread of disinformation and the exposure of children to cyber-bullying and other inappropriate material online.
The proposed new laws will apply to any company that lets users share or discover user-generated content or interact with each other online - from social media platforms to file-hosting sites, regardless of size - the Government has said.
It also calls for powers to be given to a regulator that would force internet firms to publish annual transparency reports - which Facebook and Twitter currently do - on the harmful content on their platforms and what measures they are taking to address it.