Call for action as survey reveals 200,000 kids aged 11-17 possibly groomed online
More than 200,000 secondary school children may have been groomed online, research suggests.
A survey carried out on behalf of the NSPCC found that around 4% of young people aged 11 to 17 questioned had sent, received or been asked to send sexual content to an adult when using various sites and apps.
The charity said that one in 25 children had done so using Snapchat, Facebook or Facebook Messenger; one in 33 using Twitch or Twitter; and one in 50 using Instagram or WhatsApp.
A total of 2,004 young people aged 11 to 17 were asked if they had ever sent messages with sexual content, been sent a naked picture or video, or been asked to send such images, and the age of the person they interacted with.
NSPCC chief executive Peter Wanless said: "The scale of risk that children face on social networks revealed in this research cannot be ignored and tackling it with robust and comprehensive legislation needs to be a priority for this Government.
"Tech firms need to be forced to get a grip of the abuse taking place on their sites by using technology to identify suspicious behaviour and designing young people's accounts with built-in protections."
The NSPCC said there are 5,182,045 young people aged 11 to 17 in the UK, and therefore estimated that around 201,696 had sent, received or been asked to send explicit messages or images.
The charity has warned that paedophiles contact large numbers of children on social media and then encourage the ones who respond to move over to encrypted messaging or live streaming.
Those who are tricked into sending images, often after being threatened, can then be blackmailed into sending more.
Facebook's head of global safety Antigone Davis said: "Keeping young people safe on our platforms is a top priority for us.
"In addition to using technology to proactively detect grooming and prevent child sexual exploitation on our platform, we work with child protection experts, including specialist law enforcement teams like CEOP (Child Exploitation and Online Protection Command) in the UK, to keep young people safe.
"Ninety-nine percent of child nudity content is removed from our platform automatically."