Children ‘should be taught about the dangers of online sexual content’
The issue of sextortion and peer-on-peer abuse should form part of the school syllabus, says a report by Demos.
Schools should do more to teach children about the dangers of sharing sexual content online, a think tank report has recommended.
The scale of the problem means police and other law enforcement agencies should focus on those carrying out the abuse and making images rather than low-level offenders, the report suggested.
Artificial intelligence (AI) technology could also play a key role in tackling the problem of child sexual abuse images (CSAI), the paper by cross-party think tank Demos said.
In its report, which drew on evidence from experts, including industry watchdog the Internet Watch Foundation (IWF), Demos highlighted the growing problem of youngsters “sexting” – producing their own illegal material.
One-fifth of reported images in 2015 were “self-generated”; around 16% of young people aged 11-16 in the UK have reported sending sexual images; and one in six people reported to the police for indecent images are minors, according to research referenced in the report.
The Demos report said the issues should be part of the personal, social, health and economic (PSHE) curriculum in schools.
“The pitfalls of sharing content online, including sexual content, should form part of the syllabus,” the report said.
“As the amount of self-produced illegal content continues to increase, stopping this at its source is the only sensible response.
“Education is needed to support potential victims and perpetrators of sextortion and peer-on-peer abuse, a growing problem.”
Children’s charity NSPCC has estimated that up to 590,000 men in the UK may have viewed CSAI and the Demos report said the scale of the problem meant police should target “sophisticated” offenders at the top of the criminal pyramid – abusers and those who made the images.
“This is not being ‘soft on paedophiles’, but rather a sensible way of targeting limited resources,” the report said.
The use of AI and “deep learning” has great potential for spotting CSAI before it can be shared and for identifying victims, but the pace of technological change means it requires continued investment, the report said.
According to the IWF, less than 0.1% of identified CSAI content is hosted in the UK, down from 18% in 1996.
Some 60% of the material it identified was from Europe and 37% from North America.
IWF chairman Andrew Puddephatt will tell a meeting in Westminster that the UK is a world leader in the fight against CSAI.
“The model of independent self-regulation, which has been pioneered in the UK by the IWF, is working,” he will say.
“There is always a debate about how far governments should intervene in the oversight and regulation of the internet but the fight against child sexual abuse imagery shows how much can be achieved when the industry works together with everyday internet users.”
A separate paper, produced by the National Centre for Social Research (NatCen) found that perpetrators of online-facilitated abuse are “generally male, white, young, educated, intelligent, employed” and with less prior criminal history than “contact” offenders.
The report, drawn up for the Independent Inquiry into Child Sexual Abuse, found that victims of online abuse were likely to be vulnerable teenagers and that many police officers felt unprepared to investigate such cases.
Jeffrey DeMarco, research director at NatCen, said parents had a key role to play in preventing potential abuse.
“What really stands out is that, by creating an open dialogue with their child, parents can help to prevent them turning to strangers online for reassurance,” he said.
“As new technology and platforms become the norm, everyone involved with safeguarding children, from parents to police to internet service providers, needs to ensure their knowledge is up-to-date and relevant to the contemporary online landscape.”