- More than 7,500 crimes recorded in England and Wales in just two years since the new offence came into force
- Over one in five offences were against children aged 11 or younger
- Police figures show instances of grooming on Instagram have doubled
- Over 100 offences were recorded in Devon and Cornwall last year
- Legislation to protect children online must be prioritised
Grooming crimes recorded by police in Devon and Cornwall have almost doubled in the last year, data obtained by the NSPCC has revealed.
There were 109 offences of sexual communication with a child recorded by Devon and Cornwall Police in the year to April 2019 compared with 58 in the previous year.
In England and Wales, there were 4,373 offences of sexual communication with a child recorded in the year to April 2019 compared with 3,217 in the previous year. The offence came into force on April 3, 2017, following an NSPCC campaign.
The data obtained from 43 police forces in England and Wales under Freedom of Information laws also revealed that, where age was provided, one in five victims were aged just 11 or younger.
In 2018/19 in England and Wales, the number of recorded instances involving Instagram, which is owned by Facebook, was more than double that of the previous year.
Overall in the last two years nationally, Facebook-owned apps (Facebook, Messenger, Instagram, WhatsApp) and Snapchat were used in more than 70% of the instances where police recorded and provided the communication method.
Instagram was used in more than a quarter of them.
The Government has indicated it will publish a draft Online Harms Bill early next year, following the NSPCC’s Wild West Web campaign. The proposals would introduce independent regulation of social networks, with tough sanctions if they fail to keep children safe on their platforms.
The NSPCC believes it is now crucial that Boris Johnson’s Government makes a public commitment to draw up these Online Harms laws and implement robust regulation for tech firms to force them to protect children as a matter of urgency.
Peter Wanless, NSPCC Chief Executive, said: “It’s now clearer than ever that Government has no time to lose in getting tough on these tech firms.
“Despite the huge amount of pressure that social networks have come under to put basic protections in place, children are being groomed and abused on their platforms every single day. These figures are yet more evidence that social networks simply won’t act unless they are forced to by law. The Government needs to stand firm and bring in regulation without delay.”
Freya* was 12 when, while she was staying at a friend’s house, a stranger bombarded her Instagram account with sexual messages and videos.
Her mum Pippa* told the NSPCC: “She was quiet and seemed on edge when she came home the next day. I noticed her shaking and knew there was something wrong so encouraged her to tell me what the problem was.
“When she showed me the messages, I just felt sick. It was such a violation and he was so persistent. He knew she was 12, but he kept bombarding her with texts and explicit videos and images. Freya* didn’t even understand what she was looking at. There were pages and pages of messages, he just didn’t give up.
“Our children should be safe in their bedrooms, but they’re not. They should be safe from messages from strangers if their accounts are on private, but they’re not.”
The NSPCC’s Wild West Web campaign is calling for social media regulation to require platforms to:
- Take proactive action to identify and prevent grooming on their sites by:
  - Using Artificial Intelligence to detect suspicious behaviour
  - Sharing data with other platforms to better understand the methods offenders use and flag suspicious accounts
  - Turning off friend suggestion algorithms for children and young people, as they make it easier for groomers to identify and target children
- Design young people’s accounts with the highest privacy settings, such as geo-locators off by default, contact details private and unsearchable, and live streaming limited to contacts only
The charity wants to see tough sanctions for tech firms that fail to protect their young users – including steep fines for companies, boardroom bans for directors, and a new criminal offence for platforms that commit gross breaches of the duty of care.