9/10 children want tech laws to make them safer

Tech firms should protect children from sexual, self-harm, suicide and violent content online, say 91% of young people in an NSPCC survey.

The results also reveal that more than nine out of 10 children agree these companies should have a legal responsibility to keep them safe on their platforms.

These figures come as the NSPCC, joined by people whose lives have been deeply affected by online harms, handed in its Wild West Web campaign petition to 10 Downing Street yesterday.

Pictured: NSPCC chief executive Peter Wanless is joined by Ruth Moss, Ian Russell and Andy Burrows (NSPCC head of online safety policy) to hand in the charity’s Wild West Web petition, with almost 46,000 signatures, at 10 Downing Street.

The petition has almost 46,000 signatures, with 4,405 coming from the South West, showing overwhelming support for the charity’s call for the Government to bring in a statutory regulator that would force tech firms to better protect children online.

NSPCC chief executive Peter Wanless was joined by Ruth Moss and Ian Russell, who have supported the charity in its campaign and spoken out about their experiences in a bid to see change.

The charity is now calling on the next Prime Minister to act quickly in bringing together comprehensive legislation that will force tech firms to exercise a duty of care and protect children from abuse and harmful content on their platforms.

The NSPCC believes it is essential that the regulator has robust information disclosure and investigatory powers and that named directors of tech firms face personal liability for significant breaches of a company’s duty of care.

The charity’s latest survey, which involved 2,004 children aged 11 to 16, revealed that nine out of 10 said they had a social media account. More than half of those had accounts on Facebook, Instagram, WhatsApp, YouTube and Snapchat.

More than 90% agreed that social media platforms should protect them from:

  • Inappropriate content and behaviour (92%)
  • Violent content (91%)
  • Sexual content (92%)
  • Content about self-harm and suicide (91%)
  • Bullying (92%)

Peter Wanless, NSPCC Chief Executive, said: “In recent months we have seen the breadth of consensus for social networks to take proper responsibility for protecting children from abuse and harmful content on their platforms. The thousands of signatures on our Wild West Web campaign alone demonstrate the strength of feeling out there.

“But this latest research could not be clearer; children themselves want to go online without the fear of seeing graphic and disturbing material and being vulnerable to abuse.

“It is imperative that the new Prime Minister treats this issue as an utmost priority and that the Government now works swiftly to build on its bold and ambitious proposals and brings in legislation that will make the UK the safest place for children to be online.”

Ruth Moss, whose daughter Sophie took her own life at the age of 13 after looking at self-harm and suicide content on social media, said: “Children are protected by legislation in so many aspects of life, including traditional media. We would be horrified if our children were exposed to abuse or damaging imagery in films, television or the press, so why should the internet and social media be any different?

“And with WiFi available in so many public areas and most children having social media accounts, it would be naïve to think that parents can manage this issue alone. Even children themselves recognise this.

“Therefore, it’s essential that this regulation is implemented as a priority. We owe this to future generations of our children.”

The survey also revealed that more than eight in 10 children believe it is important for social media platforms to:

  • Make it easy to take down a post (93%)
  • Have tools to help children remove posts (91%)
  • Prioritise requests from children to remove content (89%)
  • Make it harder to share screenshotted content (80%)
