Thousands of online grooming crimes in Scotland during past five years

  • NSPCC publishes new research highlighting a range of tools tech companies, Ofcom, and Government can employ to protect children from perpetrators

More than 3,000 online grooming crimes have been recorded by Police Scotland during the past five years, new data compiled by the NSPCC reveals.   

The figures published by the Scottish Government show that 3,158 Communicating Indecently with a Child offences have been recorded since 2020.   

The charity points out that while these are the offences recorded by police, the real number of crimes is likely to be much higher, because abuse often takes place in private spaces where harm is harder to detect.

The NSPCC is highlighting these offences as it publishes new research to tackle the issue, setting out solutions that can be used to prevent, detect and disrupt grooming in private messaging spaces. Online child sexual abuse crimes can have a long-term impact on a child, leaving them with feelings of guilt, shame, depression, confusion, anxiety and fear.

One 14-year-old who contacted Childline said: “I feel so insecure all the time, so, when this guy I’ve met online, who’s a few years older, started flirting with me, that made me feel so special.

“He seemed to care, but now he’s insisting I send him nudes, and I don’t know if he just gave me attention, so I’d send him nudes. I feel like I’ve been tricked but I’m afraid what he might do if I just block him.

“I can’t control how anxious this makes me feel.”

The charity’s new research identifies cycles of behaviours that perpetrators use, such as creating multiple different profiles and manipulating young users to engage with them across different platforms.

In response, the NSPCC is urging Ofcom and tech companies to take swift action on the recommendations set out in the report, so that they can better identify and prevent online grooming.

Recommendations include:

  • Implementing tools on a child’s phone that can scan for nude images and identify child sexual abuse material before it is shared.
  • Using metadata analysis, which draws on background information (when, where, and how someone is using a platform) to spot suspicious patterns. It does not read private messages, but it can flag behaviours that suggest grooming, such as adults repeatedly contacting large numbers of children or creating fake profiles.
  • Creating barriers for adult profiles engaging children on social media platforms, such as restrictions on who they can search for and how many people they can contact.
  • Securing commitments from tech platform leaders to deliver services that effectively support and balance user safety and privacy.
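To illustrate the metadata-analysis recommendation above: the idea is to flag suspicious behaviour from "who contacted whom" records alone, without ever reading message content. The sketch below is a hypothetical illustration only, not code from the NSPCC report; the event format, threshold, and function name are assumptions made for this example.

```python
from collections import defaultdict

def flag_suspicious_senders(contact_events, max_child_contacts=10):
    """Return sender IDs that contacted more than `max_child_contacts`
    distinct child accounts.

    Each event is a tuple (sender_id, recipient_id, recipient_is_child).
    Only contact metadata is examined; message content is never touched.
    """
    child_contacts = defaultdict(set)  # sender -> set of distinct child recipients
    for sender, recipient, recipient_is_child in contact_events:
        if recipient_is_child:
            child_contacts[sender].add(recipient)
    # Flag senders whose distinct child contacts exceed the threshold.
    return {sender for sender, kids in child_contacts.items()
            if len(kids) > max_child_contacts}
```

In practice a platform would combine several such signals (account age, profile churn, cross-platform patterns) rather than a single threshold, but the principle is the same: behavioural patterns in metadata can be flagged while private messages stay private.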

The research shows that, to be effective, these safety measures must be introduced together, working in tandem to prevent harm across the grooming cycle.

The NSPCC is urging tech companies, Ofcom, and Government to take leadership on addressing this devastating crime and commit to using every tool available to them to stop perpetrators in their tracks.

Chris Sherwood, NSPCC Chief Executive, said: “At Childline, we hear first-hand how grooming can devastate young lives. The trauma doesn’t end when the messages stop, it can leave children battling anxiety, depression, and shame for years.

“Tech companies must act now to prevent further escalation. The tools the NSPCC sets out to protect children are ready to use and urgently needed. Importantly, they mean that services can keep children safe while protecting all users’ privacy. Children’s safety must be built into platform design from the start, not treated as an afterthought.”

Kerry Smith, Chief Executive of the Internet Watch Foundation (IWF) said: “The internet has opened a door into millions of homes, giving predators access to children.

“Safety should be something which is built into all services and platforms from the bottom up, not tacked on as an afterthought. There should be absolutely nowhere for predators to hide online.

“Tech companies must do everything they can, including in end-to-end encrypted spaces, to keep children safe. It is clear now that this can be done effectively without compromising users’ privacy. There really is no excuse – and the alternative is allowing children to continue to suffer.”

More than 3,500 online grooming crimes against children recorded by Police Scotland while safety laws discussed

  • NSPCC urges tech companies and MPs to back Online Safety Bill following new research on scale of online grooming
  • Primary school children targeted in more than half of online grooming crimes in Scotland since social media regulation was first demanded

More than 3,500 online grooming crimes have been recorded by Police Scotland while children have been waiting for online safety laws, new figures published by the NSPCC reveal today.

Data from Police Scotland shows 593 Communicating Indecently with a Child offences were recorded last year (2022/23).

The new research shows that in Scotland, 1,873 offences took place against primary school children, with under-13s making up more than half of victims.

The new analysis of the scale of child sexual abuse taking place on social media comes ahead of MPs and Lords making final decisions on the Online Safety Bill next month.

The NSPCC first called for social media regulation to protect children from sexual abuse in 2017 and has been campaigning for robust legislation ever since.

The charity said the number of offences is likely to be far higher than those known to police, and it is urging politicians on all sides to support the Bill in its final stages and pass this vital legislation.

Aoife (19) from East Kilbride, South Lanarkshire, was exploited online when she was 15 by an adult male who pretended to be a teenager.

The man convinced her to send him images of herself and blackmailed her with these to control her behaviour. When his demands became increasingly intense and frightening, Aoife plucked up the courage to tell her mum and teachers, who helped her to report it to the police.

Aoife said: “When I found out I’d been talking to an older man I was petrified. I remember it was 3am and I was sitting in my room, just shaking. I felt like I was the only person in the world and started crying.

“I wanted my mum, and while she was just in the room next door I thought I couldn’t tell her because it’s so embarrassing, but all I wanted was a hug from her.”

A draft Online Safety Bill was published over two years ago but regulation was first promised by Government in 2018 following the NSPCC’s call for action and the launch of its Wild West Web campaign.

The charity has been campaigning for strong legislation ever since, working closely with survivors, Government, Parliamentarians, and other civil society groups to ensure it effectively tackles the way social media and gaming sites contribute to child sexual abuse.

The legislation will mean tech companies have a legal duty of care for young users and must assess their products for child abuse risks and put mitigations in place to protect children.

It will give the regulator Ofcom powers to address significant abuse taking place in private messaging and require companies to put safeguards in place to identify and disrupt abuse in end-to-end encrypted environments.

The NSPCC said these measures are vital to effectively protect children from the most insidious abuse and recent polling shows they are backed by more than seven in ten voters.

Sir Peter Wanless, NSPCC Chief Executive said: “Today’s research highlights the sheer scale of child abuse happening on social media and the human cost of fundamentally unsafe products.

“The number of offences must serve as a reminder of why the Online Safety Bill is so important and why the ground-breaking protections it will give children are desperately needed.

“We’re pleased the Government has listened and strengthened the legislation so companies must tackle how their sites contribute to child sexual abuse in a tough but proportionate way, including in private messaging.

“It’s now up to tech firms, including those highlighted by these stark figures today, to make sure their current sites and future services do not put children at unacceptable risk of abuse.”

As well as winning the commitment to legislate, the NSPCC has helped shape significant gains for children in the Online Safety Bill as it has passed through Parliament, including:

  • Senior tech bosses will be held criminally liable for significant failures that put children at risk of sexual abuse and other harm.
  • Girls will be given specific protections as Ofcom will produce guidance on tackling Violence Against Women and Girls for companies to follow.
  • Companies will have to crack down on so-called tribute pages and breadcrumbing that use legal but often stolen images of children and child accounts to form networks of offenders to facilitate child sexual abuse.
  • Sites will have to consider how grooming pathways travel across various social media apps and games and work together to prevent abuse spreading across different platforms.

The NSPCC is still seeking assurances that the legislation will effectively regulate AI and immersive technology, and wants an online child safety advocacy body specifically to speak with and for children as part of the day-to-day regulatory regime. It argues that this would help spot emerging risks and fight for the interests and safety of children before tragedies arise.

The charity is asking campaigners to contact MPs with personal messages about why they should act to make the online world safer for children and pass a robust Online Safety Bill in the coming weeks.