Thousands of online grooming crimes in Scotland during past five years

  • NSPCC publishes new research highlighting a range of tools tech companies, Ofcom, and Government can employ to protect children from perpetrators

More than 3,000 online grooming crimes have been recorded by Police Scotland during the past five years, new data compiled by the NSPCC reveals.   

The figures published by the Scottish Government show that 3,158 Communicating Indecently with a Child offences have been recorded since 2020.   

The charity points out that while these are the offences recorded by police, the real number of crimes is likely to be much higher, as abuse often happens in private spaces where harm is harder to detect.

The NSPCC is highlighting these offences as it publishes new research to tackle the issue, setting out solutions that can be used to prevent, detect and disrupt grooming in private messaging spaces. Online child sexual abuse crimes can have a long-term impact on a child, leaving them with feelings of guilt, shame, depression, confusion, anxiety and fear.

One 14-year-old who contacted Childline said: “I feel so insecure all the time, so, when this guy I’ve met online, who’s a few years older, started flirting with me, that made me feel so special.

“He seemed to care, but now he’s insisting I send him nudes, and I don’t know if he just gave me attention, so I’d send him nudes. I feel like I’ve been tricked but I’m afraid what he might do if I just block him.

“I can’t control how anxious this makes me feel.”

The charity’s new research identifies cycles of behaviours that perpetrators use, such as creating multiple different profiles and manipulating young users to engage with them across different platforms.

In response, the NSPCC is urging Ofcom and tech companies to take swift action on the recommendations set out in the report, so that they can better identify and prevent online grooming.

Recommendations include:

  • Implementing tools on a child’s phone that can scan for nude images and identify child sexual abuse material before it is shared.
  • Using metadata analysis, which uses background information, like when, where, and how someone is using a platform, to spot suspicious patterns. It does not read private messages, but it can flag behaviours that suggest grooming, such as adults repeatedly contacting large numbers of children or creating fake profiles.
  • Creating barriers for adult profiles engaging children on social media platforms, such as restrictions on who they can search for and how many people they can contact.
  • Securing commitments from tech platform leaders to deliver services that effectively support and balance user safety and privacy.
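To illustrate the metadata-analysis approach in the list above, here is a minimal sketch of how a platform might flag suspicious contact patterns from metadata alone, without reading any message content. All names, thresholds, and data shapes are hypothetical, not drawn from the NSPCC report or any real platform:

```python
from collections import defaultdict

def flag_suspicious_accounts(contact_events, threshold=10):
    """Flag adult accounts that initiate contact with an unusually large
    number of distinct child accounts, one of the behaviour patterns the
    research associates with grooming.

    contact_events: iterable of (adult_account, child_account) pairs,
    derived purely from platform metadata (who contacted whom), never
    from private message content.
    """
    contacts = defaultdict(set)
    for adult, child in contact_events:
        contacts[adult].add(child)
    # An account exceeding the (hypothetical) threshold of distinct
    # child contacts is surfaced for human review, not auto-actioned.
    return {adult for adult, children in contacts.items()
            if len(children) >= threshold}
```

In practice such a signal would be one input among many (account age, profile-creation patterns, cross-platform behaviour) rather than a standalone rule, but it shows how behavioural flags can coexist with message privacy.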

The research shows that, to be effective, these safety measures must be introduced together, working in tandem to prevent harm across the grooming cycle.

The NSPCC is urging tech companies, Ofcom, and Government to take leadership on addressing this devastating crime and commit to using every tool available to them to stop perpetrators in their tracks.

Chris Sherwood, NSPCC Chief Executive, said: “At Childline, we hear first-hand how grooming can devastate young lives. The trauma doesn’t end when the messages stop; it can leave children battling anxiety, depression, and shame for years.

“Tech companies must act now to prevent further escalation. The tools the NSPCC sets out to protect children are ready to use and urgently needed. Importantly, they mean that services can keep children safe while protecting all users’ privacy. Children’s safety must be built into platform design from the start, not treated as an afterthought.”

Kerry Smith, Chief Executive of the Internet Watch Foundation (IWF), said: “The internet has opened a door into millions of homes, giving predators access to children.

“Safety should be something which is built into all services and platforms from the bottom up, not tacked on as an afterthought. There should be absolutely nowhere for predators to hide online.

“Tech companies must do everything they can, including in end-to-end encrypted spaces, to keep children safe. It is clear now that this can be done effectively without compromising users’ privacy. There really is no excuse – and the alternative is allowing children to continue to suffer.”