Online child grooming offences rise in Scotland

  • NSPCC asks Boris Johnson to publicly commit to having world-leading online harms legislation on the statute book within 18 months
  • Social media sites are ‘enabling offenders’ as recorded crimes, which include online grooming offences, rise above 2,500 in five years in Scotland

Crimes of communicating a sexual message to a child have increased by more than 80 per cent in five years in Scotland, the NSPCC has revealed.

New figures obtained via a freedom of information request show that 651 offences of Communicating Indecently with a Child were recorded by Police Scotland in the last year, compared to 354 crimes in 2014/15 – an increase of 84 per cent.

In the year to April 2020 the rise was 12 per cent, but the NSPCC is warning there could be a sharper increase this year due to the unique threats posed by coronavirus, which are being exacerbated by years of industry failure to design basic child protection into platforms.

The charity is now calling on the UK Prime Minister to urgently press ahead with legislation that would help prevent offenders from using social media to target children for sexual abuse.

NSPCC analysis of data on an equivalent offence from police forces in England and Wales revealed that Facebook-owned apps were used in 55 per cent of cases recorded between April 2017 and October 2019 where police logged information about how a child was groomed. This data was not available from Police Scotland.

Emily* was 13 when she exchanged messages and photos with a man she believed to be 15 on Facebook and Snapchat. The man turned out to be 24 and sexually abused her.

Emily’s mum, Wendy*, said: “It’s important for social media to be regulated and for Facebook and Instagram to take more responsibility to keep the people who use their platform safe. All other businesses have a Duty of Care to keep children safe, so why not them?”

In February, then UK Government Digital Minister Matt Warman promised to publish an Online Harms Bill during the current UK parliamentary session following proposals set out in a White Paper.

These proposals set out independent regulation of social networks with potential criminal sanctions if tech directors fail to keep children living in the UK safe on their platforms.

However, frustration is growing at delays to the legislation, with the UK Government's full response to the White Paper consultation now not expected until the end of the year and concerns that a regulator might not be in place until 2023.

These concerns have been voiced by the chairs of both the UK Parliament's Home Affairs and Digital, Culture, Media and Sport committees, which scrutinise the work of the UK Government departments responsible for online harms.

The NSPCC is calling on the UK Prime Minister to deliver, within 18 months, an Online Harms Bill that sets out a Duty of Care on tech firms to make their sites safer for children.

The charity wants his Government to publish a roadmap that sets out the timescales for a world-leading Bill to go through Westminster as a matter of urgency.

NSPCC Chief Executive Peter Wanless spoke to Boris Johnson at a hidden harms round table last week and highlighted how coronavirus had created a perfect storm for abusers because platforms hadn’t done enough to tackle safety risks going into the crisis. He urged the Prime Minister to ensure there is no unnecessary delay to legislation.

Mr Wanless said: “Child abuse is an inconvenient truth for tech bosses who have failed to make their sites safe and enabled offenders to use them as a playground in which to groom our kids.

“Last week the Prime Minister signalled to me his determination to stand up to Silicon Valley and make the UK the world leader in online safety. He can do this by committing to an Online Harms Bill that puts a legal Duty of Care on big tech to proactively identify and manage safety risks.

“Now is the time to get regulation done and create a watchdog with the teeth to hold tech directors criminally accountable if their platforms allow children to come to serious but avoidable harm.”

The NSPCC says the Online Harms Bill should:

  • Enforce a Duty of Care on tech companies to identify and mitigate reasonably foreseeable risks on their platforms, including at the design stage, to proactively protect users from harm
  • Create a regulator that can hand out GDPR-equivalent fines – up to 4% of global turnover – and hold named directors criminally accountable for the most serious breaches of their Duty of Care
  • Give the regulator robust powers to investigate companies and request information
  • Create a culture of transparency by legally compelling tech firms to disclose any breaches of the Duty of Care and major design changes to their platforms.

Published by

davepickering

Edinburgh reporter and photographer