Thousands of online grooming crimes in Scotland during past five years

  • More than 3,000 Communicating Indecently with a Child offences have been recorded by Police Scotland during the past five years
  • NSPCC urges Ofcom to significantly strengthen its approach to child sexual abuse and for the UK Government to ensure the regulator can tackle grooming in private messaging

Over 3,000 online grooming crimes across Scotland have been recorded by Police Scotland during the past five years, new data compiled by the NSPCC has revealed.   

The figures provided by Police Scotland show 3,234 Communicating Indecently with a Child offences were recorded since 2019, with 672 offences recorded last year (2023/24) – an increase of 13% from the previous year.  

The NSPCC has issued these findings a year on from the Online Safety Act being passed.

The charity is urging Ofcom to significantly strengthen the rules social media platforms must follow to tackle child sexual abuse on their products.

They say the regulator currently puts too much focus on acting after harm has taken place rather than being proactive to ensure the design features of social media apps are not contributing to abuse.

The NSPCC is also calling on the Government to strengthen legislation to ensure child sexual abuse is disrupted in private messages such as on Snapchat and WhatsApp.

The charity’s Voice of Online Youth young people’s group were not surprised at the prevalence of Snapchat in offences.

Liidia, 13, from Glasgow, said: “Snapchat has disappearing messages, and that makes it easier for people to hide things they shouldn’t be doing.

“Another problem is that Snapchat has this feature where you can show your location to everyone. If you’re not careful, you might end up showing where you are to people you don’t know, which is super risky.

“And honestly, not all the rules in Snapchat are strict, so some people take advantage of that to do bad things. Apps should have better ways for us to report bad things, and they should always get updated to protect us better with the latest security tech.”

Sir Peter Wanless, NSPCC Chief Executive, said: “One year since the Online Safety Act became law and we are still waiting for tech companies to make their platforms safe for children.

“We need ambitious regulation by Ofcom who must significantly strengthen their current approach to make companies address how their products are being exploited by offenders.

“It is clear that much of this abuse is taking place in private messaging which is why we also need the UK Government to strengthen the Online Safety Act to give Ofcom more legal certainty to tackle child sexual abuse on the likes of Snapchat and WhatsApp.”

National Police Chiefs’ Council Lead for Child Protection and Abuse Investigations (CPAI) Becky Riggs said: “The numbers in this NSPCC data are shocking and policing joins partners in urging tech companies and Ofcom to fulfil their legal and moral obligations to keep children safe from harm within the online communities they have created.

“A year on from the Online Safety Act being passed, it is imperative that the responsibility of safeguarding children online is placed with the companies who create spaces for them, and the regulator strengthens rules that social media platforms must follow.

“Policing will not stop in its fight against those who commit these horrific crimes. We cannot do this alone, so while we continue to pursue and prosecute those who abuse and exploit children, we repeat our call for more to be done by companies in this space.”

Hundreds of children safeguarded as online abuse reports increase

Hundreds of children have been safeguarded by police enforcement as reports of online child sexual abuse increased during the last year, information released today by Police Scotland shows.

Police Scotland’s 2020-21 Quarter 4 Performance Report and Management Information showed there were a total of 1,966 child sexual abuse crimes recorded during the year, an increase of 5.9% compared to the previous year (1,857) and 24.9% greater than the five-year average of 1,574.

The Performance Report outlines the safeguarding of 434 children through the enforcement of 649 National Online Child Abuse Prevention (NOCAP) packages between September 2020 and March this year.

NOCAP packages provide intelligence and evidence which underpins investigations carried out to identify and arrest online child abusers.

Deputy Chief Constable Fiona Taylor said: “The rise in reports of online child sexual abuse has continued and accelerated during this period, and the Performance Report draws attention to vital work to safeguard hundreds of children through the enforcement of National Online Child Abuse Prevention (NOCAP) packages.

“Online child sexual abuse is a national threat and tackling it is a priority for Police Scotland. The implementation of our Cyber Strategy will ensure we continue to build capacity and capability to keep people safe in the virtual space.”

The reports also provide an insight into the effect of coronavirus restrictions on the policing needs and requirements of communities during 2020-21.

Crime reports fell overall, with 6,361 fewer violent crimes reported compared to the previous year, a decrease of 10% while there were also 55 fewer road fatalities, decreasing 33% from 165 to 110.

Acquisitive crime, such as housebreakings and shoplifting, fell by 18% year on year (from 109,460 to 89,731).

Detection rates increased in a number of crime categories where reported offences had decreased, including overall violent crime (up 3.3 percentage points) and acquisitive crime (up 0.3 percentage points).

However reported frauds increased by 38.2% from 10,875 in 2019-20 to 15,031 during the last year, up 78.1% on the five-year average of 8,439 reported crimes.

DCC Taylor said: “The reporting year 2020-21 was truly an exceptional period, covering from just a few days after the country first entered lockdown up until the beginning of April 2021.

“While it may be years before some of the changes to how people live their lives and to the nature of crime are fully understood, this information demonstrates the significant impact coronavirus restrictions have had on reported crime, detection rates and other policing requirements during this unique time.

“Overall violent crime reduced by around 10% year on year. Year on year increases of violent crime were reported during only the months of July and August, when restrictions had been eased.

“Acquisitive crime, such as shoplifting, also declined overall by almost a fifth compared to the year before and against the five-year average.

“The number of people killed and seriously injured on our roads is down about a third on the year before.

“While this is to be welcomed, it is important to note reductions in reported crime did not occur in every category.

“As restrictions ease, we will continue to report on changes to the policing requirements of communities and the challenge of maintaining higher detection rates in the context of reported crime which is closer to pre-pandemic levels, as well as increasing demand in areas such as fraud and online child abuse.”

An NSPCC Scotland spokesperson said: “These latest figures are further evidence of the increasing risk to children posed by child sexual offenders online.

“It is right and crucial that Police Scotland is tackling these crimes as a priority, through arresting suspects and working with partners to raise awareness of the issue. But it is clear we cannot continue with the status quo, where it’s left to law enforcement to tackle child abuse but social networks fail to do enough to proactively prevent and disrupt it from happening in the first place.

“The UK Government needs to deliver on its promise to put the protection of children front and centre of the Online Safety Bill, with tech firms being held to account if they fail in their duty of care.”

The 2020-21 Q4 Performance Report will be presented to the Scottish Police Authority’s Policing Performance Committee on Tuesday, 8 June.

The Performance Report and Management Information can be found at https://www.scotland.police.uk/about-us/our-performance/

NSPCC aims to reset the debate on end-to-end encryption

  • Polling shows majority of adults in Scotland would back end-to-end encryption in private messaging if children’s safety is not compromised
  • NSPCC chief calls for a reset of the debate to protect the safety and privacy rights of children
  • Home Secretary to address NSPCC event on end-to-end encryption

The NSPCC is warning that private messaging is the frontline of child sexual abuse online and is calling for an urgent re-set of debates on end-to-end encryption.

The call comes as polling shows Scottish public support for end-to-end encryption of private messages would double if platforms could demonstrate children’s safety would not be compromised.

An NSPCC/YouGov survey found 29% of adults in Scotland support using end-to-end encryption on social media and messaging services, but this jumps to 59% if it was rolled out only if and when tech firms can ensure children’s safety is protected.

A total of 183 adults in Scotland were surveyed between 31st December 2020 and 4th January 2021.

Major tech firms currently use a range of technology to identify child abuse images and detect grooming and sexual abuse in private messages.

But there are fears that Facebook’s proposals to end-to-end encrypt Facebook Messenger and Instagram would render these tools useless, with estimates that 70% of global child abuse reports could be lost.

In 2018 these reports resulted in 2,500 arrests and 3,000 children being safeguarded in the UK.

A major NSPCC roundtable attended by the UK Government Home Secretary, Priti Patel, will today (Monday) bring together child protection, civil society and law enforcement experts from the UK, US, Canada, Ireland and Australia.

The charity will call for an urgent reset of the debate around end-to-end encryption which they say has increasingly become an ‘either or’ argument skewed in favour of adult privacy over the safety and privacy rights of children.

However, the latest polling suggests public support for a balanced settlement that protects the safety of children while maximising the privacy of all users – including children who have been sexually abused.

  • More than half (52%) of adults in Scotland believe the ability to detect child abuse images is more important than the right to privacy and more than a third (39%) think they are equally important. Only 3% say privacy should be prioritised over safety.
  • 94% support social networks and messaging services having the technical ability to detect child abuse images on their sites.
  • 95% support a technical ability to detect adults sending sexual images to children on their services.

Sir Peter Wanless, NSPCC Chief Executive, said: “Private messaging is the frontline of child sexual abuse but the current debate around end-to-end encryption risks leaving children unprotected where there is most harm.

“The public want an end to rhetoric that heats up the issue but shines little light on a solution, so it’s in firms’ interests to find a fix that allows them to continue to use tech to disrupt abuse in an end-to-end encrypted world.

“We need a coordinated response across society, but ultimately the UK Government must be the guardrail that protects child users if tech companies choose to put them at risk with dangerous design choices.”

A re-set debate should focus on demonstrating the impact that end-to-end encryption will have on engineering away platforms’ ability to find abuse in private messaging, and how this can be avoided.

The current debate predominantly focuses on the impact of end-to-end encryption for law enforcement, which emphasises the investigation of abuse after it has already taken place – rather than focussing on the loss of platforms’ ability to detect and disrupt abuse much earlier. 

At the roundtable, the NSPCC will share new research and analysis about the implications of end-to-end encryption for child protection and call for tech firms to refocus their approach through safer design features and investment in technology.

It says tech firms should strive to achieve a new settlement that balances properly the benefits and risks of end-to-end encryption, underpinned by legal safeguards through regulation.

The NSPCC is calling for a reset of the debate that allows parties to reach a balanced settlement on both safety and privacy by:

  • Considering the needs of all users, including children
  • Avoiding characterising children’s safety as a simplistic trade-off against adults’ privacy
  • Reflecting children’s digital rights under international law
  • Tech firms respecting the full range of fundamental rights at stake, rather than privileging some over others
  • Considering how particular design features can exacerbate the risk of end-to-end encryption to children – e.g. Facebook algorithms that suggest children as friends to adults and plans to auto delete messages on WhatsApp

The UK Government Home Secretary will address the meeting a year after the NSPCC brought together 130 children’s organisations to call on Facebook not to proceed with end-to-end encryption until they can guarantee children’s safety won’t be compromised.

The NSPCC’s report End-to-End Encryption: Understanding the Impacts for Child Safety Online compiled research and interviews with experts from 17 organisations in the UK, US and Australia, including industry, government, law enforcement, civil society and academics.

Its policy briefing Private messaging and the rollout of end-to-end encryption – the implications for child protection sets out the importance of a range of responses to ensure child protection can be maintained in end-to-end encrypted environments, through technological, civil society and legislative and regulatory action.

Scottish adults support tough new laws and sanctions on tech firms to combat child abuse

  • Poll shows widescale public support for stronger legislation to protect children from abuse online
  • Comes as NSPCC report says UK Government’s Online Safety Bill must be more ambitious to comprehensively tackle sexual abuse
  • Charity chief calls for no compromise on children’s safety being at the heart of new laws

The Scottish public overwhelmingly back robust new laws to protect children from abuse on social media and wants bosses to be held responsible for safety, new polling suggests.

An NSPCC/YouGov survey found that more than nine in ten respondents (95%) in Scotland want social networks and messaging services to be designed to be safe for children.

The poll of more than 2,000 adults across the UK*, of whom 179 were from Scotland, shows huge support for putting a legal requirement on tech firms to detect and prevent child abuse, while backing strong sanctions against directors whose companies fail.

91% of respondents in Scotland want firms to have a legal responsibility to detect child abuse, such as grooming, taking place on their sites.

And almost four in five Scottish adults (79%) support prosecuting senior managers of social media companies if their companies consistently fail to protect children from abuse online, while 83% of respondents want social media bosses fined for consistent failures.

NSPCC Chief Executive Sir Peter Wanless said it shows a huge public consensus for robust Duty of Care regulation of social media.

He is urging the UK Culture Secretary Oliver Dowden to listen by ensuring his landmark Online Safety Bill convincingly tackles online child abuse and puts the onus on firms to prevent harm. Mr Dowden set out the UK Government’s vision for legislation in December.

The survey found that just 10% of Scottish adults think sites are regularly designed safely for children, but 77% support a legal requirement for platforms to assess the risks of child abuse on their services, and take steps to address them.

It comes as the NSPCC’s ‘Delivering a Duty of Care’ report, released earlier this week, assessed plans for UK legislation against its six tests for the UK Government to achieve bold and lasting protections for children online.

It found that the UK Government is failing on a third of indicators (nine out of 27), with tougher measures needed to tackle sexual abuse and to give Ofcom the powers it needs to develop and enforce regulation fit for decades to come.

Sir Peter Wanless said: “Today’s polling shows the clear public consensus for stronger legislation that hardwires child protection into how tech firms design their platforms.

“Mr Dowden will be judged on whether he takes decisions in the public interest and acts firmly on the side of children with legislation ambitious enough to protect them from avoidable harm.

“For too long children have been an afterthought for Big Tech but the Online Safety Bill can deliver a culture change by resetting industry standards and giving Ofcom the power to hold firms accountable for abuse failings.”

The NSPCC is calling for legislation to be more robust so it can successfully combat online child abuse at an early stage and before it spreads across platforms.

They want a requirement for tech firms to treat content that facilitates sexual abuse with the same severity as material that meets the criminal threshold.

This means clamping down on the “digital breadcrumbs” dropped by abusers to guide others towards illegal material. These include videos of children just moments before or after they are sexually abused – so-called ‘abuse image series’ – that are widely available on social media.

The charity also wants Ofcom to be able to tackle cross-platform risks, where groomers target children across the different sites and games they use – something firms have strongly resisted.

In its report, the NSPCC called on the UK Government to commit to senior management liability to make tech directors personally responsible for decisions on product safety.

They say this is vital to drive cultural change and provide an appropriate deterrent against a lax adoption of the rules.

The charity wants to see senior management liability similar to the successful approach in financial services. Under the scheme, bosses taking decisions which could put children at risk could face censure, fines and in the case of the most egregious breaches of the Duty of Care, criminal sanctions.

They warn that the UK Government has softened its ambition and at present proposes liability only for narrow procedural failures, which would only be enacted later down the line.

The NSPCC has been the leading voice for social media regulation and the charity set out detailed proposals for a Bill in 2019.

The UK Government’s White Paper consultation response in December set out the framework for an Online Safety Bill that is expected in the Spring.

Online child grooming offences rise in Scotland

  • NSPCC asks Boris Johnson to publicly commit to having world-leading online harms legislation on statute book within 18 months
  • Social media sites are ‘enabling offenders’ as recorded crimes, which include online grooming offences, rise above 2,500 in five years in Scotland

Crimes of communicating a sexual message to a child have increased by more than 80 per cent in five years in Scotland, the NSPCC has revealed.

New figures obtained via a freedom of information request show that 651 offences of Communicating Indecently with a Child were recorded by Police Scotland in the last year, compared to 354 crimes in 2014/15 – an increase of 84 per cent.

In the year to April 2020, the rise was 12 per cent but the NSPCC is warning there could be a sharper increase this year due to the unique threats caused by coronavirus that are being exacerbated by years of industry failure to design basic child protection into platforms.

The charity is now calling on the UK Prime Minister to urgently press ahead with legislation that would help prevent offenders from using social media to target children for sexual abuse.

An analysis by the NSPCC of data on an equivalent crime from police forces in England and Wales has revealed that Facebook-owned apps were used in 55% of cases, from April 2017 to October 2019, where police recorded information about how a child was groomed. This data was not available from Police Scotland.

Emily* was 13 when she exchanged messages and photos with a man she believed to be 15 on Facebook and Snapchat. The man turned out to be 24 and sexually abused her.

Emily’s mum, Wendy*, said: “It’s important for social media to be regulated and for Facebook and Instagram to take more responsibility to keep the people who use their platform safe. All other businesses have a Duty of Care to keep children safe, so why not them?”

In February, then UK Government Digital Minister Matt Warman promised to publish an Online Harms Bill during the current UK parliamentary session following proposals set out in a White Paper.

These proposals set out independent regulation of social networks with potential criminal sanctions if tech directors fail to keep children living in the UK safe on their platforms.

However, frustration is growing at delays to the legislation with a full response to consultation on the White Paper not now expected until the end of the year and concerns we might not see a regulator until 2023.

This has been expressed by the chairs of both the UK Parliament Home Affairs and Digital, Culture, Media and Sport committees, which scrutinise the work of the UK Government departments responsible for online harms.

The NSPCC is calling on the UK Prime Minister to deliver an Online Harms Bill, that sets out a Duty of Care on tech firms to make their sites safer for children, within 18 months.

The charity wants his Government to publish a roadmap that sets out the timescales for a world-leading Bill to go through Westminster as a matter of urgency.

NSPCC Chief Executive Peter Wanless spoke to Boris Johnson at a hidden harms round table last week and highlighted how coronavirus had created a perfect storm for abusers because platforms hadn’t done enough to tackle safety risks going into the crisis. He urged the Prime Minister to ensure there is no unnecessary delay to legislation.

Mr Wanless said: “Child abuse is an inconvenient truth for tech bosses who have failed to make their sites safe and enabled offenders to use them as a playground in which to groom our kids.

“Last week the Prime Minister signalled to me his determination to stand up to Silicon Valley and make the UK the world leader in online safety. He can do this by committing to an Online Harms Bill that puts a legal Duty of Care on big tech to proactively identify and manage safety risks.

“Now is the time to get regulation done and create a watchdog with the teeth to hold tech directors criminally accountable if their platforms allow children to come to serious but avoidable harm.”

The NSPCC says the Online Harms Bill should:

  • Enforce a Duty of Care on tech companies to identify and mitigate reasonably foreseeable risks on their platforms, including at the design stage, to proactively protect users from harm
  • Create a regulator that can hand out GDPR equivalent fines – up to 4% of global turnover – and hold named directors criminally accountable for the most serious breaches of their Duty of Care
  • Give the regulator robust powers to investigate companies and request information
  • Create a culture of transparency by legally compelling tech firms to disclose any breaches of the Duty of Care and major design changes to their platforms.

Online Abuse: kick hate crime out of football

As the Chair of Edinburgh and Lothians Regional Equality Council (ELREC), the organisation and I take any form of racism seriously. It is deeply saddening to see such awful language used in regard to the football game. There is absolutely no excuse for this language on or off the pitch or anywhere.
