Child sexual abuse image crimes at record high in Scotland last year

  • Child sexual abuse image offences recorded by Police Scotland increased by 15 per cent between April 2022 and March 2023
  • NSPCC wants robust implementation of the Online Safety Act with Ofcom encouraged to strengthen its approach to tackling child sexual abuse
  • Meta urged to pause rollout of end-to-end encryption until plans for Facebook and Instagram can be risk assessed under new online safety regulations

The number of child sexual abuse image offences recorded by Police Scotland was at a record high last year – up 15 per cent on the previous year, data analysed by the NSPCC has revealed.

A total of 765 offences in which child abuse images were collected and distributed were logged in 2022/23, according to Police Scotland data [1].

Since 2017/18, when the NSPCC first called for social media regulation, a total of 3,877 crimes have been recorded while children and families have waited for online safety laws.

The charity said the figures show the need for swift and ambitious action by tech companies to address what is currently happening on their platforms and for Ofcom to significantly strengthen its approach to tackling child sexual abuse through effective enforcement of the Online Safety Act.

The figures come as insight from Childline shows young people being targeted by adults to share child sexual abuse images via social media and the calculated use of end-to-end encrypted private messaging apps by adults to find and share child abuse images.

A 14-year-old girl told the NSPCC-run service: “One night I got chatting with this guy online who I’d never met and he made me feel so good about myself. He told me he was 15, even though deep down I didn’t believe him.

“I sent him a couple of semi-nudes on Snap(chat), but then instantly regretted it. I asked him to delete the pics, but he just kept on making me do stuff for him not to post them – like getting me to strip live on camera. I just want to block him, but if I block him he will just post the pictures.”

A 15-year-old boy told Childline: “A while ago I saw a video on YouTube about how a guy was busting paedophiles and creeps on the internet by pretending to be a kid, and I kind of wanted to do a similar thing.

“I looked around Instagram for the creepiest accounts about kids my age and younger. In the end, I came across this link on one of their stories. It’s a link to a WhatsApp group chat in which [child sexual abuse material] is sent daily! There are literally hundreds of members in this group chat and they’re always calling the kids ‘hot’ and just being disgusting.”

  1. Police Scotland recorded crime data on the Scottish Government website.
Police Force   2017/18   2018/19   2019/20   2020/21   2021/22   2022/23   Total
Scotland       658       554       584       660       662       765       3,877

Online Safety Act implementation

The NSPCC said that disrupting online child sexual abuse, which is taking place at increasing levels, will require regulated tech platforms to introduce systemic changes to their products to stop them being used to organise, commit, and share child abuse.

A consultation into Ofcom’s first codes for companies to adopt to disrupt child sexual abuse on their platforms closed last week.

The NSPCC wants these measures introduced without delay but has urged Ofcom to begin work on a second version of the codes that will require companies to go much further.

The charity said companies should be required to use technology that can help identify and tackle grooming, sextortion and new child abuse images.

They also want tougher measures for private messaging services to make child protection a priority, including in end-to-end encrypted environments.

The NSPCC warned that Meta’s roll-out of end-to-end encryption on Facebook and Instagram will prevent authorities from identifying offenders and safeguarding victims.

The charity wants plans paused until Meta can prove child safety will not be compromised and has urged all parties to find a balance between the safety and privacy of all users, including children. The NSPCC said further rollout should be delayed until Ofcom can study Meta’s risk assessment as part of the new regulatory regime.

Sir Peter Wanless, NSPCC Chief Executive, said: “It’s alarming to see online child abuse continue to rise, especially when tech companies should be acting to make their sites safe by design ahead of incoming regulation.

“Behind these crimes are children who have been targeted by adults who are able to organise and share sexual abuse with other offenders seamlessly across social media and messaging apps.

“The Online Safety Act sets out robust measures to make children fundamentally safer on the sites and apps they use so they can enjoy the benefits of a healthy online experience.

“Ofcom has been quick off the blocks but must act with greater ambition to ensure companies prioritise child safety in the comprehensive way that is so desperately needed.”

Susie Hargreaves OBE, Chief Executive of the Internet Watch Foundation, the UK’s front line against child sexual abuse imagery online, said: “This is a truly disturbing picture, and a reflection of the growing scale of the availability, and demand, for images and videos of children suffering sexual abuse.

“The people viewing and sharing and distributing this material need to know it is not a victimless crime. They are real children, suffering real abuse and sexual torture, the effects of which can linger a lifetime.

“That more and more people are trying to share and spread this material shows we should all be doing everything we can to stop it, building more, and more innovative, solutions to keep children safe.

“The IWF is ready to support technology companies and Ofcom in implementing the Online Safety Act to help make the UK the safest place in the world to be online.”

Scots mum celebrates Online Safety Act becoming law

  • The Online Safety Act became law last week
  • Abuse survivors, young people and bereaved families – including Scottish mum Ruth Moss – join NSPCC Chief Executive and head of Ofcom to mark momentous achievement
  • NSPCC says children and young people must be central to implementation and calls for bold and ambitious regulation
  • Pictures available of installation outside Parliament thanking 147,000 campaigners who backed the legislation

Young people, abuse survivors and bereaved families celebrated the Online Safety Act becoming law at an NSPCC reception in Parliament yesterday.

After years of campaigning, legislation that will put a legal duty on tech companies to protect children from sexual abuse and harmful material on social media sites, gaming apps and messaging services was given Royal Assent on Thursday.

Ruth Moss, a member of the Bereaved Families for Online Safety – who were integral in achieving stronger protections for children in the legislation – joined fellow campaigners who have worked tirelessly for new laws to protect children online, alongside politicians, civil society and regulators, to welcome the legislation.

Ruth’s daughter Sophie Parkinson tragically took her own life in March 2014 when she was only 13 years old after she was exposed to harmful content online.

Sophie was self-harming and viewing self-harm and suicide content from the age of 12. She had also had online relationships with older men and looked at violent pornography.

Ruth, a nurse from Dalkeith near Edinburgh, said: “For at least two years, we struggled to keep Sophie safe online.

“In spite of removing devices, restricting internet use, implementing parental controls and having conversations about internet safety, these were not enough to prevent her from being exposed to content that promoted self-harm, suicide and dark, graphic, harmful material. She managed to view violent pornography and have online conversations with adult male strangers.

“Complaining to internet and social media companies was either impossible or futile. As parents, it felt like one step forward and two steps back, especially when balancing her need to use the internet for school and controlling her use. We quickly realised that parents alone can’t control what their child sees on the internet.

“The impact of Sophie viewing this harmful material was a deterioration in her existing mental health struggles, with devastating consequences. We will never truly recover from her death, and it is rightly every parent’s worst nightmare.

“This legislation is a good first step. It sends a message to tech companies that safety should not be compromised for the sake of profit and that tech companies can’t deny responsibility for keeping their service users safe on their websites.

“In my opinion, the enforcement of the bill is key. This will be challenging. It will require Ofcom going up against some of the most powerful and influential organisations in the world. Ofcom will have a difficult job.”

Young people who campaigned with the NSPCC and a mum whose daughter was sexually abused on social media spoke at the event. They joined the NSPCC Chief Executive Sir Peter Wanless and Ofcom Chief Executive Dame Melanie Dawes, who will be responsible for the Act’s implementation.

The event was chaired by Sajid Javid MP who was Home Secretary when regulation to help protect children online was first promised by the Government in 2018, following the launch of the NSPCC’s Wild West Web campaign.

At the reception there was a clear focus on ensuring young people’s voices and experiences are central to the implementation of the Online Safety Act, so it results in meaningful change for children as soon as possible.

The event came as Ofcom prepares to set out the rules which tech companies will have to follow to tackle child sexual abuse and protect children from harmful material.

The codes of practice will be consulted on before being implemented, but those at the event were united in their expectation that tech companies should not wait to begin putting concrete measures in place to make their sites safe by design for children.

Sir Peter Wanless, NSPCC Chief Executive, said: “It was an honour to bring together so many people from different walks of life whose campaigning and dedication has helped make the Online Safety Act possible.

“I want to particularly thank everyone who has been impacted by online abuse and unimaginable harm and who has campaigned selflessly to help protect others.

“While we rightly celebrated today, it was also a catalyst towards working together to ensure the legislation results in the protections online children desperately need.

“We look forward to seeing the landmark Act implemented with bold ambition to ensure there is a rigorous focus on children as regulation comes into force.”

Dame Melanie Dawes, Ofcom’s Chief Executive, said: “We’re grateful for all the hard work that went into getting these new laws onto the statute books. It means we now have regulation that will make a real difference in creating a safer life online for people – particularly children – in the UK. This is a big job, and we’re ready, but we won’t be doing it alone.

 “Young people’s voices have shaped the foundations, and now we want to hear from them again to make sure we get the technical detail right. Next week, we’ll be consulting on the specific measures that tech firms can take to protect their users from illegal harms online, including child sexual abuse and grooming, and pro-suicide content.”

Technology Secretary Michelle Donelan said: “I am immensely proud of the work that has gone into the Online Safety Act from its very inception to it becoming law.

“At the heart of this Act is the protection of children. I would like to thank the campaigners, parliamentarians, survivors of abuse and charities, including the NSPCC, that have worked tirelessly, not only to get this Act over the finishing line, but to ensure that it will make the UK the safest place to be online in the world.”

Many volunteers with lived experience of abuse who have campaigned with the NSPCC for robust legislation also joined the reception.

The NSPCC set up an installation outside the Houses of Parliament thanking the over 147,000 campaigners who backed the legislation. The charity has released a video with young people welcoming the Online Safety Act.

UK children and adults safer online as ‘world-leading’ bill becomes law

Online Safety Act receives Royal Assent putting rules to make the UK the safest place in the world to be online into law

  • The Act makes social media companies keep the internet safe for children and give adults more choice over what they see online
  • Ofcom will immediately begin work on tackling illegal content and protecting children’s safety

The Online Safety Act has today (Thursday 26 October) received Royal Assent, heralding a new era of internet safety and choice by placing world-first legal duties on social media platforms.

The new laws take a zero-tolerance approach to protecting children from online harm, while empowering adults with more choices over what they see online. This follows rigorous scrutiny and extensive debate within both the House of Commons and the House of Lords.

The Act places legal responsibility on tech companies to prevent and rapidly remove illegal content, like terrorism and revenge pornography. They will also have to stop children seeing material that is harmful to them such as bullying, content promoting self-harm and eating disorders, and pornography.

If they fail to comply with the rules, they will face significant fines that could reach billions of pounds, and if they don’t take steps required by Ofcom to protect children, their bosses could even face prison.

Technology Secretary Michelle Donelan said: “Today will go down as an historic moment that ensures the online safety of British society not only now, but for decades to come.

“I am immensely proud of the work that has gone into the Online Safety Act from its very inception to it becoming law today. The Bill protects free speech, empowers adults and will ensure that platforms remove illegal content.

“At the heart of this Bill, however, is the protection of children. I would like to thank the campaigners, parliamentarians, survivors of abuse and charities that have worked tirelessly, not only to get this Act over the finishing line, but to ensure that it will make the UK the safest place to be online in the world.”

The Act takes a zero-tolerance approach to protecting children by making sure the buck stops with social media platforms for content they host. It does this by making sure they:

  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content including pornographic content, content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, content depicting or encouraging serious violence or bullying content
  • enforce age limits and use age-checking measures on platforms where content harmful to children is published
  • ensure social media platforms are more transparent about the risks and dangers posed to children on their sites, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise

Home Secretary Suella Braverman said: “This landmark law sends a clear message to criminals – whether it’s on our streets, behind closed doors or in far flung corners of the internet, there will be no hiding place for their vile crimes.

“The Online Safety Act’s strongest protections are for children. Social media companies will be held to account for the appalling scale of child sexual abuse occurring on their platforms and our children will be safer.

“We are determined to combat the evil of child sexual exploitation wherever it is found, and this Act is a big step forward.”

Lord Chancellor and Secretary of State for Justice, Alex Chalk said: “No-one should be afraid of what they or their children might see online so our reforms will make the internet a safer place for everyone.

“Trolls who encourage serious self-harm, cyberflash or share intimate images without consent now face the very real prospect of time behind bars, helping protect women and girls who are disproportionately impacted by these cowardly crimes.”

In addition to protecting children, the Act also empowers adults to have better control of what they see online. It provides 3 layers of protection for internet users which will:

  1. make sure illegal content is removed
  2. enforce the promises social media platforms make to users when they sign up, through terms and conditions
  3. offer users the option to filter out content, such as online abuse, that they do not want to see

If social media platforms do not comply with these rules, Ofcom could fine them up to £18 million or 10% of their global annual revenue, whichever is greater – meaning fines handed down to the biggest platforms could reach billions of pounds.

The government also strengthened provisions to address violence against women and girls. Through the Act, it will be easier to convict someone who shares intimate images without consent and new laws will further criminalise the non-consensual sharing of intimate deepfakes.

These changes in the law also make it easier to charge abusers who share intimate images and will put more offenders behind bars. Criminals found guilty of the base offence will face up to 6 months in prison, but those who threaten to share such images, or share them with the intent to cause distress, alarm or humiliation, or to obtain sexual gratification, could face up to two years behind bars.

NSPCC Chief Executive, Sir Peter Wanless said: “Having an Online Safety Act on the statute book is a watershed moment and will mean that children up and down the UK are fundamentally safer in their everyday lives.

“Thanks to the incredible campaigning of abuse survivors and young people and the dedicated hard work of Parliamentarians and Ministers, tech companies will be legally compelled to protect children from sexual abuse and avoidable harm.

“The NSPCC will continue to ensure there is a rigorous focus on children by everyone involved in regulation. Companies should be acting now, because the ultimate penalties for failure will be eye-watering fines and, crucially, criminal sanctions.”

Dame Melanie Dawes, Ofcom Chief Executive, said: “These new laws give Ofcom the power to start making a difference in creating a safer life online for children and adults in the UK. We’ve already trained and hired expert teams with experience across the online sector, and today we’re setting out a clear timeline for holding tech firms to account.   

“Ofcom is not a censor, and our new powers are not about taking content down. Our job is to tackle the root causes of harm. We will set new standards online, making sure sites and apps are safer by design. Importantly, we’ll also take full account of people’s rights to privacy and freedom of expression.

“We know a safer life online cannot be achieved overnight; but Ofcom is ready to meet the scale and urgency of the challenge.”

In anticipation of the Bill coming into force, many social media companies have already started making changes. TikTok has implemented stronger age verification on their platforms, while Snapchat has started removing the accounts of underage users.

As the Bill travelled through Parliament, the government worked closely with Ofcom to ensure protections would be implemented as quickly as possible once the Act received Royal Assent.

From today, Ofcom will immediately begin work on tackling illegal content, with a consultation process launching on 9 November 2023. It will then take a phased approach to bringing the Online Safety Act into force, prioritising the enforcement of rules against the most harmful content as soon as possible.

The majority of the Act’s provisions will commence in two months’ time. However, the government has commenced key provisions early to establish Ofcom as the online safety regulator from today and allow it to begin key preparatory work, such as consultations, as quickly as possible to implement protections for the country.

Rocio Concha, Which? Director of Policy and Advocacy, said: “Which? led the campaign for consumers to have stronger protections against scam adverts on social media platforms and search engines that can have devastating financial and emotional consequences for victims.

“These new Online Safety laws are a major step forward in the fight back against fraud by forcing tech firms to step up and take more responsibility for stopping people being targeted by fraudulent online adverts.

“Ofcom must now develop codes of practice that will hold platforms to a high standard and be prepared to take strong enforcement action, including fines, against firms if they break the law.”