Social media platforms failing to protect girls from harm at every stage

  • New NSPCC research has found that even the most popular social media platforms are failing girls at every stage, making them vulnerable to grooming, abuse, and harassment.
  • This comes as polling by the children’s charity also shows that a strong majority of adults across GB and in Scotland (86%) believe tech companies are not doing enough to protect girls from harm on social media. 
  • Parents of girls aged 4-17 across GB highlighted contact from strangers (41%), online grooming (40%), bullying from other children (37%), and sexual abuse or harassment (36%) as their top four concerns when it came to their daughter’s experiences online. 

The NSPCC is calling on tech companies to rethink how social media platforms are designed and prioritise creating age-appropriate experiences for young girls online. 

Social media platforms, messaging apps and gaming platforms are failing to protect girls at every stage, according to new research from the NSPCC.  

The children’s charity commissioned PA Consulting to produce a new report, Targeting of Girls Online, which identified a wide range of risks that girls face across ten popular online platforms, including grooming, harassment and abuse.

As part of the research, fake profiles of a teenage girl were created on these sites. 

The report found that the detailed nature of the profiles made it too easy for adult strangers to pick out girls and send unsolicited messages to their accounts.  

Findings also highlighted how many of the features and functionalities employed by tech companies subliminally encourage young girls to increase their online networks, online consumption, and online activity – often at the expense of their own safety. 

In response, the NSPCC is urging Ofcom to address the significant gaps in its Illegal Harms Codes, which fail to take into account specific risks that would be mitigated by the solutions set out in the report.

This comes as new YouGov polling for the children’s charity of 3,593 adults from across Great Britain, including 326 adults from Scotland, found that most respondents in both GB (86%) and in Scotland (86%) believe tech companies are doing too little to protect girls under the age of 18 on their platforms.  

The survey also polled parents with daughters (431 from across GB), who listed contact from strangers (41%), online grooming (40%), bullying from other children (37%), and sexual abuse or harassment (36%) as their top four concerns related to their child’s experience online.  

Half of the parents surveyed (52%) expressed concern over their daughter’s online experiences. 

The Targeting of Girls Online report analysed features and design choices of these platforms which expose girls to harm online – including abuse, harassment and exploitation from strangers. 

Proposed solutions include:   

  • all services conducting their own ‘abusability studies’ to identify risky features and functionalities, and testing any new feature before rolling it out – these tests must include a gendered analysis of likely risk
  • social media apps integrating screenshot capabilities into their reporting functions, along with automatically detecting identifiable information in bios
  • social media apps implementing a “cooling off” period once a connection is made between users, resulting in increased restrictions on interactions
  • increased measures to prevent non-trusted adults from being able to video call young users

In particular, Ofcom should develop best practice guidance for regulated services, which outlines how safety settings and other protections can be adapted based on children’s age.  

The regulator should then work with service providers, especially those most popular with children, to implement this guidance. 

Without these necessary safeguards, young users – in particular girls – remain highly vulnerable to unsafe online interactions. 

The NSPCC has long heard from young girls about their negative experiences online through Childline, which encouraged the charity to undertake this research.

One 15-year-old who contacted Childline said: “I’ve been sent lots of inappropriate images online recently, like pictures of naked people that I don’t want to see.

“At first, I thought they were coming from just one person, so I blocked them. But then I realised the stuff was coming from loads of random people I don’t know. I’m going to try to disable ways people can add me, so hopefully I’ll stop getting this stuff.”

Rani Govender, Policy Manager for Child Safety Online, said: “Parents are absolutely right to be concerned about the risks their daughters are being exposed to online, with this research making it crystal clear that tech companies are not doing nearly enough to create age-appropriate experiences for girls.

“We know both on and offline girls face disproportionate risks of harassment, sexual abuse, and exploitation. That’s why it’s so worrying that these platforms are fundamentally unsafe by design – employing features and dark patterns that are putting girls in potentially dangerous situations.  

“There needs to be a complete overhaul of how these platforms are built. This requires tech companies and Ofcom to step up and address how poor design can lead to unsafe spaces for girls. 

“At the same time, Government must lay out in its upcoming Violence against Women and Girls (VAWG) Strategy steps to help prevent child sexual offences and tackle the design failures of social media companies that put girls in harm’s way.”

Young people looking for support on any of the issues mentioned can contact Childline on 0800 1111 or visit Childline.org.uk. Childline is available to all young people until their 19th birthday.

Adults who are concerned about a child can contact the NSPCC Helpline by calling 0808 800 5000, or email: help@NSPCC.org.uk 

Deep distrust of the police and lack of opportunity motivated children’s participation in last summer’s riots

  • Hundreds of children, some as young as 11, were caught up in riots last summer sparked by the tragic murders of three girls in Southport.
  • At least 147 children arrested, 84 charged, 73 with finalised outcomes by October 31st.
  • Children’s Commissioner uses unique statutory powers to speak to around 20% of the children charged in connection with last summer’s riots, including some in Young Offenders Institutions.
  • In interviews, many spoke strongly about their distrust of the police, describing previous bad experiences and community mistrust.
  • A postcode lottery in the youth justice system meant outcomes for children depended on where they lived.

Unique research by the Children’s Commissioner’s Office found that young people who took part in last summer’s riots were not primarily driven by social media misinformation or racism but by curiosity about the events, deep distrust of the police, or a lack of opportunities in their communities.

Dame Rachel de Souza used her statutory powers to speak to about 20 per cent of the children who were charged in the aftermath of the summer riots that broke out after the tragic murders of Bebe King, Elsie Dot Stancombe and Alice da Silva Aguiar in Southport on 29 July 2024.

Hundreds of children – some as young as 11 – were caught up in the unrest in 26 areas across England, following the lead of thousands of adults whose involvement was deemed to be racially motivated and who targeted locations known to house asylum seekers.

The findings of today’s report focus on children’s motivation for taking part, challenging the prevailing narrative that young people’s involvement was orchestrated by deliberate misinformation spread through social media linked to racist and right-wing influencers.

While these factors may have played a role, they did not appear to drive children’s actions.

Instead, many children’s involvement in the riots was spontaneous, opportunistic and not thought through. The report found that they were not primarily driven by far-right, anti-immigration or racist views. Children spoke about their curiosity about the events and their animosity towards the police.

Children’s Commissioner for England Dame Rachel de Souza said: “Like everyone I was truly horrified and heartbroken by the deaths of those three little girls in Southport last July.

“The initial response from the community to their deaths brought out some of the best of humanity, as people shared their collective grief and shock. But within a day, violent unrest started to unfold across the country in an apparent response to claims made about the girls’ attacker.

“The involvement of children in those riots and the reasons they told me they got involved raise some really serious questions about childhood in England and why our children feel so disaffected and disempowered.”

Based on interviews by the Children’s Commissioner’s office between November and December 2024 with children charged in connection to last summer’s riots, the report presents their views, as told to the Children’s Commissioner and her team directly – with key findings including:

  • Scale of youth involvement:  At least 147 children arrested, 84 charged, 73 with finalised outcomes by October 31st. Further arrests are anticipated as police continue to review evidence.
  • Spontaneous participation: Children’s actions were often impulsive and unconsidered, driven by curiosity, a sense of animosity towards the police, or the thrill of the moment – not primarily driven by far-right ideologies as widely speculated.
  • Distrust of the police: Many children cited previous negative interactions and deep-seated mistrust of the police within their community, which fuelled their actions during the riots; many viewed the unrest as an opportunity to retaliate against the police.
  • Calls for change: Children identified poverty, a lack of youth activities and limited employment opportunities as underlying vulnerabilities that must be addressed to protect young people from crime and exploitation.

Today’s report by the Children’s Commissioner found the government’s response to the riots resulted in unusually severe charges and sentences, often overlooking children’s potential for rehabilitation.

Outcomes for children appeared to vary based on location, with inconsistent application of child-first principles and underutilised expertise of the Youth Justice Service (YJS).

Dame Rachel de Souza said: “As Children’s Commissioner, it’s my duty to listen to children, regardless of their circumstances. This includes hearing the voices of young victims, and in exceptional circumstances like this, hearing directly from children accused of perpetrating violence against others.

“These conversations were striking, and often unsettling. Many described impulsive decisions, driven by disaffection or distrust of the police as factors for their involvement.

“This report does not excuse criminality. The harm caused by these children’s actions is undeniable. Many – but not all – of the children acknowledged the need for accountability and consequences for their actions.

“Today’s findings offer no simple solutions but paint a more complex picture than the one that has been debated since the riots. However, it is one that we must grapple with in order to create a more positive experience of childhood than the one this report sets out.”

In her report, the Children’s Commissioner highlights the importance of upholding the child-first principles of the youth justice system, particularly in times of national crisis. Children are different to adults and a child must be seen as such first and foremost, rather than as an offender, to keep communities safe by preventing and reducing offending behaviour.

Rehabilitation and addressing the underlying causes of children’s involvement must be the primary objective of youth justice with custodial sentences always the last resort. The widespread expression of hostility toward the police among these children also highlights an urgent need for child-centred policing that builds trust and fosters positive relationships.

Today’s report, ‘Children’s involvement in the 2024 riots’ is available online.

Nearly half of adults in Scotland don’t consider planning for their digital legacy in their Wills, new poll reveals

A concerning number of adults in Scotland risk leaving grieving loved ones without access to cherished memories and vital information by neglecting to plan for their digital legacy, a new survey by Will Aid shows. 

The national Will-writing campaign has revealed 44% of respondents in Scotland overlooked the critical need to include digital assets in estate planning – meaning friends and family may face significant challenges in the event of their death, including the loss of treasured photographs, and difficulties in managing financial affairs. 

As the world becomes increasingly digital, our online lives leave behind an important, but often overlooked, legacy. 

The rise of digital banking, cloud storage, and the prevalence of social media means that a person’s online presence and assets can be just as valuable as – if not more valuable than – their physical belongings. Yet many individuals fail to consider this when preparing their Will, so sorting out the deceased’s estate becomes a more complicated task than it needs to be, adding stress to an already difficult time.

Michael Cressey, from Hadfield Bull and Bull Solicitors, said: “In an age where so much of our lives are online, ensuring loved ones have access to your digital accounts after you die is crucial.  

“Many people do not realise how much valuable information is stored in their email and online profiles – from financial records to cherished photographs. Failing to leave clear instructions and passwords can cause significant emotional and logistical hardship for those left behind.  

“Leaving instructions for digital assets in a safe way not only ensures access to important assets but can also help loved ones manage practical matters such as closing accounts, settling bills, and even notifying institutions of the death. There are ways that you can update your online accounts with Apple iPhone by using the ‘legacy’ function in your phone settings, which will help you plan for the future.”  

The annual Will Aid campaign sees solicitors across the UK volunteering their time to write Wills throughout November, making it an ideal opportunity for people to get their wishes professionally drafted in a legal document, which will help to protect their loved ones in the future. 

Peter de Vena Franks, Will Aid Campaign Director, said: “By planning ahead, individuals can help ensure their online legacy is managed according to their wishes, and spare their loved ones from additional stress. 

“This year’s Will Aid campaign is the ideal time for people to talk to a solicitor and ensure their wishes are clearly documented, giving them peace of mind that their loved ones will be spared additional upset and stress in the event of their death.”

Will Aid is a partnership between the legal profession and seven of the UK’s best-loved charities.  

The initiative, which has been running for more than 30 years, sees participating solicitors waive their fee for writing basic Wills every November. 

Instead, they invite clients to make an upfront donation to Will Aid – a suggested £100 for a single basic Will and £180 for a pair of basic ‘mirror’ Wills. 

Appointments are available now, and you can sign up by visiting www.willaid.org.uk  

Donations to the campaign are shared by Will Aid’s partner charities, which operate both here in the UK and around the world. 

For more information on Will Aid and how to get involved visit www.willaid.org.uk  

Social media safety for young people

Campaign to stop the sharing of violent incidents

A national campaign to support young people to safely navigate social media and prevent violence has been launched.

‘Quit Fighting For Likes’ aims to get young people to think about and discuss attitudes and behaviours around the filming and sharing of violent incidents.

Developed by the Scottish Violence Reduction Unit (SVRU), YouthLink Scotland and Medics Against Violence (MAV), the new campaign is part of an action plan agreed in the Scottish Government’s Violence Prevention Framework, published in May last year.

It features a short awareness-raising animation, illustrating the digital world where this content can spread and showing an alternative route to switch off from it. A set of memes has also been produced covering a range of messages about why filming and sharing fights is damaging.

Young people helped develop the campaign through focus groups and feedback sessions involving various schools and youth groups, including pupils from Craigmount High School in Edinburgh.

The campaign has been launched as the first annual progress report for Scotland’s Violence Prevention Framework was published – highlighting progress made to help cut violent crime and reduce the harm it causes.

Key developments in 2023-24 include:

  • the creation, by the SVRU, of a Violence Anonymous group, the first of its kind in Scotland, to help individuals with significant problems turn their lives around
  • the extension of MAV’s hospital-based Navigator programme to reach young people in times of crisis, so they can receive support to steer them away from violence and harm
  • YouthLink Scotland’s training and resources provided to more than 700 practitioners across the country to provide young people with key messages on violence and knife crime prevention

Speaking at the launch of the new campaign in Edinburgh, Minister for Victims and Community Safety Siobhian Brown said: “While social media can play a positive role in young people’s lives, helping them engage with their friends and family, it can also be a platform where violent imagery is spread. This campaign will encourage young people to switch off and not share harmful content.

“Scotland’s Violence Prevention Framework is making encouraging progress with a number of partner initiatives focused on prevention and early intervention so that communities across Scotland remain safe and more people live free from the threat of violence.”

Tim Frew, CEO of YouthLink Scotland, the national agency for youth work, said: “Young people have told us time and time again that they need help to navigate social media. It is crucial that adults who live and work with young people are confident in providing trusting and non-judgemental support.

“As the national agency for youth work, we are proud to have collaborated on this important campaign, embedding a youth work approach to the resources to start the conversation and upskill practitioners working with young people. By working and learning alongside young people, the toolkit supports young people to make informed, positive, and importantly safe, choices online.”

Prof Christine Goodall, Director and Founder of Medics Against Violence, said: “The use of social media to incite violence is something we couldn’t have anticipated 15 years ago but now we see that regularly along with the sharing of distressing images and videos of violence filmed in places that should be safe, such as school playgrounds and community public spaces.

“As health professionals we recognise the impact that may have on encouraging young people to get involved in violence, risking injury, and the long-term psychological impact on those filmed when their images are shared in the online space, without their consent or knowledge.

“This campaign is important to us because we understand from speaking to young people how conflicted they are about social media and the peer pressure they face to join in with image sharing activities. We wanted to produce something that would reflect their views and would support them to take a stand against activity which is both damaging and pervasive.”

Jimmy Paul, Head of the Scottish Violence Reduction Unit, said: “While the majority of young people in Scotland do not engage in the filming and sharing of violent incidents on social media, as part of our research for this campaign we listened to groups of young people about their experience.

“The Quit Fighting For Likes campaign aims to enable young people to look at attitudes and behaviours regarding social media while pointing towards the toolkit to equip those working with young people to help build positive social norms.”

Find out more about the Quit Fighting For Likes campaign.

First Minister: Social media companies need to address online hate

John Swinney says action is needed to address misinformation, racism and hateful online material

Following disorder in parts of the UK, First Minister John Swinney has written to X, Meta and TikTok to ask what action they are taking to combat the spread of misinformation, and to address racist and hateful material on their platforms:

From: First Minister John Swinney

To:  X, Meta and TikTok

This week I met with representatives of faith and refugee groups to show solidarity with communities around the country. They were clear to me about the impact of social media in spreading misinformation, raising alarm and creating a sense of threat in their communities.

I also met with Scottish political party leaders and Police Scotland representatives to discuss the situation in Scotland and the rest of the UK.

Police Scotland described social media posts containing deliberate misinformation and provocative, incendiary language, some of which potentially meets the threshold for charge under Scotland’s hate crime legislation, which came into effect in April this year.

It is clear to me that social media platforms have a duty to take action to ensure that individuals in our society are not subjected to hate and threatening behaviour, and that communities are protected from violent disorder.

I was struck by the communication from Ofcom this week reminding social media companies of their obligation to remove material that incites hatred or violence.

All political parties in Scotland stand together in resisting the prejudice and Islamophobia that we have seen on the streets in parts of the UK and online.

Everyone has a role in stopping the spread of misinformation.  You and your platform have a specific responsibility to do so.

I would therefore be grateful if you could outline the action you are taking to combat the spread of misinformation on your platform and what steps are being taken to address racist and hateful speech across your platform. Given the seriousness of the situation, action needs to be immediate and decisive.

Police Scotland has specifically raised with me concerns about the time it takes for problematic posts to be removed when these are identified by law enforcement agencies. This increases the risk of spread of malicious content. I would wish to understand the steps you are taking to address this, particularly for content that police identify as illegal or harmful.

I am copying this letter to Peter Kyle MP, the Secretary of State for Culture, Media and Sport.

Ofcom: Proposed measures to improve children’s online safety

As the UK’s online safety regulator, we have published a package of proposed measures that social media and other online services must take to improve children’s safety when they’re online:

In this article, we explain some of the main measures and the difference we expect them to make. Whether you are a parent, carer or someone working with children, this can help you understand what is happening to help children in the UK live safer lives online.

Protecting children is a priority

Protecting children so they can enjoy the benefits of being online, without experiencing the potentially serious harms that exist in the online world, is a priority for Ofcom.

We’re taking action – setting out proposed steps online services would need to take to keep kids safer online, as part of their duties under the Online Safety Act.

Under the Act social media apps, search and other online services must prevent children from encountering the most harmful content relating to suicide, self-harm, eating disorders, and pornography. They must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material, bullying content, and content promoting dangerous challenges.

What will companies have to do to protect children online?

Firstly, online services must establish whether children are likely to access their site – or part of it. Secondly, if children are likely to access it, the company must carry out a further assessment to identify the risks their service poses to children, including the risks that come from the design of their services, their functionalities and their algorithms. They then need to introduce various safety measures to mitigate these risks.



Our consultation proposes more than 40 safety measures that services would need to take – all aimed at making sure children enjoy safer screen time when they are online. These include:

  • Robust age checks – our draft Codes expect services to know which of their users are children in order to protect them from harmful content. In practice, this means that all services which don’t ban harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or should age-restrict parts of it for adults-only access.
  • Safer algorithms – under our proposals, any service that has systems that recommend personalised content to users and is at a high risk of harmful content must design their algorithms to filter out the most harmful content from children’s feeds, and downrank other harmful content. Children must also be able to provide negative feedback so the algorithm can learn what content they don’t want to see.
  • Effective moderation – all services, such as social media apps and search services, must have content moderation systems and processes in place to take quick action on harmful content, and large search services should use a ‘safe search’ setting for children which cannot be turned off and must filter out the most harmful content. Other broader measures require services to have clear policies on what kind of content is allowed and how content is prioritised for review, and for content moderation teams to be well-resourced and trained.

What difference will these measures make?

We believe these measures will improve children’s online experiences in a number of ways. For example:

  • Children will not normally be able to access pornography.
  • Children will be protected from seeing, and being recommended, potentially harmful content.
  • Children will not be added to group chats without their consent.
  • It will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on.

Our consultation follows proposals we’ve already published for how children should be protected from illegal content and activity such as grooming, child sexual exploitation and abuse, as well as how children should be prevented from accessing pornographic content.

Next steps

Our consultation is open until 17 July and we welcome any feedback on the proposals. We expect to finalise our proposals and publish our final statement and documents in spring next year.

Please submit responses using the consultation response form (ODT, 108.1 KB).

Have you booked a Brazilian Butt Lift?

Edinburgh residents are being urged to contact the City of Edinburgh Council if they have booked a procedure known as a Brazilian Butt Lift (BBL) in the Capital this weekend.

The Council’s Environmental Health team has been made aware that BBL procedures may be taking place from Friday 26 April to Sunday 28 April, booked by people responding to social media posts.

Other local authorities in the United Kingdom have received complaints after similar procedures were carried out, resulting in people suffering serious health complications such as sepsis.

Cllr Neil Ross, Convener of the Regulatory Committee at the City of Edinburgh Council, said: “We have been made aware that there may be procedures known as Brazilian Butt Lifts being performed in Edinburgh this weekend and we have concerns about the safety of such procedures.

“We are concerned about the potential risk to public health and would urge anyone who may have booked such a procedure this weekend to contact us as a matter of urgency.”

Anyone who may have a BBL procedure booked from Friday 26 April to Sunday 28 April in Edinburgh should e-mail environmentalhealth@edinburgh.gov.uk or phone 0131 200 2000.

Smartphone Free Childhood call for WhatsApp to reverse age reduction policy

Smartphone Free Childhood, the grassroots parents’ movement, has called on WhatsApp to reverse today’s change in age policy, which lowers the minimum age of use from 16 to 13 years old. 

As of April 11th, anyone in Europe over the age of 12 can now legally access the messaging service, after WhatsApp made a planned change to its age restriction policy. 

This comes despite growing national calls for greater protections for children around smartphone and social media use, including from the 60,000 parents who have joined Smartphone Free Childhood since it launched spontaneously eight weeks ago.

A recent nationwide poll found that 95% of parents said they wanted big tech companies to do more to protect their children, with 80% believing that age limits on social media were too low. 

Daisy Greenwell, co-founder of Smartphone Free Childhood, said: “WhatsApp are putting shareholder profits first and children’s safety second.

“Reducing their age of use from 16 to 13 years old is completely tone deaf and ignores the increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike. This policy boosts their user figures and maximises shareholder profits at the expense of children’s safety.

“Lowering their age restrictions from 16 to 13 sends a message to parents that WhatsApp is safe for those over the age of 12, and yet a growing body of research suggests otherwise.

“Meanwhile parents and teachers in the Smartphone Free Childhood community regularly reach out to us to share that it is a contributor to bullying, sleep disruption and access to harmful content.”

Meanwhile, a growing body of research continues to raise serious questions about how suitable closed group messaging apps are for children and young teens. One recent study found that 56% of students aged 8-18 reported that they had experienced cyberbullying in their class WhatsApp groups.

Elsewhere, heavy use of screen media has been associated with shorter sleep duration and more mid-sleep awakening (in a study of more than 11,000 British children), and many teachers have anecdotally reported to Smartphone Free Childhood that late-night activity on WhatsApp is an increasing problem affecting children’s mood and ability to concentrate in class.

Speaking about her recent report in partnership with the Association of School and College Leaders, Dr Kaitlyn Regehr, Associate Professor of Digital Humanities at UCL, said: “Our report shows that private, or closed, groups can enable more extreme material being shared, which in turn can have implications for young people’s offline behaviours.

“Young people increasingly exist within digital echo-chambers, which can normalise harmful rhetoric.”

Furthermore, numerous reports link WhatsApp to children accessing extreme content – including sexual imagery, self-harm material and videos of extreme violence such as beheadings and terrorist attacks. Studies suggest that nearly a quarter of people viewing such content on social media will experience symptoms of PTSD.

Meanwhile, the end-to-end encryption on WhatsApp threatens children’s safety on the app, making it hard for parents to understand who their children are talking to and leaving them at risk of grooming by sexual predators.

One in ten children report using the messaging site to talk to people they don’t already know, and one in six 14-16 year-olds have received something distressing from a stranger on the site.

Despite these significant concerns, WhatsApp have as yet given no indication of how they plan to protect all the new under-16 users on their site, or how they will comply with UK law to remove the millions of under-13s already on the platform. 

Child sexual abuse image crimes at record high in Scotland last year

  • Child sexual abuse image offences recorded by Police Scotland increased by 15 per cent between April 2022 and March 2023
  • NSPCC wants robust implementation of the Online Safety Act with Ofcom encouraged to strengthen its approach to tackling child sexual abuse
  • Meta urged to pause rollout of end-to-end encryption until plans for Facebook and Instagram can be risk assessed under new online safety regulations

The number of child sexual abuse image offences recorded by Police Scotland was at a record high last year – up by 15 per cent from the previous year, data analysed by the NSPCC has revealed.

A total of 765 offences in which child abuse images were collected and distributed were logged in 2022/23, according to Police Scotland data [1].

Since 2017/18, when the NSPCC first called for social media regulation, a total of 3,877 crimes have been recorded while children and families have waited for online safety laws.

The charity said the figures show the need for swift and ambitious action by tech companies to address what is currently happening on their platforms and for Ofcom to significantly strengthen its approach to tackling child sexual abuse through effective enforcement of the Online Safety Act.

The figures come as insight from Childline shows young people being targeted by adults to share child sexual abuse images via social media and the calculated use of end-to-end encrypted private messaging apps by adults to find and share child abuse images.

A 14-year-old girl told the NSPCC-run service: “One night I got chatting with this guy online who I’d never met and he made me feel so good about myself. He told me he was 15, even though deep down I didn’t believe him.

“I sent him a couple of semi-nudes on Snap(chat), but then instantly regretted it. I asked him to delete the pics, but he just kept on making me do stuff for him not to post them – like getting me to strip live on camera. I just want to block him, but if I block him he will just post the pictures.”

A 15-year-old boy told Childline: “A while ago I saw a video on YouTube about how a guy was busting paedophiles and creeps on the internet by pretending to be a kid, and I kind of wanted to do a similar thing.

“I looked around Instagram for the creepiest accounts about kids my age and younger. In the end, I came across this link on one of their stories. It’s a link to a WhatsApp group chat in which [child sexual abuse material] is sent daily! There are literally hundreds of members in this group chat and they’re always calling the kids ‘hot’ and just being disgusting.”

  [1] Police Scotland recorded crime data on the Scottish Government website.

Police Force | 2017/18 | 2018/19 | 2019/20 | 2020/21 | 2021/22 | 2022/23 | Total
Scotland | 658 | 554 | 584 | 660 | 662 | 765 | 3,877

Online Safety Act implementation

The NSPCC said that disrupting online child sexual abuse taking place at increasing levels will require regulated tech platforms to introduce systemic changes to their products to stop them being used to organise, commit, and share child abuse.

A consultation into Ofcom’s first codes for companies to adopt to disrupt child sexual abuse on their platforms closed last week.

The NSPCC want these measures introduced without delay but urged Ofcom to begin work on a second version of the codes that will require companies to go much further.

The charity said companies should be required to use technology that can help identify and tackle grooming, sextortion and new child abuse images.

They also want tougher measures for private messaging services to make child protection a priority, including in end-to-end encrypted environments.

The NSPCC warned that Meta’s roll-out of end-to-end encryption on Facebook and Instagram will prevent authorities from identifying offenders and safeguarding victims.

The charity wants plans paused until Meta can prove child safety will not be compromised and has urged parties to find a balance between the safety and privacy of all users, including children. The NSPCC said further rollout should be delayed until Ofcom can study Meta’s risk assessment as part of the new regulatory regime.

Sir Peter Wanless, NSPCC Chief Executive, said: “It’s alarming to see online child abuse continue to rise, especially when tech companies should be acting to make their sites safe by design ahead of incoming regulation.

“Behind these crimes are children who have been targeted by adults who are able to organise and share sexual abuse with other offenders seamlessly across social media and messaging apps.

“The Online Safety Act sets out robust measures to make children fundamentally safer on the sites and apps they use so they can enjoy the benefits of a healthy online experience.

“Ofcom has been quick off the blocks but must act with greater ambition to ensure companies prioritise child safety in the comprehensive way that is so desperately needed.”

Susie Hargreaves OBE, Chief Executive of the Internet Watch Foundation, the UK’s front line against child sexual abuse imagery online, said: “This is a truly disturbing picture, and a reflection of the growing scale of the availability, and demand, for images and videos of children suffering sexual abuse.

“The people viewing and sharing and distributing this material need to know it is not a victimless crime. They are real children, suffering real abuse and sexual torture, the effects of which can linger a lifetime.

“That more and more people are trying to share and spread this material shows we should all be doing everything we can to stop this, building more, and more innovative, solutions to keep children safe.

“The IWF is ready to support technology companies and Ofcom in implementing the Online Safety Act to help make the UK the safest place in the world to be online.”

Hold the front page: Rebrand for Scottish Newspaper Society

The trade association for Scotland’s news publishers, the Scottish Newspaper Society, has been renamed Newsbrands Scotland, dropping “newspaper” from its title for the first time in a history stretching back 108 years.

While printed newspapers remain an important part of news publishers’ operations, the name change reflects modern newsrooms which reach far bigger audiences through digital platforms, with news operations working round the clock to deliver news to readers in the format they want, when they want it.

Newsbrands Scotland’s inaugural president, National World plc’s chief commercial officer Mark Hollinshead, said: “Our newsbrands reach more people than they ever did in the print-only days and the new name of our trade association reflects the multi-platform reality of the modern newsroom.”

Industry research [TGI, December 2022] shows that nine out of ten adults in Scotland engage with print or digital newsbrands at least once a week and are seven per cent more likely to rely on newspapers to stay informed than UK adults. And according to the latest JICREG analysis, 96 per cent of Scots read a local brand once a month.

Mark added: “Ever since the advent of the internet, Scottish news publishers have been evolving and innovating to keep their audiences well-served with up-to-the-minute, trusted information and analysis, and the audience figures speak for themselves.

“Scottish newsbrands keep communities across the country well-informed and connected, have a vital role to play in holding national and local politicians to account for the decisions they make, and are an essential means for services, businesses and charities to communicate with their users and customers.”

Further research from the news industry marketing body Newsworks reveals people are 2.4 times more likely to find news brands reliable than social media, and three-quarters believe it is important their news comes from a respected and recognised news provider.

Newsbrands Scotland director John McLellan said: “Our titles continue to provide a depth and breadth of coverage that few, if any, networks can match, and the fact that all our members are independently regulated is also vital for maintaining public trust.

“Readers want to know they are being provided with professionally produced news, and our commercial partners benefit because readers recognise they are in a trusted environment.

“News publishers also continue to support and train the journalists of the future, and it’s important for our name to reflect an industry that is always looking forward.”

The rebranding project was a collaborative effort across member companies, with the branding design produced by DC Thomson and the marketing campaign devised by National World, with input from News UK and Newsquest Scotland.

“This was a very good example of publishers working together for the benefit of the whole sector in Scotland, whether society members or not,” added John McLellan.