Social media platforms failing to protect girls from harm at every stage

  • New NSPCC research has found that even the most popular social media platforms are failing girls at every stage, making them vulnerable to grooming, abuse, and harassment.
  • This comes as polling by the children’s charity, which also shows that a strong majority of adults (86%), both across GB and in Scotland specifically, believe tech companies are not doing enough to protect girls from harm on social media. 
  • Parents of girls aged 4-17 across GB highlighted contact from strangers (41%), online grooming (40%), bullying from other children (37%), and sexual abuse or harassment (36%) as their top four concerns when it came to their daughter’s experiences online. 

The NSPCC is calling on tech companies to rethink how social media platforms are designed and prioritise creating age-appropriate experiences for young girls online. 

Social media platforms, messaging apps and gaming platforms are failing to protect girls at every stage, according to new research from the NSPCC.  

The children’s charity commissioned PA Consulting to produce a new report, Targeting of Girls Online, which identified a wide range of risks girls face across ten popular online platforms, including grooming, harassment and abuse.  

As part of the research, fake profiles of a teenage girl were created on these sites. 

The report found that the detailed nature of the profiles made it too easy for adult strangers to pick out girls and send unsolicited messages to their accounts.  

Findings also highlighted how many of the features and functionalities employed by tech companies subliminally encourage young girls to increase their online networks, online consumption, and online activity – often at the expense of their own safety. 

In response, the NSPCC is urging Ofcom to address significant gaps in its Illegal Harms Codes, which fail to take into account specific risks that the solutions set out in the report would mitigate. 

This comes as new YouGov polling for the children’s charity of 3,593 adults from across Great Britain, including 326 adults from Scotland, found that most respondents in both GB (86%) and in Scotland (86%) believe tech companies are doing too little to protect girls under the age of 18 on their platforms.  

The survey also polled parents with daughters (431 from across GB), who listed contact from strangers (41%), online grooming (40%), bullying from other children (37%), and sexual abuse or harassment (36%) as their top four concerns related to their child’s experience online.  

Half of the parents surveyed (52%) expressed concern over their daughter’s online experiences. 

The Targeting of Girls Online report analysed features and design choices of these platforms which expose girls to harm online – including abuse, harassment and exploitation from strangers. 

Proposed solutions include:   

  • all services conducting their own ‘abusability studies’ to identify risky features and functionalities, and testing any new feature before rolling it out, with a gendered analysis of likely risk included in these tests 
  • social media apps integrating screenshot capabilities into a reporting function, along with automatically detecting identifiable information in bios 
  • social media apps implementing a “cooling off” period once a connection is made between users, resulting in increased restrictions on interactions 
  • increased measures to prevent non-trusted adults from being able to video call young users. 

In particular, Ofcom should develop best practice guidance for regulated services, which outlines how safety settings and other protections can be adapted based on children’s age.  

The regulator should then work with service providers, especially those most popular with children, to implement this guidance. 

Without these necessary safeguards, young users – in particular girls – remain highly vulnerable to unsafe online interactions. 

Through Childline, the NSPCC has long heard from young girls about their negative experiences online, which encouraged the charity to undertake this research.  

One 15-year-old who contacted Childline said: “I’ve been sent lots of inappropriate images online recently, like pictures of naked people that I don’t want to see.

“At first, I thought they were coming from just one person, so I blocked them. But then I realised the stuff was coming from loads of random people I don’t know. I’m going to try disable ways people can add me, so hopefully I’ll stop getting this stuff.” 

Rani Govender, Policy Manager for Child Safety Online, said: “Parents are absolutely right to be concerned about the risks their daughters are being exposed to online, with this research making it crystal clear that tech companies are not doing nearly enough to create age-appropriate experiences for girls. 

“We know that both online and offline girls face disproportionate risks of harassment, sexual abuse, and exploitation. That’s why it’s so worrying that these platforms are fundamentally unsafe by design – employing features and dark patterns that are putting girls in potentially dangerous situations.  

“There needs to be a complete overhaul of how these platforms are built. This requires tech companies and Ofcom to step up and address how poor design can lead to unsafe spaces for girls. 

“At the same time, Government must lay out in its upcoming Violence against Women and Girls (VAWG) Strategy steps to help prevent child sexual offences and tackle the design failures of social media companies that put girls in harm’s way.” 

Young people looking for support on any of the issues mentioned, can contact Childline on 0800 1111 or visit Childline.org.uk. Childline is available to all young people until their 19th birthday.

Adults who are concerned about a child can contact the NSPCC Helpline by calling 0808 800 5000, or email: help@NSPCC.org.uk 

Published by

davepickering

Edinburgh reporter and photographer
