Nearly half of adults in Scotland don’t consider planning for their digital legacy in their Wills, new poll reveals

A concerning number of adults in Scotland risk leaving grieving loved ones without access to cherished memories and vital information by neglecting to plan for their digital legacy, a new survey by Will Aid shows. 

The national Will-writing campaign has revealed that 44% of respondents in Scotland overlooked the critical need to include digital assets in estate planning – meaning friends and family may face significant challenges in the event of their death, including the loss of treasured photographs and difficulties in managing financial affairs. 

As the world becomes increasingly digital, our online lives leave behind an important, but often overlooked, legacy. 

The rise of digital banking, cloud storage, and the prevalence of social media means that a person’s online presence and assets can be just as valuable as – if not more valuable than – their physical belongings. Yet many individuals fail to consider this when preparing their Will, so sorting out the deceased’s estate becomes a more complicated task than it needs to be, adding stress to an already difficult time. 

Michael Cressey, from Hadfield Bull and Bull Solicitors, said: “In an age where so much of our lives are online, ensuring loved ones have access to your digital accounts after you die is crucial.  

“Many people do not realise how much valuable information is stored in their email and online profiles – from financial records to cherished photographs. Failing to leave clear instructions and passwords can cause significant emotional and logistical hardship for those left behind.  

“Leaving instructions for digital assets in a safe way not only ensures access to important assets but can also help loved ones manage practical matters such as closing accounts, settling bills, and even notifying institutions of the death. Apple iPhone users, for example, can plan for the future using the ‘Legacy Contact’ function in their phone settings.”  

The annual Will Aid campaign sees solicitors across the UK volunteering their time to write Wills throughout November, making it an ideal opportunity for people to get their wishes professionally drafted in a legal document, which will help to protect their loved ones in the future. 

Peter de Vena Franks, Will Aid Campaign Director, said: “By planning ahead, individuals can help ensure their online legacy is managed according to their wishes, and spare their loved ones from additional stress. 

“This year’s Will Aid campaign is the ideal time for people to talk to a solicitor and ensure their wishes are clearly documented, giving them peace of mind that their loved ones will be spared additional upset and stress in the event of their death.” 

Will Aid is a partnership between the legal profession and seven of the UK’s best-loved charities.  

The initiative, which has been running for more than 30 years, sees participating solicitors waive their fee for writing basic Wills every November. 

Instead, they invite clients to make an upfront donation to Will Aid – a suggested £100 for a single basic Will and £180 for a pair of basic ‘mirror’ Wills. 

Appointments are available now, and you can sign up by visiting www.willaid.org.uk  

Donations to the campaign are shared by Will Aid’s partner charities, which operate both here in the UK and around the world. 

For more information on Will Aid and how to get involved visit www.willaid.org.uk  

Social media safety for young people

Campaign to stop the sharing of violent incidents

A national campaign to support young people to safely navigate social media and prevent violence has been launched.

‘Quit Fighting For Likes’ aims to get young people to think about and discuss attitudes and behaviours around the filming and sharing of violent incidents.

Developed by the Scottish Violence Reduction Unit (SVRU), YouthLink Scotland and Medics Against Violence (MAV), the new campaign is part of an action plan agreed in the Scottish Government’s Violence Prevention Framework, published in May last year.

It features a short awareness-raising animation, illustrating the digital world in which this content circulates and showing an alternative route to switch off from it. A set of memes has also been produced, covering a range of messages about why filming and sharing fights is damaging.

Young people helped develop the campaign through focus groups and feedback sessions involving various schools and youth groups, including pupils from Craigmount High School in Edinburgh.

The campaign has been launched as the first annual progress report for Scotland’s Violence Prevention Framework was published – highlighting progress made to help cut violent crime and reduce the harm it causes.

Key developments in 2023-24 include:

  • the creation, by the SVRU, of a Violence Anonymous group, the first of its kind in Scotland, to help individuals with significant problems turn their lives around
  • the extension of MAV’s hospital-based Navigator programme to reach young people in times of crisis, so they can receive support to steer them away from violence and harm
  • YouthLink Scotland’s training and resources, delivered to more than 700 practitioners across the country, equipping them to give young people key messages on violence and knife crime prevention

Speaking at the launch of the new campaign in Edinburgh, Minister for Victims and Community Safety Siobhian Brown said: “While social media can play a positive role in young people’s lives, helping them engage with their friends and family, it can also be a platform where violent imagery is spread. This campaign will encourage young people to switch off and not share harmful content.

“Scotland’s Violence Prevention Framework is making encouraging progress with a number of partner initiatives focused on prevention and early intervention so that communities across Scotland remain safe and more people live free from the threat of violence.”

Tim Frew, CEO of YouthLink Scotland, the national agency for youth work, said: “Young people have told us time and time again that they need help to navigate social media. It is crucial that adults who live and work with young people are confident in providing trusting and non-judgemental support.

“As the national agency for youth work, we are proud to have collaborated on this important campaign, embedding a youth work approach to the resources to start the conversation and upskill practitioners working with young people. By working and learning alongside young people, the toolkit supports young people to make informed, positive and, importantly, safe choices online.”

Prof Christine Goodall, Director and Founder of Medics Against Violence, said: “The use of social media to incite violence is something we couldn’t have anticipated 15 years ago but now we see that regularly along with the sharing of distressing images and videos of violence filmed in places that should be safe, such as school playgrounds and community public spaces.

“As health professionals we recognise the impact that may have on encouraging young people to get involved in violence, risking injury, and the long-term psychological impact on those filmed when their images are shared in the online space, without their consent or knowledge.

“This campaign is important to us because we understand from speaking to young people how conflicted they are about social media and the peer pressure they face to join in with image sharing activities. We wanted to produce something that would reflect their views and would support them to take a stand against activity which is both damaging and pervasive.”

Jimmy Paul, Head of the Scottish Violence Reduction Unit, said: “While the majority of young people in Scotland do not engage in the filming and sharing of violent incidents on social media, as part of our research for this campaign we listened to groups of young people about their experience.

“The Quit Fighting For Likes campaign aims to enable young people to look at attitudes and behaviours regarding social media while pointing towards the toolkit to equip those working with young people to help build positive social norms.”


First Minister: Social media companies need to address online hate

John Swinney says action is needed to address misinformation, racism and hateful online material

Following disorder in parts of the UK, First Minister John Swinney has written to X, Meta and TikTok to ask what action they are taking to combat the spread of misinformation, and to address racist and hateful material on their platforms:

From: First Minister John Swinney

To: X, Meta and TikTok

This week I met with representatives of faith and refugee groups to show solidarity with communities around the country. They were clear to me about the impact of social media in spreading misinformation, raising alarm and a sense of threat in their communities.

I also met with Scottish political party leaders and Police Scotland representatives to discuss the situation in Scotland and the rest of the UK.

Police Scotland described social media posts containing deliberate misinformation and provocative, incendiary language, some of it potentially meeting the threshold for charge under Scotland’s hate crime legislation, which came into effect in April this year.

It is clear to me that social media platforms have a duty to take action to ensure that individuals in our society are not subjected to hate and threatening behaviour, and that communities are protected from violent disorder.

I was struck by the communication from Ofcom this week reminding social media companies of their obligation to remove material that incites hatred or violence.

All political parties in Scotland stand together in resisting the prejudice and Islamophobia that we have seen on the streets in parts of the UK and online. 

Everyone has a role in stopping the spread of misinformation. You and your platform have a specific responsibility to do so.

I would therefore be grateful if you could outline the action you are taking to combat the spread of misinformation on your platform, and what steps are being taken to address racist and hateful speech across your platform. Given the seriousness of the situation, action needs to be immediate and decisive. 

Police Scotland has specifically raised with me concerns about the time it takes for problematic posts to be removed once they are identified by law enforcement agencies. This increases the risk of malicious content spreading. I would wish to understand the steps you are taking to address this, particularly for content that police identify as illegal or harmful.

I am copying this letter to Peter Kyle MP, the Secretary of State for Culture, Media and Sport.

Ofcom: Proposed measures to improve children’s online safety

As the UK’s online safety regulator, we have published a package of proposed measures that social media and other online services must take to improve children’s safety when they’re online:

In this article, we explain some of the main measures and the difference we expect them to make. Whether you are a parent, carer or someone working with children, this can help you understand what is happening to help children in the UK live safer lives online.

Protecting children is a priority

Protecting children so they can enjoy the benefits of being online, without experiencing the potentially serious harms that exist in the online world, is a priority for Ofcom.

We’re taking action – setting out proposed steps online services would need to take to keep kids safer online, as part of their duties under the Online Safety Act.

Under the Act social media apps, search and other online services must prevent children from encountering the most harmful content relating to suicide, self-harm, eating disorders, and pornography. They must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material, bullying content, and content promoting dangerous challenges.

What will companies have to do to protect children online?

Firstly, online services must establish whether children are likely to access their site – or part of it. Secondly, if children are likely to access it, the company must carry out a further assessment to identify the risks their service poses to children, including risks that come from the design of their services, their functionalities and algorithms. They then need to introduce various safety measures to mitigate these risks.



Our consultation proposes more than 40 safety measures that services would need to take – all aimed at making sure children enjoy safer screen time when they are online. These include:

  • Robust age checks – our draft Codes expect services to know which of their users are children in order to protect them from harmful content. In practice, this means that all services which don’t ban harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or age-restrict parts of it for adults-only access.
  • Safer algorithms – under our proposals, any service that has systems recommending personalised content to users and is at high risk of harmful content must design its algorithms to filter out the most harmful content from children’s feeds, and downrank other harmful content. Children must also be able to provide negative feedback so the algorithm can learn what content they don’t want to see.
  • Effective moderation – all services, including social media apps and search services, must have content moderation systems and processes that take quick action on harmful content, and large search services should use a ‘safe search’ setting for children which can’t be turned off and must filter out the most harmful content. Broader measures require services to have clear policies on what kind of content is allowed, how content is prioritised for review, and well-resourced, trained content moderation teams.

What difference will these measures make?

We believe these measures will improve children’s online experiences in a number of ways. For example:

  • Children will not normally be able to access pornography.
  • Children will be protected from seeing, and being recommended, potentially harmful content.
  • Children will not be added to group chats without their consent.
  • It will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on.

Our consultation follows proposals we’ve already published for how children should be protected from illegal content and activity such as grooming, child sexual exploitation and abuse, as well as how children should be prevented from accessing pornographic content.

Next steps

Our consultation is open until 17 July and we welcome any feedback on the proposals. We expect to finalise our proposals and publish our final statement and documents in spring next year.

Please submit responses using the consultation response form (ODT, 108.1 KB).

Have you booked a Brazilian Butt Lift?

Edinburgh residents are being urged to contact the City of Edinburgh Council if they have booked a procedure known as a Brazilian Butt Lift (BBL) in the Capital this weekend.

The Council’s Environmental Health team has been made aware that BBL procedures may be taking place from Friday 26 April to Sunday 28 April, booked by people responding to social media posts.

Other local authorities in the United Kingdom have received complaints after similar procedures were carried out, resulting in people suffering serious health complications such as sepsis.

Cllr Neil Ross, Convener of the Regulatory Committee at the City of Edinburgh Council, said: “We have been made aware that there may be procedures known as Brazilian Butt Lifts being performed in Edinburgh this weekend and we have concerns about the safety of such procedures.

“We are concerned about the potential risk to public health and would urge anyone who may have booked such a procedure this weekend to contact us as a matter of urgency.”

Anyone who may have a BBL procedure booked from Friday 26 April to Sunday 28 April in Edinburgh should e-mail environmentalhealth@edinburgh.gov.uk or phone 0131 200 2000.

Smartphone Free Childhood call for WhatsApp to reverse age reduction policy

Smartphone Free Childhood, the grassroots parents’ movement, has called on WhatsApp to reverse today’s change in age policy, which lowers the minimum age of use from 16 to 13 years old. 

As of April 11th, anyone in Europe over the age of 12 can now legally access the messaging service, after WhatsApp made a planned change to its age restriction policy. 

This comes despite growing national calls for greater protections for children around smartphone and social media use, including from the 60,000 parents who have joined Smartphone Free Childhood since it launched spontaneously eight weeks ago.

A recent nationwide poll found that 95% of parents said they wanted big tech companies to do more to protect their children, with 80% believing that age limits on social media were too low. 

Daisy Greenwell, co-founder of Smartphone Free Childhood, said: “WhatsApp are putting shareholder profits first and children’s safety second.

“Reducing their age of use from 16 to 13 years old is completely tone deaf and ignores the increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike. This policy boosts their user figures and maximises shareholder profits at the expense of children’s safety.

“Lowering their age restrictions from 16 to 13 sends a message to parents that WhatsApp is safe for those over the age of 12, and yet a growing body of research suggests otherwise.

“Meanwhile parents and teachers in the Smartphone Free Childhood community regularly reach out to us to share that it is a contributor to bullying, sleep disruption and access to harmful content.”

Meanwhile a growing body of research continues to raise serious questions about how suitable closed group messaging apps are for children and young teens. One recent study² found that 56% of students aged 8-18 reported that they had experienced cyberbullying in their class WhatsApp groups. 

Elsewhere, heavy use of screen media has been associated with shorter sleep duration and more mid-sleep awakening (in a study of more than 11,000 British children³) and many teachers have anecdotally reported to Smartphone Free Childhood that late night activity on WhatsApp is an increasing problem affecting children’s mood and ability to concentrate in class. 

Speaking about her recent report in partnership with the Association of School and College Leaders, Dr Kaitlyn Regehr, Associate Professor of Digital Humanities at UCL, said: “Our report shows that private, or closed, groups can enable more extreme material being shared, which in turn can have implications for young people’s offline behaviours.

“Young people increasingly exist within digital echo-chambers, which can normalise harmful rhetoric.”

Furthermore, numerous reports link WhatsApp to children accessing extreme content – including sexual imagery, self-harm material⁵ and videos of extreme violence such as beheadings and terrorist attacks. Studies show that nearly a quarter of people viewing such content on social media will experience symptoms of PTSD⁶.

Meanwhile, the end-to-end encryption on WhatsApp threatens children’s safety on the app, making it hard for parents to understand who their children are talking to and leaving them at risk of grooming by sexual predators.

One in ten children report⁷ using the messaging site to talk to people they don’t already know, and one in six 14-16 year-olds have received something distressing from a stranger on the site. 

Despite these significant concerns, WhatsApp have as yet given no indication of how they plan to protect all the new under-16 users on their site, or how they will comply with UK law to remove the millions of under-13s already on the platform. 

Child sexual abuse image crimes at record high in Scotland last year

  • Child sexual abuse image offences recorded by Police Scotland increased by 15 per cent between April 2022 and March 2023
  • NSPCC wants robust implementation of the Online Safety Act with Ofcom encouraged to strengthen its approach to tackling child sexual abuse
  • Meta urged to pause rollout of end-to-end encryption until plans for Facebook and Instagram can be risk assessed under new online safety regulations

The number of child sexual abuse image offences recorded by Police Scotland was at a record high last year – up by 15 per cent from the previous year, data analysed by the NSPCC has revealed.

A total of 765 offences in which child abuse images were collected and distributed were logged in 2022/23, according to Police Scotland data¹.  

Since 2017/18, when the NSPCC first called for social media regulation, a total of 3,877 crimes have been recorded while children and families have waited for online safety laws.

The charity said the figures show the need for swift and ambitious action by tech companies to address what is currently happening on their platforms and for Ofcom to significantly strengthen its approach to tackling child sexual abuse through effective enforcement of the Online Safety Act.

The figures come as insight from Childline shows young people being targeted by adults to share child sexual abuse images via social media and the calculated use of end-to-end encrypted private messaging apps by adults to find and share child abuse images.

A 14-year-old girl told the NSPCC-run service: “One night I got chatting with this guy online who I’d never met and he made me feel so good about myself. He told me he was 15, even though deep down I didn’t believe him.

“I sent him a couple of semi-nudes on Snap(chat), but then instantly regretted it. I asked him to delete the pics, but he just kept on making me do stuff for him not to post them – like getting me to strip live on camera. I just want to block him, but if I block him he will just post the pictures.”

A 15-year-old boy told Childline: “A while ago I saw a video on YouTube about how a guy was busting paedophiles and creeps on the internet by pretending to be a kid, and I kind of wanted to do a similar thing.

“I looked around Instagram for the creepiest accounts about kids my age and younger. In the end, I came across this link on one of their stories. It’s a link to a WhatsApp group chat in which [child sexual abuse material] is sent daily! There are literally hundreds of members in this group chat and they’re always calling the kids ‘hot’ and just being disgusting.”

  1. Police Scotland recorded crime data on the Scottish Government website.
Police Force | 2017/18 | 2018/19 | 2019/20 | 2020/21 | 2021/22 | 2022/23 | Total
Scotland | 658 | 554 | 584 | 660 | 662 | 765 | 3,877

Online Safety Act implementation

The NSPCC said that disrupting online child sexual abuse taking place at increasing levels will require regulated tech platforms to introduce systemic changes to their products to stop them being used to organise, commit, and share child abuse.

A consultation into Ofcom’s first codes for companies to adopt to disrupt child sexual abuse on their platforms closed last week.

The NSPCC wants these measures introduced without delay but has urged Ofcom to begin work on a second version of the codes that will require companies to go much further.

The charity said companies should be required to use technology that can help identify and tackle grooming, sextortion and new child abuse images.

They also want tougher measures for private messaging services to make child protection a priority, including in end-to-end encrypted environments.

The NSPCC warned that Meta’s roll-out of end-to-end encryption on Facebook and Instagram will prevent authorities from identifying offenders and safeguarding victims.

The charity wants plans paused until Meta can prove child safety will not be compromised, and has urged all parties to find a balance between the safety and privacy of all users, including children. The NSPCC said further rollout should be delayed until Ofcom can study Meta’s risk assessment as part of the new regulatory regime.

Sir Peter Wanless, NSPCC Chief Executive, said: “It’s alarming to see online child abuse continue to rise, especially when tech companies should be acting to make their sites safe by design ahead of incoming regulation.

“Behind these crimes are children who have been targeted by adults who are able to organise and share sexual abuse with other offenders seamlessly across social media and messaging apps.

“The Online Safety Act sets out robust measures to make children fundamentally safer on the sites and apps they use so they can enjoy the benefits of a healthy online experience.

“Ofcom has been quick off the blocks but must act with greater ambition to ensure companies prioritise child safety in the comprehensive way that is so desperately needed.”

Susie Hargreaves OBE, Chief Executive of the Internet Watch Foundation, the UK’s front line against child sexual abuse imagery online, said: “This is a truly disturbing picture, and a reflection of the growing scale of the availability, and demand, for images and videos of children suffering sexual abuse.

“The people viewing and sharing and distributing this material need to know it is not a victimless crime. They are real children, suffering real abuse and sexual torture, the effects of which can linger a lifetime.

“That more and more people are trying to share and spread this material shows we should all be doing everything we can to stop this, building more, and more innovative, solutions to keep children safe.

“The IWF is ready to support technology companies and Ofcom in implementing the Online Safety Act to help make the UK the safest place in the world to be online.”

Hold the front page: Rebrand for Scottish Newspaper Society

The trade association for Scotland’s news publishers, the Scottish Newspaper Society, has been renamed Newsbrands Scotland, dropping “newspaper” from its title for the first time in its 108-year history.

While printed newspapers remain an important part of news publishers’ operations, the name change reflects modern newsrooms which reach far bigger audiences through digital platforms, with news operations working round the clock to deliver news to readers in the format they want, when they want it.

Newsbrands Scotland’s inaugural president, National World plc’s chief commercial officer Mark Hollinshead, said: “Our newsbrands reach more people than they ever did in the print-only days and the new name of our trade association reflects the multi-platform reality of the modern newsroom.”

Industry research [TGI, December 2022] shows that nine out of ten adults in Scotland engage with print or digital newsbrands at least once a week, and that they are seven per cent more likely than UK adults overall to rely on newspapers to stay informed. And according to the latest JICREG analysis, 96 per cent of Scots read a local brand once a month.

Mark added: “Ever since the advent of the internet, Scottish news publishers have been evolving and innovating to keep their audiences well-served with up-to-the-minute, trusted information and analysis, and the audience figures speak for themselves.

“Scottish newsbrands keep communities across the country well-informed and connected, have a vital role to play in holding national and local politicians to account for the decisions they make, and are an essential means for services, businesses and charities to communicate with their users and customers.”

Further research from the news industry marketing body Newsworks reveals people are 2.4 times more likely to find news brands reliable than social media, and three-quarters believe it is important their news comes from a respected and recognised news provider.

Newsbrands Scotland director John McLellan said: “Our titles continue to provide a depth and breadth of coverage that few, if any, networks can match, and the fact that all our members are independently regulated is also vital for maintaining public trust.

“Readers want to know they are being provided with professionally produced news, and our commercial partners benefit because readers recognise they are in a trusted environment.

“News publishers also continue to support and train the journalists of the future, and it’s important for our name to reflect an industry that is always looking forward.”

The rebranding project was a collaborative effort across member companies, with the branding design produced by DC Thomson and the marketing campaign devised by National World, with input from News UK and Newsquest Scotland.

“This was a very good example of publishers working together for the benefit of the whole sector in Scotland, whether society members or not,” added John McLellan.

Online Safety Bill ready to become law

  • The Online Safety Bill has been signed off by the Houses of Parliament and will become law soon
  • The bill will make the UK the safest place in the world to be online by placing new duties on social media companies – honouring our manifesto commitment
  • The bill has been strengthened through parliamentary debate, with firmer protections for children, more control for adults and clarity for social platforms

The Online Safety Bill has passed its final Parliamentary debate and is now ready to become law.

This major milestone means the government is within touching distance of delivering the most powerful child protection laws in a generation, while ensuring adults are better empowered to take control of their online lives and protect their mental health.

The bill takes a zero-tolerance approach to protecting children and makes sure social media platforms are held responsible for the content they host. If they do not act rapidly to prevent and remove illegal content and stop children seeing material that is harmful to them, such as bullying, they will face significant fines that could reach billions of pounds. In some cases, their bosses may even face prison.

The bill has undergone considerable parliamentary scrutiny in both Houses and has come out with stronger protections for all.

Technology Secretary Michelle Donelan said: “The Online Safety Bill is a game-changing piece of legislation. Today, this government is taking an enormous step forward in our mission to make the UK the safest place in the world to be online.

“I am immensely proud of what we have achieved with this bill. Our common-sense approach will deliver a better future for British people, by making sure that what is illegal offline is illegal online. It puts protecting children first, enabling us to catch keyboard criminals and crack down on the heinous crimes they seek to commit.

“I am deeply thankful to the tireless campaigning and efforts of parliamentarians, survivors of abuse and charities who have all worked relentlessly to get this bill to the finish line.”

Without this groundbreaking legislation, the safety of children across the country would be at stake and the internet would remain a wild west of content, putting children’s lives and mental health at risk. The bill takes a zero-tolerance approach to protecting children, meaning social media platforms will be legally responsible for the content they host and for keeping children and young people safe online.

Social media platforms will be expected to:

  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content
  • enforce age limits and age-checking measures
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise

In addition to its firm protections for children, the bill empowers adults to take control of what they see online. It provides three layers of protection for internet users which will:

  1. Make sure illegal content will have to be removed
  2. Place a legal responsibility on social media platforms to enforce the promises they make to users when they sign up, through terms and conditions
  3. Offer users the option to filter out harmful content, such as bullying, that they do not want to see online

If social media platforms do not comply with these rules, Ofcom could fine them up to £18 million or 10% of their global annual revenue, whichever is greater – meaning fines handed down to the biggest platforms could reach billions of pounds.

Also added to the bill are new laws to decisively tackle online fraud and violence against women and girls. Through this legislation, it will be easier to convict someone who shares intimate images without consent and new laws will further criminalise the non-consensual sharing of intimate deepfakes.

The change in the law will make it easier to charge abusers who share intimate images, put more offenders behind bars and better protect the public. Those found guilty of the base offence face a maximum penalty of six months in custody.

Former Love Island star and campaigner Georgia Harrison said: “Violence against women and girls is so common, with one in three women in the UK having experienced online abuse or harassment.

“The Online Safety bill is going to help bring this to an end, by holding social media companies accountable to protect women and girls from online abuse.”

Under the bill, the biggest social media platforms will have to stop users being exposed to dangerous fraudulent adverts by blocking and removing scams, or face Ofcom’s huge new fines.

The government has recently strengthened the bill even further, by amending the law to force social media firms to prevent activity that facilitates animal cruelty and torture (such as paying or instructing torture). Even if this activity takes place outside the UK but is seen by users here, companies will be forced to take it down.

Anticipating the bill coming into force, the biggest social media companies have already started to take action. Snapchat has started removing the accounts of underage users and TikTok has implemented stronger age verification.

Ofcom Chief Executive, Dame Melanie Dawes said: “Today is a major milestone in the mission to create a safer life online for children and adults in the UK. Everyone at Ofcom feels privileged to be entrusted with this important role, and we’re ready to start implementing these new laws.

“Very soon after the bill receives Royal Assent, we’ll consult on the first set of standards that we’ll expect tech firms to meet in tackling illegal online harms, including child sexual exploitation, fraud and terrorism.”

While the bill has been in progress, the government has been working closely with Ofcom to ensure changes will be implemented as quickly as possible when it becomes law.

The regulator will immediately begin work on tackling illegal content and protecting children’s safety, with its consultation process launching in the weeks after Royal Assent. It will then take a phased approach to bringing the Online Safety Bill into force.

Passing of the Online Safety Bill ‘a momentous day for children’ says NSPCC chief

  • The Online Safety Bill will finally require tech companies to make their sites safe by design for children
  • New laws come after years of campaigning and robust political scrutiny in Parliament
  • NSPCC CEO Sir Peter Wanless thanks survivors and bereaved parents and urges tech companies to seize the opportunity offered by regulation
  • Survivors of online child abuse tell how the Online Safety Bill will help prevent further avoidable harm to countless other children
  • Charity releases video with young people welcoming the Online Safety Bill

The NSPCC has welcomed the passing of the Online Safety Bill, a ground-breaking piece of legislation they say will radically change the landscape for children online.

After years of campaigning, tech companies will now have a legal duty to protect children from sexual abuse and harmful material on social media sites, gaming apps and messaging services.

The UK Government first promised regulation to help protect children online at the NSPCC’s annual conference in 2018, following the launch of the charity’s Wild West Web campaign.

The charity has helped strengthen the legislation during its long journey through UK Parliament, ensuring that it results in regulation that comprehensively protects children online.

The charity says the legislation will mark a new era for children’s safety at a time when online child abuse offences are at a record high and children continue to be bombarded with harmful suicide and self-harm content on social media.

In August this year, NSPCC Scotland revealed that more than 3,500 online grooming crimes* had been recorded by Police Scotland over the past six years while the legislation was being discussed. Last year (2022/23), 593 Communicating Indecently with a Child offences were recorded in Scotland, with more than half of the offences against children under the age of 13.  

The Online Safety Bill was published in May 2021 and has been subject to robust scrutiny and debate by MPs, Lords and civil society.

Its importance was starkly highlighted by the inquest into the death of 14-year-old Molly Russell in September last year, which ruled that the self-harm and suicide content that Molly had been recommended on social media had contributed to her death.

Ruth Moss, whose 13-year-old daughter Sophie died by suicide after viewing harmful content online, joined forces with Molly’s father Ian Russell and other parents whose children died following exposure to harmful content online, to form the Bereaved Parents for Online Safety group to strengthen the protections in the Bill.

The Edinburgh nurse has been campaigning with the NSPCC for several years for robust new legislation that would force tech bosses to make their sites safe for children.

Ruth said: “I’m pleased that the Bill has passed. I have always argued that self-regulation by tech companies hasn’t worked. Numerous families, including mine, have highlighted these failings over many years. So, I welcome the bill wholeheartedly. It is a big step in offering protection to children online.

“For at least two years, we struggled to keep my daughter Sophie safe online. In spite of removing devices, restricting internet use, implementing parental controls and having conversations about internet safety, these measures were not enough to prevent her from being exposed to websites that promoted self-harm and suicide and contained dark, graphic, harmful material. Complaining to internet and social media companies was either impossible or futile.

“The impact of Sophie viewing this harmful material was a deterioration in her existing mental health struggles, with devastating consequences. Sophie was 13 years old when she died by suicide. We will never truly recover from her death, and it is rightly every parent’s worst nightmare.

“This Online Safety Bill may not solve all the issues that children have online. But it is essential to start regulating online platforms. They have a duty of care to keep their users safe to the best of their ability.

“Of course, I do have some reservations about the Online Safety Bill. It is a new piece of legislation that has not had the chance to be tested – so there will be some unknowns. And no piece of legislation will be perfect. We will only really know if the legislation goes far enough over time. 

“The Bill will need to stay relevant. Like any new law, it will develop over time with the changing environments in which we live. Technology changes and, with it, the legislation around it will need to keep up. But this is a good first step. It sends a message to tech companies that safety should not be compromised for the sake of profit and that tech companies cannot deny responsibility for keeping their service users safe on their websites.

“In my opinion, the enforcement of the Bill is key. This will be challenging. It will require Ofcom going up against some of the most powerful and influential organisations in the world. Ofcom will have a difficult job. Currently, I have confidence that they will do what is necessary to uphold the legislation where needed, however time will tell.

“As with any piece of complex legislation, there were amendments that did not get passed – around ‘legal but harmful’ content for adults, the appointment of a children’s advocate and other areas that I would like to have seen included in the bill. But again, no Bill is perfect, and I am pleased to see it passed.”

The Online Safety Bill has been shaped in large part by survivors of abuse, bereaved parents and young people themselves who have campaigned tirelessly to ensure the legislation leads to real-world change for children.

Aoife (19) from East Kilbride, South Lanarkshire, was 15 when she was exploited by a man online who pretended to be a teenager. She said: “He convinced me to send him photos and then blackmailed me with them.

“It was terrifying but luckily I knew to report to Child Exploitation and Online Protection (CEOP) and he was eventually convicted.

“I know this kind of thing happens all the time – we need the new law to stop it from hurting more lives.”

Sir Peter Wanless, NSPCC Chief Executive, said: “We are absolutely delighted to see the Online Safety Bill being passed through Parliament. It is a momentous day for children and will finally result in the ground-breaking protections they should expect online.

“At the NSPCC we hear from children about the completely unacceptable levels of abuse and harm they face online every day. That’s why we have campaigned strongly for change alongside brave survivors, families, young people and parliamentarians to ensure the legislation results in a much safer online world for children.

“Children can benefit greatly from life online. Tech companies can now seize the opportunity to embrace safety by design. The NSPCC is ready to help them listen to and understand the online experiences of their young users to help ensure every child feels safe and empowered online.”

The NSPCC’s commitment to protect children online does not end with the passing of the Bill and the charity will continue to advocate to ensure it results in truly safe online spaces for children.

Online Safety Bill Timeline:

  • 2014 NSPCC launches a campaign calling for a new offence to make online grooming a crime, by making it illegal for an adult to send a child a sexual message. 50,000 people signed our petition
  • 2015 The Government included the offence in the Serious Crime Act 2015, but it took two more years of sustained campaigning before they finally brought the offence into force so police could take action and arrest offenders
  • April 2017 Sexual Communication with a Child became an offence
  • April 2017 The NSPCC first called on Government to bring in statutory regulation of social networks
  • Dec 2017 NSPCC call for tech companies to have a legal duty of care to keep children safe
  • April 2018 Launch of NSPCC’s Wild West Web campaign 
  • June 2018 Following an NSPCC campaign, then Culture Secretary Matt Hancock commits to legislate to protect children  
  • Feb 2019 Taming the Wild West Web was published outlining a plan for regulation 
  • April 2019 Government publishes the Online Harms White Paper 
  • January 2020 Online Harms paving bill, prepared by the Carnegie Trust and introduced by Lord McNally, was selected for its first reading in the Lords 
  • February 2020 Government publishes its initial response to the Online Harms White Paper consultation, announcing Ofcom as the likely watchdog 
  • September 2020 NSPCC sets out six tests for the Online Harms Bill in its Regulatory Framework 
  • December 2020 Government published its Online Harms White Paper consultation response  
  • March 2021 NSPCC analysis of the consultation response found significant improvement is needed in a third of areas of proposed legislation if the Online Safety Bill is to extensively protect children from avoidable harm and abuse 
  • May 2021 Government publishes draft Online Safety Bill  
  • September 2021 Parliamentary scrutiny begins, and the NSPCC publish Duty to Protect – An assessment of the draft Online Safety Bill against the NSPCC’s six tests for protecting children 
  • October 2021 Facebook whistleblower Frances Haugen gives evidence to the Joint Committee on the Draft Online Safety Bill 
  • December 2021 The joint committee on the Draft Online Safety Bill call for a number of changes to the legislation to better prevent child abuse 
  • January 2022 DCMS Committee back NSPCC’s call for the Online Safety Bill to put a legal duty on tech firms to disrupt the way offenders game social media design features to organise around material that facilitates abuse  
  • January 2022 The Petitions Committee also called for the Online Safety Bill to be strengthened 
  • March 2022 NSPCC urge Government to listen to the overwhelming consensus of Parliamentarians, civil society and the UK public to close significant gaps in the way Online Safety Bill responds to grooming 
  • March 2022 The Government publishes the Online Safety Bill 
  • April 2022 Online Safety Bill has its Second Reading and NSPCC publish its Time to Act report which sets out where the Bill can be improved to become a world-leading response to the online child abuse threat 
  • May 2022 Online Safety Bill Committee Stage begins 
  • July 2022 Online Safety Bill Report Stage debates  
  • Summer 2022 Online Safety Bill delayed by two changes of Prime Minister
  • September 2022 Inquest into the death of 14-year-old Molly Russell finds social media contributed to her death
  • December 2022 Online Safety Bill returns to Parliament
  • December 2022 Bereaved Families for Online Safety formed to campaign for strong protections for children and families through the Online Safety Bill
  • January 2023 Conservative MP rebellion backs NSPCC amendment that forces Government to commit to holding senior tech managers liable for harm to children
  • January 2023 Online Safety Bill begins its journey through the House of Lords
  • Spring 2023 Government amendments strengthen protections for children following campaigning by civil society, including NSPCC and Bereaved Families for Online Safety
  • September 2023 Online Safety Bill due its Third Reading in the House of Lords and to return to Parliament for final passage

New tech partnership with social media to ‘stop the boats’

  • Partnership with social media companies to clamp down on people smugglers’ operations online
  • Illegal crossings remain down on last year and returns are at their highest level since 2019
  • Extra funding and resources for law enforcement to tackle harmful content

A voluntary partnership between social media companies and government will accelerate action to tackle people smuggling content online, such as criminals sharing information about illegal Channel crossings, Prime Minister Rishi Sunak has announced today [Sunday 6th August].

It comes as new figures show the government continues to make progress on the Prime Minister’s plan to stop the boats: crossings remain down on last year, the legacy asylum backlog has been reduced by a third since December 2022, and enforced returns of people with no right to be in the UK are at their highest level since 2019.

While figures from the NCA show that over 90% of online content linked to people smuggling is taken down when social media companies are notified, the partnership between tech firms and government will drive forward efforts to clamp down on the tactics being used by criminal gangs who use the internet to lure people into paying for crossings.

This content can include discount offers for groups of people, free spaces for children, offers of false documents and false claims of safe passage – targeting vulnerable people for profit and putting people’s lives at risk through dangerous and illegal journeys.

Prime Minister Rishi Sunak said: “To stop the boats, we have to tackle the business model of vile people smugglers at source.

“That means clamping down on their attempts to lure people into making these illegal crossings and profit from putting lives at risk.

“This new commitment from tech firms will see us redouble our efforts to fight back against these criminals, working together to shut down their vile trade.”

Home Secretary Suella Braverman said: “Heartless people smugglers are using social media to promote their despicable services and charge people thousands of pounds to make the illegal journey into the UK in unsafe boats.

“They must not succeed.

“This strengthened collaboration between the National Crime Agency, government and social media companies will ensure content promoting dangerous and illegal Channel crossings doesn’t see the light of day.”

The partnership will build on the close working already in place between government and social media companies, and includes a range of commitments to explore increased collaboration.

Under this initiative, social media companies will look to increase cooperation with the National Crime Agency to find and remove criminal content and step up the sharing of best practice both across the industry and with law enforcement.

The voluntary partnership also includes a commitment to explore ways to step up efforts to redirect people away from this content when they come across it online. This approach is already being used successfully by platforms, for example around harmful content promoting extremism or eating disorders, where people are presented with alternative messages to displace, rebut or undermine the damaging content they searched for – diverting them away from harmful messaging and misinformation.

Alongside the partnership, the government will also set up a new centre led by the National Crime Agency and Home Office to increase the capacity and capability of law enforcement to identify this content on social media platforms.

Known as the ‘Online Capability Centre’, backed by £11m funding, its work will focus on undermining and disrupting the business model of organised crime groups responsible for illegal crossings and using the internet to facilitate these journeys by intensifying efforts to combat their online activity.

The centre will be staffed by highly trained technical specialists alongside law enforcement officers and will work by building a clearer picture of the scale of illegal immigration material online.

They will work with internet companies to identify more of this material, notifying platforms so they can take the appropriate action. The centre will also focus on developing and building a bank of intelligence around the criminal networks who are promoting people smuggling services online, which will help improve law enforcement’s ability to identify content and in turn help drive investigations.

To harness the potential of new technology such as AI to clamp down on criminals’ content, government will also hold a ‘hackathon’ event with industry experts in order to develop innovative new tools which will better detect people smugglers’ publicly available content online, to help social media companies take it down more quickly.

Government will also intensify the existing work taking place with social media companies ahead of the Online Safety Bill coming into effect.

Once in force, under the Bill social media companies will be required to make sure their systems and processes are designed to prevent people coming into contact with illegal content created by people smugglers, minimise how long this content is available online and remove it as soon as possible once they become aware of it.

Alongside this, the Bill also requires major platforms to publish annual transparency reports setting out what they’re doing to tackle online harms. This could include information on how content relating to illegal migration spreads across platforms, how frequently it is uploaded, and what systems and processes companies have in place to deal with this kind of content.

The partnership confirmed today also builds on the work of the “Social Media Action Plan”, a voluntary agreement between the Home Office, National Crime Agency and five major social media platforms in 2021 to increase understanding of how organised criminals used their platforms to promote illegal services.

To date, this cooperation has seen more than 4,700 posts, pages or accounts removed or suspended, increasing disruption of organised crime groups’ activity, and today’s partnership will drive further progress.

Stopping the boats is one of the Prime Minister’s top five priorities and the government is fully focused on delivering his whole-system plan to tackle illegal migration. This includes:

  • stepping up law enforcement activity, with 50% more illegal working visits carried out in the first half of this year compared to the first half of last year
  • tackling the legacy asylum backlog, which has reduced by nearly a third since the end of December
  • passing the Illegal Migration Act which will ensure that people who come to the UK illegally will be detained and swiftly removed.

Working with international partners to tackle this global challenge is another key strand of efforts to stop the boats, and since taking office the PM has secured new agreements with allies, including strengthened partnerships with France and Albania, which will see 40% more patrols on French beaches and have already resulted in a 90% drop in Albanian small boat arrivals in the first quarter of 2023 compared with the same period last year.