Online Safety Bill ready to become law

  • The Online Safety Bill has been signed off by the Houses of Parliament and will become law soon
  • the bill will make the UK the safest place in the world to be online by placing new duties on social media companies – honouring our manifesto commitment
  • the bolstered bill has been strengthened through debate, with firmer protections for children, more control for adults and clarity for social platforms

The Online Safety Bill has passed its final Parliamentary debate and is now ready to become law.

This major milestone means the government is within touching distance of delivering the most powerful child protection laws in a generation, while ensuring adults are better empowered to take control of their online lives and protect their mental health.

The bill takes a zero-tolerance approach to protecting children and makes sure social media platforms are held responsible for the content they host. If they do not act rapidly to prevent and remove illegal content and stop children seeing material that is harmful to them, such as bullying, they will face significant fines that could reach billions of pounds. In some cases, their bosses may even face prison.

The bill has undergone considerable parliamentary scrutiny in both Houses and has come out with stronger protections for all.

Technology Secretary Michelle Donelan said: “The Online Safety Bill is a game-changing piece of legislation. Today, this government is taking an enormous step forward in our mission to make the UK the safest place in the world to be online.

“I am immensely proud of what we have achieved with this bill. Our common-sense approach will deliver a better future for British people, by making sure that what is illegal offline is illegal online. It puts protecting children first, enabling us to catch keyboard criminals and crack down on the heinous crimes they seek to commit.

“I am deeply thankful to the tireless campaigning and efforts of parliamentarians, survivors of abuse and charities who have all worked relentlessly to get this bill to the finish line.”

Without this groundbreaking legislation, the safety of children across the country would be at stake and the internet would remain a wild west of content, putting children’s lives and mental health at risk. The bill has a zero-tolerance approach to protecting children, meaning social media platforms will be legally responsible for the content they host and for keeping children and young people safe online.

Social media platforms will be expected to:

  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content
  • enforce age limits and age-checking measures
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise

In addition to its firm protections for children, the bill empowers adults to take control of what they see online. It provides three layers of protection for internet users which will:

  1. Make sure illegal content will have to be removed
  2. Place a legal responsibility on social media platforms to enforce the promises they make to users when they sign up, through terms and conditions
  3. Offer users the option to filter out harmful content, such as bullying, that they do not want to see online

If social media platforms do not comply with these rules, Ofcom could fine them up to £18 million or 10% of their global annual revenue, whichever is greater – meaning fines handed down to the biggest platforms could reach billions of pounds.

Also added to the bill are new laws to decisively tackle online fraud and violence against women and girls. Through this legislation, it will be easier to convict someone who shares intimate images without consent and new laws will further criminalise the non-consensual sharing of intimate deepfakes.

The change in the law will make it easier to charge abusers who share intimate images, putting more offenders behind bars and better protecting the public. Those found guilty of this base offence will face a maximum penalty of 6 months in custody.

Former Love Island star and campaigner Georgia Harrison said: “Violence against women and girls is so common, with one in three women in the UK having experienced online abuse or harassment.

“The Online Safety Bill is going to help bring this to an end, by holding social media companies accountable for protecting women and girls from online abuse.”

Under the bill, the biggest social media platforms will have to stop users being exposed to dangerous fraudulent adverts by blocking and removing scams, or face Ofcom’s huge new fines.

The government has recently strengthened the bill even further, by amending the law to force social media firms to prevent activity that facilitates animal cruelty and torture (such as paying or instructing torture). Even if this activity takes place outside the UK but is seen by users here, companies will be forced to take it down.

Anticipating the bill coming into force, the biggest social media companies have already started to take action. Snapchat has started removing the accounts of underage users and TikTok has implemented stronger age verification.

Ofcom Chief Executive, Dame Melanie Dawes said: “Today is a major milestone in the mission to create a safer life online for children and adults in the UK. Everyone at Ofcom feels privileged to be entrusted with this important role, and we’re ready to start implementing these new laws.

“Very soon after the bill receives Royal Assent, we’ll consult on the first set of standards that we’ll expect tech firms to meet in tackling illegal online harms, including child sexual exploitation, fraud and terrorism.”

While the bill has been in progress, the government has been working closely with Ofcom to ensure changes will be implemented as quickly as possible when it becomes law.

The regulator will immediately begin work on tackling illegal content and protecting children’s safety, with its consultation process launching in the weeks after Royal Assent. It will then take a phased approach to bringing the Online Safety Bill into force.

Passing of the Online Safety Bill ‘a momentous day for children’ says NSPCC chief

  • The Online Safety Bill will finally require tech companies to make their sites safe by design for children
  • New laws come after years of campaigning and robust political scrutiny in Parliament
  • NSPCC CEO Sir Peter Wanless thanks survivors and bereaved parents and urges tech companies to seize the opportunity offered by regulation
  • Survivors of online child abuse tell how the Online Safety Bill will address further preventable harm to countless other children
  • Charity releases video with young people welcoming the Online Safety Bill

The NSPCC has welcomed the passing of the Online Safety Bill, a ground-breaking piece of legislation they say will radically change the landscape for children online.

After years of campaigning, tech companies will now have a legal duty to protect children from sexual abuse and harmful material on social media sites, gaming apps and messaging services.

The UK Government first promised regulation to help protect children online at the NSPCC’s annual conference in 2018, following the launch of the charity’s Wild West Web campaign.

The charity has helped strengthen the legislation during its long journey through UK Parliament, ensuring that it results in regulation that comprehensively protects children online.

The charity says the legislation will mark a new era for children’s safety at a time when online child abuse offences are at a record high and children continue to be bombarded with harmful suicide and self-harm content on social media.

In August this year, NSPCC Scotland revealed that more than 3,500 online grooming crimes had been recorded by Police Scotland over the past six years while the legislation was being discussed. Last year (2022/23), 593 Communicating Indecently with a Child offences were recorded in Scotland, with more than half of the offences against children under the age of 13.

The Online Safety Bill was published in May 2021 and has been subject to robust scrutiny and debate by MPs, Lords and civil society.

Its importance was starkly highlighted by the inquest into the death of 14-year-old Molly Russell in September last year, which ruled that the self-harm and suicide content that Molly had been recommended on social media had contributed to her death.

Ruth Moss, whose 13-year-old daughter Sophie died by suicide after viewing harmful content online, joined forces with Molly’s father Ian Russell and other parents whose children died following exposure to harmful content online, to form the Bereaved Parents for Online Safety group to strengthen the protections in the Bill.

The Edinburgh nurse has been campaigning with the NSPCC for several years for robust new legislation that would force tech bosses to make their sites safe for children.

Ruth said: “I’m pleased that the Bill has passed. I have always argued that self-regulation by tech companies hasn’t worked. Numerous families, including mine, have highlighted these failings over many years. So, I welcome the bill wholeheartedly. It is a big step in offering protection to children online.

“For at least two years, we struggled to keep my daughter Sophie safe online. In spite of removing devices, restricting internet use, implementing parental controls and having conversations about internet safety, these were not enough to prevent her from being exposed to websites that promoted self-harm, suicide and contained dark, graphic, harmful material. Complaining to internet and social media companies was either impossible or futile.

“The impact of Sophie viewing this harmful material was a deterioration in her existing mental health struggles, with devastating consequences. Sophie was 13 years old when she died by suicide. We will never truly recover from her death, and it is rightly every parent’s worst nightmare.

“This Online Safety Bill may not solve all the issues that children have online. But it is essential to start regulating online platforms. They have a duty of care to keep their users safe to the best of their ability.

“Of course, I do have some reservations about the Online Safety Bill. It is a new piece of legislation that has not had the chance to be tested – so there will be some unknowns. And no piece of legislation will be perfect. We will only really know if the legislation goes far enough over time. 

“The Bill will need to stay relevant. If we look at other new laws, they develop over time with the changing environments in which we live. Technology changes and with it, the legislation around it will need to keep up. But this is a good first step. It sends a message to tech companies that safety should not be compromised for the sake of profit and that tech companies cannot deny responsibility for keeping their service users safe on their websites.

“In my opinion, the enforcement of the Bill is key. This will be challenging. It will require Ofcom going up against some of the most powerful and influential organisations in the world. Ofcom will have a difficult job. Currently, I have confidence that they will do what is necessary to uphold the legislation where needed, however time will tell.

“As with any piece of complex legislation, there were amendments that did not get passed, on ‘legal but harmful’ content for adults, the appointment of a children’s advocate and other areas that I would like to have seen included in the bill. But again, no Bill is perfect, and I am pleased to see it passed.”

The Online Safety Bill has been shaped in large part by survivors of abuse, bereaved parents and young people themselves who have campaigned tirelessly to ensure the legislation leads to real-world change for children.

Aoife (19) from East Kilbride, South Lanarkshire, was 15 when she was exploited by a man online who pretended to be a teenager. She said: “He convinced me to send him photos and then blackmailed me with them.

“It was terrifying but luckily I knew to report to Child Exploitation and Online Protection (CEOP) and he was eventually convicted.

“I know this kind of thing happens all the time – we need the new law to stop it from hurting more lives.”

Sir Peter Wanless, NSPCC Chief Executive, said: “We are absolutely delighted to see the Online Safety Bill being passed through Parliament. It is a momentous day for children and will finally result in the ground-breaking protections they should expect online.

“At the NSPCC we hear from children about the completely unacceptable levels of abuse and harm they face online every day. That’s why we have campaigned strongly for change alongside brave survivors, families, young people and parliamentarians to ensure the legislation results in a much safer online world for children.

“Children can benefit greatly from life online. Tech companies can now seize the opportunity to embrace safety by design. The NSPCC is ready to help them listen to and understand the online experiences of their young users to help ensure every child feels safe and empowered online.”

The NSPCC’s commitment to protect children online does not end with the passing of the Bill and the charity will continue to advocate to ensure it results in truly safe online spaces for children.

Online Safety Bill Timeline:

  • 2014 NSPCC launches a campaign calling for a new offence to make online grooming a crime, by making it illegal for an adult to send a child a sexual message. 50,000 people signed the charity’s petition
  • 2015 The Government included the offence in the Serious Crime Act 2015, but it took two more years of sustained campaigning before they finally brought the offence into force so police could take action and arrest offenders
  • April 2017 Sexual Communication with a Child became an offence
  • April 2017 The NSPCC first called on Government to bring in statutory regulation of social networks
  • Dec 2017 NSPCC call for tech companies to have a legal duty of care to keep children safe
  • April 2018 Launch of NSPCC’s Wild West Web campaign 
  • June 2018 Following an NSPCC campaign, then Culture Secretary Matt Hancock commits to legislate to protect children  
  • Feb 2019 Taming the Wild West Web was published outlining a plan for regulation 
  • April 2019 Government publishes the Online Harms White Paper 
  • January 2020 Online Harms paving bill, prepared by the Carnegie Trust and introduced by Lord McNally, was selected for its first reading in the Lords 
  • February 2020 Government publish initial consultation to the Online Harms White Paper, announcing Ofcom as the likely watchdog 
  • September 2020 NSPCC sets out six tests for the Online Harms Bill in its Regulatory Framework 
  • December 2020 Government published its Online Harms White Paper consultation response  
  • March 2021 NSPCC analysis of the consultation response found significant improvement is needed in a third of areas of proposed legislation if the Online Safety Bill is to extensively protect children from avoidable harm and abuse 
  • May 2021 Government publishes draft Online Safety Bill  
  • September 2021 Parliamentary scrutiny begins, and the NSPCC publish Duty to Protect – An assessment of the draft Online Safety Bill against the NSPCC’s six tests for protecting children 
  • October 2021 Facebook whistleblower Frances Haugen gives evidence to the Joint Committee on the Draft Online Safety Bill 
  • December 2021 The joint committee on the Draft Online Safety Bill call for a number of changes to the legislation to better prevent child abuse 
  • January 2022 DCMS Committee back NSPCC’s call for the Online Safety Bill to put a legal duty on tech firms to disrupt the way offenders game social media design features to organise around material that facilitates abuse  
  • January 2022 The Petitions Committee also called for the Online Safety Bill to be strengthened 
  • March 2022 NSPCC urge Government to listen to the overwhelming consensus of Parliamentarians, civil society and the UK public to close significant gaps in the way Online Safety Bill responds to grooming 
  • March 2022 The Government publishes the Online Safety Bill 
  • April 2022 Online Safety Bill has its Second Reading and NSPCC publish its Time to Act report which sets out where the Bill can be improved to become a world-leading response to the online child abuse threat 
  • May 2022 Online Safety Bill Committee Stage begins 
  • July 2022 Online Safety Bill Report Stage debates  
  • Summer 2022 Online Safety Bill delayed by two changes of Prime Minister
  • September 2022 Inquest into the death of 14-year-old Molly Russell finds social media contributed to her death
  • December 2022 Online Safety Bill returns to Parliament
  • December 2022 Bereaved Families for Online Safety formed to campaign for strong protections for children and families through the Online Safety Bill
  • January 2023 Conservative MP rebellion backs NSPCC amendment that forces Government to commit to holding senior tech managers liable for harm to children
  • January 2023 Online Safety Bill begins its journey through the House of Lords
  • Spring 2023 Government amendments strengthen protections for children following campaigning by civil society, including NSPCC and Bereaved Families for Online Safety
  • September 2023 Online Safety Bill due to have its Third Reading in the House of Lords and to return to Parliament for final passage

More than 3,500 online grooming crimes against children recorded by Police Scotland while safety laws discussed

  • NSPCC urges tech companies and MPs to back Online Safety Bill following new research on scale of online grooming
  • Primary school children targeted in more than half of online grooming crimes in Scotland since social media regulation was first demanded

More than 3,500 online grooming crimes have been recorded by Police Scotland while children have been waiting for online safety laws, new figures published by the NSPCC reveal today.

Data from Police Scotland shows 593 Communicating Indecently with a Child offences were recorded last year (2022/23).

The new research shows that in Scotland, 1,873 offences took place against primary school children, with under-13s making up more than half of victims.

The new analysis of the scale of child sexual abuse taking place on social media comes ahead of MPs and Lords making final decisions on the Online Safety Bill next month.

The NSPCC first called for social media regulation to protect children from sexual abuse in 2017 and has been campaigning for robust legislation ever since.

The charity said the number of offences is likely to be far higher than those known to police. In response, they are urging politicians on all sides to support the Bill in its final stages and pass this vital legislation.

Aoife (19) from East Kilbride, South Lanarkshire, was exploited online when she was 15 by an adult male who pretended to be a teenager.

The man convinced her to send him images of herself and blackmailed her with these to control her behaviour. When his demands became increasingly intense and frightening, Aoife plucked up the courage to tell her mum and teachers, who helped them to report it to the police.

Aoife said: “When I found out I’d been talking to an older man I was petrified. I remember it was 3am and I was sitting in my room, just shaking. I felt like I was the only person in the world and started crying.

“I wanted my mum, and while she was just in the room next door I thought I couldn’t tell her because it’s so embarrassing, but all I wanted was a hug from her.”

A draft Online Safety Bill was published over two years ago but regulation was first promised by Government in 2018 following the NSPCC’s call for action and the launch of its Wild West Web campaign.

The charity has been campaigning for strong legislation ever since, working closely with survivors, Government, Parliamentarians, and other civil society groups to ensure it effectively tackles the way social media and gaming sites contribute to child sexual abuse.

The legislation will mean tech companies have a legal duty of care for young users and must assess their products for child abuse risks and put mitigations in place to protect children.

It will give the regulator Ofcom powers to address significant abuse taking place in private messaging and require companies to put safeguards in place to identify and disrupt abuse in end-to-end encrypted environments.

The NSPCC said these measures are vital to effectively protect children from the most insidious abuse and recent polling shows they are backed by more than seven in ten voters.

Sir Peter Wanless, NSPCC Chief Executive said: “Today’s research highlights the sheer scale of child abuse happening on social media and the human cost of fundamentally unsafe products.

“The number of offences must serve as a reminder of why the Online Safety Bill is so important and why the ground-breaking protections it will give children are desperately needed.

“We’re pleased the Government has listened and strengthened the legislation so companies must tackle how their sites contribute to child sexual abuse in a tough but proportionate way, including in private messaging.

“It’s now up to tech firms, including those highlighted by these stark figures today, to make sure their current sites and future services do not put children at unacceptable risk of abuse.”

As well as winning the commitment to legislate, the NSPCC has helped shape significant gains for children in the Online Safety Bill as it has passed through Parliament, including:

  • Senior tech bosses will be held criminally liable for significant failures that put children at risk of sexual abuse and other harm.
  • Girls will be given specific protections as Ofcom will produce guidance on tackling Violence Against Women and Girls for companies to follow.
  • Companies will have to crack down on so-called tribute pages and ‘breadcrumbing’, where offenders use legal but often stolen images of children and child accounts to form networks that facilitate child sexual abuse.
  • Sites will have to consider how grooming pathways travel across various social media apps and games and work together to prevent abuse spreading across different platforms.

The NSPCC is still seeking assurances that the legislation will effectively regulate AI and immersive technology and wants an online child safety advocacy body specifically to speak with and for children as part of the day-to-day regulatory regime. They argue that this will help spot emerging risks and fight for the interests and safety of children before tragedies arise.

The charity are asking campaigners to reach out to MPs with personal messages about why they should act to make the online world safer for children and pass a robust Online Safety Bill in the coming weeks.

Trolls who encourage serious self-harm to face jail

New offence for encouraging serious self-harm with perpetrators facing 5 years behind bars

  • offence to apply regardless of whether target goes on to cause serious self-harm
  • move will protect vulnerable while not criminalising those who share their recovery journey

Vile trolls who hide behind the anonymity of the internet to encourage others to cause themselves serious harm will face prosecution as part of an overhaul of online safety laws announced today (18 May 2023).

Additions to the Online Safety Bill will make it a crime to encourage someone to cause serious self-harm, regardless of whether or not victims go on to injure themselves, and those convicted will face up to 5 years in prison. The new offence will add to existing laws which make it illegal to encourage or assist someone to take their own life.

Police or prosecutors will only have to prove communication was intended to encourage or assist serious self-harm amounting to grievous bodily harm (GBH) – this could include serious injuries such as broken bones or permanent physical scarring.

The offence will apply even where the perpetrator does not know the person they are targeting – putting an end to abhorrent trolling that risks serious self-harm or life-changing injuries.

Encouraging someone to starve themselves or not take prescribed medication will also be covered.

Research from the Mental Health Foundation shows that more than a quarter of women aged 16 to 24 have reported self-harm at some point in their life, and since 1993 levels of self-harm among women have tripled. Today’s announcement is the latest step in our work to provide greater protections for women and girls, who are more likely to self-harm.

Research also shows more than two-thirds of UK adults are concerned about seeing content that promotes or advocates self-harm while online.

Lord Chancellor and Justice Secretary, Alex Chalk KC, said: “There is no place in our society for those who set out to deliberately encourage the serious self-harm of others. Our new law will send a clear message to these cowardly trolls that their behaviour is not acceptable.

“Building on the existing measures in the Online Safety Bill our changes will make it easier to convict these vile individuals and make the internet a better and safer place for everyone.”

The new offence will be created following a recommendation from the Law Commission in 2021 and balances the need to protect vulnerable people while not criminalising those who document their own self-harm as part of their recovery journey.

Justice Minister, Edward Argar MP, said: “No parent should ever worry about their children seeing content online or elsewhere encouraging them to hurt themselves. Our reforms will punish those who encourage vulnerable people to inflict serious injuries on themselves and make sure they face the prospect of time behind bars.”

This new offence builds on measures already in the Online Safety Bill, which will better regulate social media and ensure that social media companies like TikTok, Snapchat, Facebook, Instagram and others are held legally responsible for the content on their sites.

Over 3,000 child abuse crimes were recorded by Police Scotland in 5 years

  • NSPCC urges UK Government to seize last opportunity to strengthen Online Safety Bill so it creates online spaces for children safe from pervasive abuse

More than 3,100 child abuse image offences were recorded by Police Scotland in just five years, the NSPCC reveals as it calls for a more robust Online Safety Bill.

Last year, 662 crimes including the sharing and possession of indecent images of children were recorded by Police Scotland.

The NSPCC warns that unregulated social media is fuelling online child sexual abuse and that behind every offence could be multiple child victims who are continually revictimised as images are shared.

They said the issue of young people being groomed into sharing images of their own abuse is pervasive and tech bosses are failing to stop their sites being used by offenders to organise, commit and share child sexual abuse.

The charity is calling on the UK Government to give children, including victims of sexual abuse, a powerful voice and expert representation in future regulation by creating a statutory child safety advocate through the Online Safety Bill.

This would ensure that children’s experiences are front and centre of decision making, building safeguarding experience into regulation to prioritise child protection. 

NSPCC analysis of data obtained by FOI from England and Wales police forces shows Snapchat is the social media site offenders most used to share child abuse images where platform data was provided. The app, popular with teens, was used in 43% of instances. Facebook, Instagram and WhatsApp, which are all owned by Meta, were used in a third (33%) of instances where a site was flagged.

And for the first time, virtual reality environments and Oculus headsets, used to explore the Metaverse, were found to be involved in recorded child sexual abuse image crimes.

The NSPCC said committing to a statutory child safety advocate is crucial to act as an early warning system to identify emerging child abuse risks and ensure they are on the radar of companies and the regulator Ofcom.

The advocate would reflect the experiences of young people and act as a statutory counterbalance to the power of the big tech lobby, helping to drive a corporate culture that focusses on preventing abuse.

Holly* called Childline in despair when she was 14. She said: “I am feeling sick with fear. I was talking with this guy online and trusted him. I sent him quite a lot of nude pictures of myself and now he is threatening to send them to my friends and family unless I send him more nudes or pay him.

“I reported it to Instagram, but they still haven’t got back. I don’t want to tell the police because my parents would then know what I did and would be so disappointed.”

Sir Peter Wanless, Chief Executive of the NSPCC, said: “These figures are alarming but reflect just the tip of the iceberg of what children are experiencing online.

“We hear from young people who feel powerless and let down as online sexual abuse risks becoming normalised for a generation of children.

“By creating a child safety advocate that stands up for children and families the UK Government can ensure the Online Safety Bill systemically prevents abuse.

“It would be inexcusable if in five years’ time we are still playing catch-up to pervasive abuse that has been allowed to proliferate on social media.”

Online Safety Bill amendments

The NSPCC is seeking amendments to the Online Safety Bill as it passes through the House of Lords to improve its response to child sexual abuse.

They are asking Lords to back the creation of a child safety advocate which would mirror statutory user advocacy arrangements that are effective across other regulated sectors.

The amendment would give Ofcom access to children’s voices and experiences in real time via an expert child safety advocate, akin to Citizens Advice acting for energy and postal consumers.

And after the UK Government committed to holding senior managers liable if their products contribute to serious harm to children, the charity says this must also include cases where sites put children at risk of sexual abuse.

The move would mean bosses responsible for child safety would be held criminally liable if their sites continue to expose children to preventable abuse – which is backed by an overwhelming majority of the public.

Meta Encryption

In response to the latest data, the NSPCC also renewed calls on Meta to pause plans to roll out default end-to-end encryption of Facebook and Instagram messenger services in order to comply with future requirements of the Online Safety Bill.

They said Meta would be turning a blind eye to child abuse by making it impossible to identify grooming and the sharing of images, making the role of external bodies such as a child safety advocate even more important.

However, the charity said the Online Safety Bill should be seen as an opportunity to incentivise companies to invest in technological solutions to end-to-end encryption that protect adult privacy, the privacy of sexual abuse victims and keep children safe.

NSPCC: Majority of Scots want tougher Online Safety Bill that holds tech bosses responsible for child safety

  • Survey shows public backing for senior tech managers to be held legally responsible for safety and liable if products cause serious harm to children
  • MPs, bereaved parents, and 2,192 campaigners in Scotland back calls to strengthen Online Safety Bill’s response to protecting children on social media
  • NSPCC estimates over 21,000 online child sexual offences recorded by police since legislation was delayed last summer

More than four out of five (84%) adults in Scotland want senior tech managers to be appointed and held legally responsible for stopping children being harmed by social media, according to new polling of UK adults, of whom 200 live in Scotland.

The survey by YouGov also found that 72% of those with an opinion in Scotland would want senior managers prosecuted for failures that resulted in serious harm to children.

The NSPCC, which commissioned the research, said the findings show overwhelming public support for tougher enforcement measures in the UK Government’s Online Safety Bill.

Currently, the legislation would only hold tech bosses responsible for failing to give information to the regulator Ofcom, and not for corporate decisions that result in preventable harm or sexual abuse.

The move is being supported by Ruth Moss, whose 13-year-old daughter Sophie died by suicide after viewing suicide and self-harm posts and being groomed on social media.

The Edinburgh nurse has been campaigning with the NSPCC for several years for robust new legislation that would force tech bosses to make their sites safe for children.

Ruth Moss said: “As far as I’m concerned, where companies wilfully break the law and put the lives of children like my daughter at risk, of course senior managers should be criminally accountable. The consequences of non-compliance are life-changing for children like Sophie.

“Criminal liability drives the right behaviours in those with the most responsibility. It works in other industries and there is no reason in my mind as to why big tech executives should be treated any differently.”

The Online Safety Bill has been subject to delays amid intense scrutiny in recent months as the UK Government amended elements relating to adult safety.

The Culture Secretary Michelle Donelan has repeatedly said protections for children would be strengthened and campaigners argue holding tech bosses liable for the safety of young users would send a signal of intent to Big Tech.

2,192 people in Scotland signed an open letter to Ms Donelan calling for the legislation to properly hold senior managers to account for the safety of sites children use.

Rachel Talbot, 15, from Angus in Scotland, who handed the letter in to the Culture Secretary with other members of the NSPCC’s Young People’s Board for Change, said: “Far too much pressure is put on young people from such a young age to keep themselves safe online.

“Too many children are exposed to content promoting self-harm and eating disorders. It’s become a norm in our everyday lives.

“We need a Bill that is going to hold big tech firms accountable. Without it, young people are on their own. We’ve been on our own for so long online – and it’s not working.”

Some Conservative MPs are also calling on the Government to amend the Bill to hold senior managers liable for children’s safety when it returns to UK Parliament this month (January 16th).

Senior MPs including former Home Secretary Priti Patel, Sir William Cash and Miriam Cates are backing the amendment which would mean tech bosses would finally be held to account if their platforms contributed to the serious harm, abuse, or death of a child.

Campaigners say the UK risks being out of step, as Irish laws passed last month will hold senior tech bosses liable for online safety failings.

But they argued that making the suggested changes would cement the UK as a global authority for children’s safety online.

Miriam Cates MP said: “It’s clear to most people that the big global tech companies are not going to wake up one day and suddenly decide to start protecting children from harmful online content.

“We have seen repeated failures of Big Tech to protect children from the horrors of sexual exploitation, pornography and content that draws them into self-harm and suicide, and sadly the Online Safety Bill as it stands will not stop this.

“The only way to secure the change we desperately need is to make senior directors personally responsible for failures to protect children and that’s why I urge all MPs to support this amendment to include senior manager liability in the Online Safety Bill.”

The amendment has cross-party support including from the Labour frontbench.

Shadow Culture Secretary Lucy Powell MP said: “Labour has long called for the Online Safety Bill to be strengthened, especially when it comes to the liability – including criminal liability – of social media bosses. Without these sanctions there’s a real risk that a UK regulator will be toothless.

“Yet instead of strengthening the laws, the Government has recently gutted and watered down the bill, letting social media companies off the hook and allowing harms, abuse and hate to continue.

“I welcome the campaigning work of the NSPCC to toughen this Bill.”

The NSPCC said senior managers must also be liable for failing to prevent child sexual abuse, which is taking place at a record scale online.

The charity estimates that 600 online child sexual abuse crimes will have been recorded by Police Scotland between the legislation’s delay in July and its expected return to Parliament on January 16th.

Sir Peter Wanless, NSPCC Chief Executive, said: “2022 was the year the Online Safety Bill faced delay after delay while children faced sexual abuse on an industrial scale and tech bosses sat on their hands as their algorithms continued to bombard young users with hugely dangerous material.

“This year must be the year legislation delivers the systemic change for children online that our polling shows families up and down the UK want.

“The Government can do this by delivering bold, world-leading regulation that ensures the buck stops with senior management for the safety of our children.”

New laws to better protect victims from abuse of intimate images

Victims will be better protected from abusers who share intimate images without their consent, under a raft of changes to the law announced today (25 November 2022).

  • new offences to be created in crackdown on abusers who share intimate images without consent
  • changes will strengthen law and deliver on Prime Minister’s pledge to outlaw ‘downblousing’
  • comprehensive package of measures to modernise legislation following Law Commission review

Under a planned amendment to the Online Safety Bill, people who share so-called ‘deepfakes’ – explicit images or videos which have been manipulated to look like someone without their consent – will be among those to be specifically criminalised for the first time and face potential time behind bars.

The Westminster government will also bring forward a package of additional laws to tackle a range of abusive behaviour including the installation of equipment, such as hidden cameras, to take or record images of someone without their consent.

These will cover so-called ‘downblousing’ – where photos are taken down a woman’s top without consent – allowing police and prosecutors to pursue such cases more effectively.

This will deliver on Prime Minister Rishi Sunak’s pledge to criminalise the practice, in line with previous measures this government has taken to outlaw ‘upskirting’.

Deputy Prime Minister and Secretary of State for Justice, Dominic Raab, said: “We must do more to protect women and girls from people who take or manipulate intimate photos in order to hound or humiliate them.

“Our changes will give police and prosecutors the powers they need to bring these cowards to justice and safeguard women and girls from such vile abuse.”

Today’s announcement builds on the campaign of Dame Maria Miller MP, as well as recommendations from the Law Commission, to introduce reforms to the laws covering the abuse of images.

The amendment to the Online Safety Bill will broaden the scope of current intimate image offences, so that more perpetrators will face prosecution and potentially time in jail.

The Domestic Abuse Commissioner, Nicole Jacobs, said: “I welcome these moves by the government which aim to make victims and survivors safer online, on the streets and in their own homes.

“I am pleased to see this commitment in the Online Safety Bill, and hope to see it continue its progression through Parliament at the earliest opportunity.”

Around 1 in 14 adults in England and Wales have experienced a threat to share intimate images, with more than 28,000 reports of disclosing private sexual images without consent recorded by police between April 2015 and December 2021.

The package of reforms follows growing global concerns around the abuse of new technology, including the increased prevalence of deepfakes. These typically involve the use of editing software to make and share fake images or videos of a person without their consent, which are often pornographic in nature. A website that virtually strips women naked received 38 million hits in the first 8 months of 2021.

The government will take forward several of the Law Commission’s recommendations to ensure legislation keeps pace with technology and can effectively tackle emerging forms of abuse.

This includes:

  • Repealing and replacing current legislation with new offences to simplify the law and make it easier to prosecute cases. This includes a new base offence of sharing an intimate image without consent, and 2 more serious offences based on intent to cause humiliation, alarm or distress, or to obtain sexual gratification.
  • Creation of 2 specific offences for threatening to share and installing equipment to enable images to be taken.
  • Criminalising the non-consensual sharing of manufactured intimate images (more commonly known as deepfakes).

The move builds on government action in recent years to better protect victims and bring more offenders to justice, including making ‘upskirting’ and ‘breastfeeding voyeurism’ specific criminal offences, extending ‘revenge porn’ laws to capture threats to share such images, and using the Online Safety Bill to create an offence specifically targeting ‘cyberflashing’.

Ruth Davison, CEO of Refuge, said: “Refuge welcomes these reforms and is pleased to see progress in tackling abuse perpetrated via technology. As the only frontline service with a specialist tech abuse team, Refuge is uniquely placed to support survivors who experience this form of abuse.

“We campaigned successfully for threatening to share intimate images with intent to cause distress to be made a crime, via the Domestic Abuse Act, and these reforms will further ensure police and law enforcement agencies rightly investigate and prosecute these serious offences.

“Tech abuse can take many forms, and Refuge hopes that these changes will signal the start of a much broader conversation on the need for strengthening the response to online abuse and harm.”

DCMS Secretary of State Michelle Donelan said: “Through the Online Safety Bill, I am ensuring that tech firms will have to stop illegal content and protect children on their platforms, but we will also upgrade criminal law to prevent appalling offences like cyberflashing.

“With these latest additions to the Bill, our laws will go even further to shield women and children, who are disproportionately affected, from this horrendous abuse once and for all.

“The government will bring forward the wider package of changes as soon as parliamentary time allows and will announce further details in due course.”

Keeping all devices in one room can protect your child online

Children have more screen time than ever before and, in particular, more access to the internet.

Internet safety has become an increasingly worrying problem for parents. However, internet expert Allison Troutner from VPNOverview.com has listed the best ways to keep your child safe online.

1. Consider a family ‘tech agreement’

One way to set ground rules with your child is to create a Family Tech Agreement. A family tech agreement answers as many questions as possible about internet and device use so boundaries are clear to all family members. It’s a good way for the whole family to talk about safe and responsible online behaviours.

To create a family agreement, discuss topics like:

  • What apps, games, or sites does the family use most?
  • What rules do we want to include in our agreement?
  • How long should we spend on our devices?
  • What information is safe to share (or not)?
  • What do we do if we see something inappropriate?
  • What email address do we use to sign up for accounts?
  • Do we know how to use in-app safety features like blocking and reporting?
  • Who can we talk to if we feel uncomfortable with something online?
  • Who is safe to talk to?
  • What happens when someone breaks the agreement?
  • When might parents be forced to break the agreement for safety?

This is a starting point: your family may discuss more topics on internet safety for kids depending on the ages of your children or teens and what devices you use.

2. Report any harmful content that you see

Flag or report any harmful content or contact that you or your child experiences on social media apps, using in-app reporting features. For cybercrimes, cyberbullying, or harmful content, use in-app features such as Twitter’s Safety Mode to report it. Most social media companies have their own safety and privacy policies and will investigate and block content or users. Apps geared towards kids, like Facebook’s Messenger Kids, have clear guidelines and safety features so that users can block content or contacts and have a safer experience in the app.

3. Balance safety with independence

Technical controls can be a useful way to protect your children online, but they can’t solve all your problems. Children need a certain amount of freedom and privacy to develop healthily, and their own free space to learn by trial and error what works and what doesn’t. Striking this balance is an ongoing process, and having open and honest conversations with your children can be the best way to achieve it.

4. Keep the computer in a common space

If possible, keep computers and devices in a common space so you can keep an eye on activity. It prevents children from doing things that might be risky. Also, if harmful or inappropriate content appears through messages, you can address it with your child straight away.

5. Password-protect all accounts and devices

From phones to computers to apps, put a password on it. That way, no one without the password can access your or your child’s device. Keep track of passwords by using a password manager.
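For technically minded parents, the core job a password manager does – generating a long, unique, random password for every account – can be sketched in a few lines of Python. This is an illustration only (the function name and the 16-character default are our own choices), not a replacement for a dedicated password manager:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Return a random password drawn from letters, digits and punctuation.

    Uses the secrets module, which is designed for security-sensitive
    randomness, rather than the predictable random module.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
```

Each call produces a different, hard-to-guess password; a real password manager adds the crucial extra step of storing these in an encrypted vault so nobody has to memorise them.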

6. Update your operating systems regularly

All of your devices from mobile phones or tablets to computers and smartwatches receive important updates in response to security issues on a regular basis. Be sure to install them regularly so you have the most up-to-date security fixes and remain safe online. Our recommendation is to set updates to install automatically so your device is less vulnerable to known attacks. Usually, you can find this feature in Settings, then select Automatic Updates, but it varies between devices.

7. Install security or antivirus software programs and a VPN on your computer

Additionally, cybersecurity or antivirus software programs prevent spyware or viruses that may harm your computer if your child visits a malicious site. Using these programs, parents can also set up regular virus checks and deep system scans to make sure there is no harmful activity happening under their noses.

A VPN hides your internet activity from snoops and spoofs your location. This protects your kids by making sure hackers or predators can’t detect their actual location. You can install a VPN on your router so that the location is spoofed on all connected devices.

8. Set parental controls

It may seem obvious, but parental controls are crucial to your child’s safety online. Parental controls are built-in features of devices and apps. With these features, parents can customise their child’s online experience. What parental controls are available on each device or app varies, but in general they limit screen time, restrict content, and enhance user privacy.

Features of parental controls:

  • Limit screen time.
  • Turn off in-app purchasing.
  • Prevent inappropriate or mature content.
  • Limit website access.
  • Play, message, or send/receive content with approved contacts only.
  • Monitor device location through GPS.

Take time to look at what parental controls are available on your child’s commonly used apps. Then, set them to reflect the type of experience you think is best for your child or teen’s online safety.

IET raises concerns Online Safety Bill ‘does not go far enough’

A joint comment from Catherine Allen, co-author of the IET’s Safeguarding the metaverse report and member of the IET’s Digital Policy Panel, and child safety advocate and IET Honorary Fellow, Carol Vorderman M.A.(Cantab) MBE:

“Today’s harrowing verdict in the Molly Russell case has once again highlighted the urgent need for policy makers to take seriously emerging technologies that pose a serious safety risk to individuals, most notably children. It is vital that legislation within the new Online Safety Bill fully protects children from online harms, particularly unregulated content. It currently does not go far enough, and this is dangerous.

“We’ve already had a delay in legislation, and now it seems aspects of the Bill relating to children will remain untouched. The rapid speed at which online platforms evolve, such as experiential environments accessed via virtual and augmented reality, means new threats emerge daily. There is currently no provision within the Bill for safeguarding online users in ‘live’ scenarios where they can fully interact with strangers.

“Whether it is social media, a virtual reality headset or a metaverse gaming platform, politicians must avoid trivialising or feeling mystified by new technology platforms. Yes, there are complex factors to consider, like protecting our existing rights to freedom of expression, but that doesn’t mean we can delay addressing underlying problems.

“We must fully safeguard the metaverse, and protect individuals online.”

More than 100 online child abuse crimes in Scotland every month the Online Safety Bill is delayed, NSPCC warns

  • Charity urges next Prime Minister to keep the promise made to children and families and commit to passing Online Safety Bill as a national priority
  • NSPCC say children will carry the considerable cost of further delay to social media regulation

More than 100 online sex crimes will take place against children in Scotland every month the Online Safety Bill is delayed, NSPCC research indicates.

The charity’s analysis of Police Scotland crime data found that online child sexual abuse offences had more than doubled over the last decade.

The data shows 1,298 Indecent Image offences and crimes of Communicating Indecently with a Child were logged in the year to March – up from 543 just ten years ago.

The NSPCC said the growth in crimes and the scale of abuse taking place against children should serve as a wake-up call for the next UK Prime Minister to make the Online Safety Bill a national priority.

The charity said it underlines the urgent need for Liz Truss and Rishi Sunak to commit to passing the legislation in full and without delay.

It warned the disturbing reality of delay is more children being groomed on their smartphones and tablets, being contacted by offenders in the summer holidays, and coerced into acts of online sexual abuse in their bedrooms.

The landmark Online Safety Bill was due to pass through the House of Commons last week but was postponed until at least the autumn when a new Prime Minister will be in place.

The NSPCC first secured the commitment to regulate social media four years ago in a bid to combat the inaction of Silicon Valley to abuse taking place against children on their platforms.

The legislation would put a duty of care on companies for their users and mean they would have to put measures in place to prevent and disrupt child abuse on their sites and protect children from harm.

The charity is concerned the delay could result in the Bill being watered down despite years of failed self-regulation by tech firms putting children at increased risk.

Frida*, who is a survivor of online abuse, said: “The abuse that I experienced started ten years ago when I was 13. It is sickening that since then the number of young people being abused online has grown dramatically.

“Being groomed has had a horrific impact on my life and I want no other young person to endure that. I know this delay to the Online Safety Bill will see more young people like me experience harm when it could have been prevented, and that is devastating.”

The NSPCC has written to both Conservative leadership candidates saying, ‘delay or watering down of the Bill will come at considerable cost to children and families. It would represent the reversal of an important manifesto commitment that commands strong levels of public support’.

YouGov research for the NSPCC found more than four fifths of UK adults think the Online Safety Bill should deliver strong and comprehensive measures to protect children from online child sexual abuse.

NSPCC Chief Executive, Sir Peter Wanless, said: “With every second the clock ticks by on the Online Safety Bill, an ever-growing number of children and families face the unimaginable trauma of preventable child abuse.

“The need for legislation to protect children is clear, commands overwhelming support from MPs and the public and builds on the UK’s global leadership position in tackling harm online. Robust regulation can be delivered while protecting freedom of speech and privacy.

“There can be no more important mission for Government than to keep children safe from abuse and the next Prime Minister must keep the promise made to families in the election manifesto and deliver the Online Safety Bill as a national priority.”

New poll finds 7 in 10 adults want social media firms to do more to tackle harmful content

Ipsos study finds over 4 in 5 adults are concerned about harmful content online

  • 68 per cent want more action from social media firms on racism, homophobia and misogyny on their platforms
  • Comes as the Online Safety Bill moves to Report Stage in the House of Commons this week

A clear majority of the public want social media companies to do more to protect their users from harmful content, according to new research published today.

Polling by Ipsos shows over four in five (84 per cent) adults in the UK are concerned about seeing harmful content – such as racism, misogyny, homophobia and content that encourages self-harm – with two in five (38 per cent) reporting having seen it in the last month. This comes as the Online Safety Bill moves to Report Stage in Parliament this week.

The government-commissioned study found strong public support for the measures contained in the Bill. For instance, seven in ten adults (68 per cent) believe social media companies should do more to protect people online.

Four in five adults (78 per cent) want social media companies to be clear about what sort of content is and isn’t allowed on their platform.

In a stark warning to social media companies, 45 per cent of respondents also said they will leave or reduce the amount of time they spend on their platforms if they see no action.

Digital Secretary Nadine Dorries said: “Online abuse has a devastating impact on people’s lives, and these findings definitively show the public back our plans which will force social media companies to step up in keeping their users safe.

“It is clear people across the UK are worried about this issue, and as our landmark Online Safety Bill reaches the next crucial stage in Parliament we’re a big step closer to holding tech giants to account and making the internet safer for everyone in our country.”

The survey also found that women have high levels of concern about legal but harmful content, with 45 per cent feeling unsafe when talking to people on dating or messaging apps.

Most women (65 per cent) agree there should be limits to the types of content people can post online. Nearly half (47 per cent) of those living in households with at least one child report having seen abusive content in the last month.

The safety of women and girls across the country is a top priority. The measures we’re introducing through the Online Safety Bill will mean tech companies have to tackle illegal content and activity on their services, women will have more control over who can communicate with them and what kind of content they see on major platforms, and they will be better able to report abuse.

In addition, we are continuing to implement our Tackling Violence Against Women and Girls (VAWG) strategy to bring about real and lasting change offline as well as online.

The Online Safety Bill was introduced to Parliament in March and is a major milestone in the government’s mission to make the UK the safest place in the world to be online. The new laws will protect children, tackle illegal content and protect free speech, as well as requiring social media platforms to uphold their stated terms and conditions.

If they don’t, the regulator Ofcom will work with platforms to ensure they comply and will have the power to fine companies up to ten per cent of their annual global turnover – which could reach billions of pounds – to force them to fulfil their responsibilities or even block non-compliant sites.

When the Bill comes into force, firms will be required to identify and implement solutions to protect their users. Firms hosting content that is harmful to children, such as pornography, will have to prevent them from accessing it, for example by using age verification.

Social media platforms will also be required to safeguard people’s free speech, and their access to journalism and content that is democratically important. The poll follows the announcement of a series of amendments to the Bill last week to strengthen protections for freedom of speech, including tougher protections to guard against the arbitrary removal of articles from recognised news outlets shared on social media.

Last week the government published the list of legal but harmful content social media companies will need to address under the Online Safety Bill.

The categories consist of types of online abuse and harassment which can fall below the threshold of a criminal offence, but which still cause significant harm to adults online such as misogyny, homophobia and content that encourages self-harm.

This threshold is important to ensure that the online safety framework focuses on content and activity which poses the most significant risk of harm to UK users online. 

Free speech within the law can involve the expression of views that some may find offensive, but a line is crossed when disagreement mutates into abuse or harassment that refuses to tolerate other opinions and seeks to prevent others from exercising their free speech and freedom of association.

Online Safety Bill amendment: No hiding place for child sex offenders

Greater powers to tackle child sexual abuse online will be introduced through an amendment to the Online Safety Bill, the Home Secretary announced yesterday (Wednesday 6 July 2022).

The amendment will give Ofcom extra tools to ensure technology companies take action to prevent, identify and remove harmful child sexual abuse and exploitation (CSAE) content.

Ofcom, the UK’s communications regulator, will be able to demand that technology companies such as social media platforms roll out or develop new technologies to better detect and tackle harmful content on their platforms. If they fail to do so, Ofcom will be able to impose fines of up to £18 million or 10% of the company’s global annual turnover, whichever is higher.

Home Secretary, Priti Patel said: “Child sexual abuse is a sickening crime. We must all work to ensure criminals are not allowed to run rampant online and technology companies must play their part and take responsibility for keeping our children safe.

“Privacy and security are not mutually exclusive – we need both, and we can have both and that is what this amendment delivers.”

The National Crime Agency estimates there are between 550,000 and 850,000 people in the UK who pose a sexual risk to children. In the year to 2021, there were 33,974 obscene publications offences recorded by the police, and although some improvements have been made, it is still too easy for offenders to access harmful content online.

Access to such content online can lead to offenders normalising their own consumption of this content, sharing methods with each other on how to evade detection, and escalation to committing contact child sexual abuse offences.

Digital Minister, Nadine Dorries said: “Tech firms have a responsibility not to provide safe spaces for horrendous images of child abuse to be shared online. Nor should they blind themselves to these awful crimes happening on their sites.”

Rob Jones, NCA Director General for child sexual abuse, said: “Technology plays an extremely important part in our daily lives and its benefits are undeniable.

“But it is also a fact that online platforms can be a key tool in a child abuser’s arsenal. They use them to view and share abuse material, seek out and groom potential victims, and to discuss their offending with each other.

“Identifying these individuals online is crucial to us uncovering the real-world abuse of children.

“We are taking significant action in this space and, alongside UK policing, we are making record numbers of arrests and safeguarding more children every month.

“While this will always be a priority, we need tech companies to be there on the front line with us and these new measures will ensure that.”

Sir Peter Wanless, NSPCC Chief Executive, said: “We need urgent action to protect children from preventable online abuse. Our latest analysis shows online grooming crimes have jumped by more than 80% in four years.

“The Online Safety Bill is a once-in-a-generation opportunity to ensure children can explore the online world safely.

“This amendment will strengthen protections around private messaging and ensure companies have a responsibility to build products with child safety in mind. This positive step shows there doesn’t have to be a trade-off between privacy and detecting and disrupting child abuse material and grooming.”

The amendment will support innovation and the development of safety technologies across the technology industry, and will incentivise companies to build solutions to tackle CSAE which are effective and proportionate.

The government-funded Safety Tech Challenge Fund is demonstrating that it is possible to detect child sexual abuse material in end-to-end encrypted environments, while respecting user privacy.

You can also read the Home Secretary’s op-ed for The Telegraph.