NSPCC lays out six tests to create world-leading laws to protect children online

  • Charity at the forefront of the Online Harms Bill urges UK Government to deliver on Boris Johnson’s determination for ambitious regulation
  • NSPCC sets 11th hour demand for Government: ‘Pass our tests for online regulation so children don’t continue to suffer avoidable harm and abuse’
  • Online sex crimes recorded against children in Scotland surpass five a day, as Ian Russell backs calls for the Bill to also tackle suicide and self-harm posts

The NSPCC has laid out six tests the UK Government’s regulation of social media will be judged on if it is to achieve bold and lasting protections for children online.

The charity’s How to win the Wild West Web report, released today, sets out how the upcoming Online Harms Bill must set the global standard in protecting children on the web.

With crucial decisions just days away, the charity is urging the UK Government to ensure new laws level the playing field for children and finally force tech firms to tackle the avoidable harm caused by their sites.

The call comes after figures released by Police Scotland show the number of online sex crimes against children recorded by the force during lockdown (April – June) reached the equivalent of more than five a day – a 20% increase on the same quarter last year.

The pandemic is likely to result in long-term changes to the online child abuse threat, with high-risk livestreaming and video chat becoming more popular. Changes to working patterns, meaning more offenders working at home, could result in a greater demand for sexual abuse images and increased opportunities for grooming.

The NSPCC has routinely highlighted the growing levels of abuse and harm caused to children on social media platforms, and believes the problem has been exacerbated by the fallout from coronavirus.

At the UK Government’s Hidden Harms summit earlier this year, the Prime Minister signalled his personal determination to legislate for ambitious regulation that successfully combats child abuse.

But the NSPCC is worried the landmark opportunity to change the landscape for children online could be missed if the UK Government fails to translate this into law.

The charity has released its six tests ahead of a full consultation response to the White Paper, amid concerns Ministers are wavering in their ambitions for robust regulation.

Regulation must:

  1. Create an expansive, principles-based duty of care
  2. Comprehensively tackle online sexual abuse
  3. Put legal but harmful content on an equal footing with illegal material
  4. Have robust transparency and investigatory powers
  5. Hold industry to account with criminal and financial sanctions
  6. Give civil society a legal voice for children with user advocacy arrangements

The charity believes, if done correctly, regulation could set a British model that leads the world in child protection online.

But in a stark warning, NSPCC CEO Peter Wanless said: “Failing to pass any of the six tests will mean that rather than tech companies paying the cost of their inaction, future generations of children will pay with serious harm and sexual abuse that could have been stopped.

“Industry inaction is fuelling sex crimes against children and the fallout from coronavirus has heightened the risks of abuse now and in the future.

“The Prime Minister has the chance of a lifetime to change this by coming down on the side of children and families, with urgent regulation that is a bold and ambitious UK plan to truly change the landscape of online child protection.

“The Online Harms Bill must become a Government priority, with unwavering determination to take the opportunity to finally end the avoidable, serious harm children face online because of unaccountable tech firms.”

The six tests are backed by Ian Russell, who has campaigned for regulation since his daughter, Molly, took her own life after being targeted with self-harm posts on social media.

Mr Russell, who is due to be made an Honorary Member of Council for the NSPCC this week, said: “Today, I can’t help but wonder why it’s taking so long to introduce effective regulation to prevent the type of harmful social media posts we now know Molly saw, and liked, and saved in the months prior to her death.

“Tech self-regulation has failed and, as I know, it’s failed all too often at great personal cost. Now is the time to establish a regulator to protect those online by introducing proportionate legislation with effective financial and criminal sanctions.

“It is a necessary step forward in trying to reclaim the web for the good it can do and curtail the growing list of harms to be found online.”

The six tests the Government must pass if it is to create game-changing and lasting protections for children online are:

  • An expansive, principles-based duty of care; tech firms should have a legal responsibility to identify harms caused by their sites and deal with them, or face tough consequences for breaching regulation.
  • Tackling online sexual abuse; platforms must proactively and consistently tackle grooming and abuse images facilitated by dangerous design features. There must be no excuses. In the current state of play abuse images have been left online with the excuse that a child’s age cannot be proven, and images signposting abuse are not removed.
  • Tackling legal but harmful content; current Government proposals will see companies set their own rules on legal but harmful content. This is not good enough. The law must compel firms to respond to the harms caused by algorithms targeting damaging suicide and self-harm posts at children, and avoid a two-tier system that prioritises tackling illegal content. The danger of harmful content should rightly be balanced against freedom of expression, but the focus must be on the risk to children.
  • Transparency and investigation powers; tech firms currently only dish out information they want the public to see. The regulator must have the power to lift up the bonnet to investigate platforms and demand information from companies.
  • Criminal and financial sanctions; fines are vital but will be water off a duck’s back to some of the world’s wealthiest companies. Government can’t backslide on a named manager scheme that gives the regulator powers to prosecute rogue tech directors in UK law.
  • User advocacy arrangements; to level the playing field there must be a strong civil society voice for children against well-resourced industry pressure. Big tech should be made to clean up the damage it has caused by funding user advocacy arrangements.

The NSPCC has been the leading voice for social media regulation and the charity set out detailed proposals for an Online Harms Bill last year, which informed much of the White Paper.

The Government has said the consultation response will be published in the autumn, with legislation expected to be delivered in the new year.