Secretary of State Liz Kendall’s statement after concerns over Grok AI

STATEMENT TO PARLIAMENT – 12 JANUARY 2026

With permission, Madam Deputy Speaker, I would like to make a statement on AI, social media and online safety.  

No woman or child should live in fear of having their image sexually manipulated by technology.  

Yet in recent days, the Grok AI tool on the social media platform X has been used to create and share degrading, non-consensual intimate deepfakes.     

The content which has circulated on X is vile. It is not just an affront to decent society – it is illegal.   

The Internet Watch Foundation (IWF) reports “criminal imagery” of children as young as 11, including girls sexualised and topless.  

This is Child Sexual Abuse.  

We’ve seen reports of photos being shared of women in bikinis, tied up and gagged, with bruises, covered in blood. And much, much more. 

Lives can and have been devastated by this content, which is designed to harass, torment, and violate people’s dignity.   

They are not harmless images – they are weapons of abuse, disproportionately aimed at women and girls.  

And they are illegal.  

Last week, X limited the image creation function to paid subscribers.  

This does not go anywhere near far enough.  

It is insulting to victims to say you can still have this service if you are willing to pay.

And it is monetising abuse.  

So let me be crystal clear: sharing, or threatening to share, a deepfake intimate image without consent – including images of people in their underwear – is a criminal offence.    

Under the Online Safety Act, sharing such images – or threatening to share them – is a criminal offence. For individuals, and for platforms.  

My predecessor – the Right Honourable Member for Hove and Portslade – made this a ‘priority offence’, so services have to take proactive action to stop this content from appearing in the first place.  

The Data Act, passed last year, made it a criminal offence to create – or request the creation of – non-consensual intimate images.  

And today, I can announce to the House that this offence will be brought into force this week and that I will make it a priority offence in the Online Safety Act too.  

This means individuals are committing a criminal offence if they create – or seek to create – such content – including on X – and anyone who does this should expect to face the full extent of the law.   

But the responsibilities do not just lie with individuals for their own behaviour.  

The platforms that host such material must be held accountable – including X.  

Madam Deputy Speaker, Ofcom this morning confirmed that they have opened a formal investigation into X and will assess their compliance with the Online Safety Act.     

The government expects Ofcom to set out a timeline for the investigation as soon as possible.  

The public – and most importantly, the victims of Grok’s activities – expect swift and decisive action. So this must not take months and months.  

But X doesn’t have to wait for the Ofcom investigation to conclude. They can choose to act sooner to ensure this abhorrent and illegal material cannot be shared on their platform.    

If they do not, Ofcom will have the backing of this government to use the full powers which Parliament has given them.  

And I would remind X – and all other platforms – that this includes the power to issue fines of up to £18 million, or 10% of a company’s qualifying worldwide revenue, whichever is greater.   

And in the most serious cases, Ofcom can apply for a court order to stop UK users accessing the site.  

Madam Deputy Speaker, this government will do everything in our power to keep women and especially children safe online.  

So I can today confirm that we will build on all the measures I have already outlined and legislate in the Crime and Policing Bill – which is currently going through Parliament – to criminalise nudification apps.  

This new criminal offence will make it illegal for companies to supply tools designed to create non-consensual intimate images, targeting the problem at its source.      

And in addition to all of these actions, we expect technology companies to introduce the steps recommended by Ofcom’s guidance on how to make platforms safer for women and girls without delay.  

And if they do not, I am prepared to go further.  

Because this government believes tackling violence against women and girls is as important online as it is in the real world.  

Madam Deputy Speaker, this is not – as some would claim – about restricting freedom of speech, something I and the whole government hold very dear.  

It is about tackling violence against women and girls.  

It’s about upholding basic British values of decency and respect, and ensuring the standards we expect offline are upheld online.  

And it is about exercising our sovereign power and responsibility to uphold the laws of the land.  

I hope this is a time when MPs on all sides of the House will stand up for British laws and British values and call out the platforms that allow explicit, degrading and illegal content.   

It is time to choose a side.  

If I may Madam Deputy Speaker, I would also like to address calls from MPs on all sides of this House for the government to end its participation on X.  

I understand why many colleagues have come to this conclusion when X seems so unwilling to clean up its act. The government will of course keep our participation under review.  

But our job is to protect women and girls from illegal and harmful content wherever it is found.  

It is also worth bearing in mind, with 19 million people on X in this country, and more than a quarter using it as their primary source of news, that our views – and often simply the facts – need to be heard.  

Madam Deputy Speaker, let me conclude by saying this.  

AI is a transformative technology which has the potential to bring about extraordinary and welcome change.  

Creating jobs and growth. Diagnosing and treating diseases. Helping children learn at school. Tackling climate change. And so much more besides.  

But in order to seize these opportunities, people must feel confident that they and their children are safe online and that AI is not used for destructive and abusive ends.  

Many tech companies want to and are acting responsibly. But when they do not, we must and we will act.  

Innovation should serve humanity; not degrade it.   

So we will leave no stone unturned in our determination to stamp out these demeaning, degrading and illegal images.   

If that means strengthening the existing laws, we are prepared to do so.   

Because this government stands on the side of decency.  

We stand on the side of the law.   

We stand for basic British values supported by the vast majority of people in this country.  

And I commend this statement to the House.

NO EXCUSES: Statement on xAI’s Grok image generation and editing tool

Technology Secretary Liz Kendall calls for swift action after reports xAI’s Grok tool continues to allow generation of intimate deepfake images

The Technology Secretary has commented on the changes xAI has implemented to its chatbot overnight, and government action to stamp out this form of abuse.

Technology Secretary Liz Kendall said: “Sexually manipulating images of women and children is despicable and abhorrent. It is an insult and totally unacceptable for Grok to still allow this if you’re willing to pay for it. I expect Ofcom to use the full legal powers Parliament has given them.

“I – and, more importantly, the public – would expect to see Ofcom update on next steps in days, not weeks.

“I would remind xAI that the Online Safety Act includes the power to block services from being accessed in the UK if they refuse to comply with UK law. If Ofcom decide to use those powers, they will have our full support.

“We will be banning nudification apps in the Crime and Policing Bill which is in parliament now.

“We are in the coming weeks bringing into force powers to criminalise the creation of intimate images without consent.

“I expect all platforms to abide by Ofcom’s new Violence Against Women and Girls (VAWG) guidance and if they do not, I am prepared to go further.

“We are as determined to ensure women and girls are safe online as we are to ensure they are safe in the real world. No excuses.”

Keeping children safe online: Changes to the Online Safety Act explained

How new laws that keep children safe on the internet work

Keeping children safe

The way children experience the internet has fundamentally changed, as new laws under the Online Safety Act have come into force to protect under-18s from harmful online content they shouldn’t ever be seeing. This includes content relating to:

  • pornography
  • self-harm
  • suicide
  • eating disorders

Ofcom figures show that children as young as 8 have accessed pornography online, while 16% of teenagers have seen material that stigmatises body types or promotes disordered eating in the last 4 weeks.   

To protect the next generation from the devastating impact of this content, people now have to prove their age to access pornography or this other harmful material on social media and other sites.    

Platforms are required to use secure methods like facial scans, photo ID and credit card checks to verify the age of their users. This means it will be much harder for under-18s to accidentally or intentionally access harmful content. 

Ofcom’s codes are clear that platforms must ensure strangers have no way of messaging children. This includes preventing children from receiving direct messages from strangers and ensuring children are not recommended accounts to connect with.  

Data privacy

While people might see more steps to prove their age when signing up or browsing age-restricted content, they won’t be compromising their privacy.    

The measures platforms have to put in place must confirm your age without collecting or storing personal data, unless absolutely necessary. For example, facial estimation tools can estimate your age from an image without saving that image or identifying who you are. Many third-party solutions can simply confirm to a platform that a user is over 18, without sharing any additional data relating to the user’s identity. 

The government and the regulator, Ofcom, are clear that platforms must use safe, proportionate and secure methods, and any company that misuses personal data or doesn’t protect users could face heavy penalties.

Services must also comply with the UK’s data protection laws. The Information Commissioner’s Office (ICO) has set out the main data protection principles that services must take into account in the context of age assurance, including minimising personal data which is collected for these purposes.  

Virtual Private Networks

Virtual Private Networks (VPNs) are legal in the UK, but under the Act platforms have a clear responsibility to prevent children from bypassing safety protections. This includes blocking content that promotes VPNs or other workarounds specifically aimed at young users.   

This means that where platforms deliberately target UK children and promote VPN use, they could face enforcement action, including significant financial penalties.  

The Age Verification Providers Association (AVPA) reports an additional 5 million age checks a day as UK-based internet users seek to access age-restricted sites.

Online Safety laws do not ban any legal adult content. Instead, the laws protect children from viewing material that causes real harm in the offline world, devastating young lives and families.    

Under the Act, platforms should not arbitrarily block or remove content and instead must take a risk-based, proportionate approach to child safety duties.

Protecting freedom of speech?

As well as legal duties to keep children safe, the very same law places clear and unequivocal duties on platforms to protect freedom of expression. Failure to meet either obligation can lead to severe penalties, including fines of up to 10% of global revenue or £18 million, whichever is greater.

The Act is not designed to censor political debate and does not require platforms to age-gate any content other than that which presents the most serious risks to children, such as pornography or suicide and self-harm content.

Technology Secretary Peter Kyle said: “This marks the most significant step forward in child safety since the internet was created.

“The reality is that most children aren’t actively seeking out harmful, dangerous, or pornographic content – unfortunately it finds them. That’s why we’ve taken decisive action.

“Age verification keeps children safe. Rather than looking for ways around it, let’s help make the internet a safer, more positive space for children – and a better experience for everyone. That’s something we should all aspire to.”

Support for the Online Safety Act

NSPCC Chief Executive, Chris Sherwood: “We regularly hear from children who have suffered sexual and emotional abuse online, or who have been exposed to harmful and dangerous content.

“These experiences can have devastating impacts both immediately and long into the future. While the Online Safety Act can’t erase this pain and anger, it can be a vehicle for significant and lasting change.

“Thanks to this piece of ground-breaking regulation, algorithms are now being redesigned. Age checks are now in place. Harmful material that promotes eating disorders and suicide should no longer proliferate on social media platforms.

“This will – without a doubt – create safer, more age-appropriate online experiences for young users across the UK.”

Barnardo’s CEO, Lynne Perry: “These new protections are an important stepping stone towards making sure that children are safer online.

“They must be robustly enforced.”

Internet Matters: “Today marks an important milestone for children’s online safety […] towards ensuring that online services are designed with children’s safety in mind – from limiting children’s exposure to harmful content to creating age-appropriate experiences. 

“This milestone matters because the risks children face online remain high. Our latest Internet Matters Pulse shows that 3 in 4 children aged 9-17 experience harm online, from exposure to violent content to unwanted contact from strangers.

“With the Codes now enforceable, Ofcom must hold platforms accountable for meeting their obligations under the law.”

Mental Health Awareness Week: Call for urgent reform

This Mental Health Awareness Week, we’re calling on the UK government for urgent reform to protect young people’s mental health. 📢

Today, we’re heading to parliament to raise awareness of the benefits of positive communities – both online and offline – for our mental health.

We’ll also be shining a light on the dangers of digital spaces, and what we must do to make online communities safer – particularly for young people.

While there are many supportive and uplifting online communities, there are also harmful ones that promote hatred, self-harm, and dangerous misinformation.

The mental health impacts of these negative environments can be catastrophic. So it’s essential that the government takes action to make these spaces safer, while we also learn about how we can embrace the good, and avoid the bad.

Find out more: https://bit.ly/4da0Ggs

#MentalHealthAwarenessWeek

#ThisIsMyCommunity

Child sexual abuse image crimes at record high in Scotland last year

  • Child sexual abuse image offences recorded by Police Scotland increased by 15 per cent between April 2022 and March 2023
  • NSPCC wants robust implementation of the Online Safety Act with Ofcom encouraged to strengthen its approach to tackling child sexual abuse
  • Meta urged to pause rollout of end-to-end encryption until plans for Facebook and Instagram can be risk assessed under new online safety regulations

The number of child sexual abuse image offences recorded by Police Scotland was at a record high last year – up by 15 per cent from the previous year, data analysed by the NSPCC has revealed.

A total of 765 offences in which child abuse images were collected and distributed were logged in 2022/23, according to Police Scotland data [1].  

Since 2017/18, when the NSPCC first called for social media regulation, a total of 3,877 crimes have been recorded while children and families have waited for online safety laws.

The charity said the figures show the need for swift and ambitious action by tech companies to address what is currently happening on their platforms and for Ofcom to significantly strengthen its approach to tackling child sexual abuse through effective enforcement of the Online Safety Act.

The figures come as insight from Childline shows young people being targeted by adults to share child sexual abuse images via social media and the calculated use of end-to-end encrypted private messaging apps by adults to find and share child abuse images.

A 14-year-old girl told the NSPCC-run service: “One night I got chatting with this guy online who I’d never met and he made me feel so good about myself. He told me he was 15, even though deep down I didn’t believe him.

“I sent him a couple of semi-nudes on Snap(chat), but then instantly regretted it. I asked him to delete the pics, but he just kept on making me do stuff for him not to post them – like getting me to strip live on camera. I just want to block him, but if I block him he will just post the pictures.”

A 15-year-old boy told Childline: “A while ago I saw a video on YouTube about how a guy was busting paedophiles and creeps on the internet by pretending to be a kid, and I kind of wanted to do a similar thing.

“I looked around Instagram for the creepiest accounts about kids my age and younger. In the end, I came across this link on one of their stories. It’s a link to a WhatsApp group chat in which [child sexual abuse material] is sent daily! There are literally hundreds of members in this group chat and they’re always calling the kids ‘hot’ and just being disgusting.”

  1. Police Scotland recorded crime data on the Scottish Government website.
Police Force   2017/18   2018/19   2019/20   2020/21   2021/22   2022/23   Total
Scotland       658       554       584       660       662       765       3,877

Online Safety Act implementation

The NSPCC said that disrupting online child sexual abuse taking place at increasing levels will require regulated tech platforms to introduce systemic changes to their products to stop them being used to organise, commit, and share child abuse.

A consultation into Ofcom’s first codes for companies to adopt to disrupt child sexual abuse on their platforms closed last week.

The NSPCC want these measures introduced without delay but urged Ofcom to begin work on a second version of the codes that will require companies to go much further.

The charity said companies should be required to use technology that can help identify and tackle grooming, sextortion and new child abuse images.

They also want tougher measures for private messaging services to make child protection a priority, including in end-to-end encrypted environments.

The NSPCC warned that Meta’s roll-out of end-to-end encryption on Facebook and Instagram will prevent authorities from identifying offenders and safeguarding victims.

The charity wants plans paused until Meta can prove child safety will not be compromised and have urged parties to find a balance between the safety and privacy of all users, including children. The NSPCC said further rollout should be delayed until Ofcom can study Meta’s risk assessment as part of the new regulatory regime.

Sir Peter Wanless, NSPCC Chief Executive, said: “It’s alarming to see online child abuse continue to rise, especially when tech companies should be acting to make their sites safe by design ahead of incoming regulation.

“Behind these crimes are children who have been targeted by adults who are able to organise and share sexual abuse with other offenders seamlessly across social media and messaging apps.

“The Online Safety Act sets out robust measures to make children fundamentally safer on the sites and apps they use so they can enjoy the benefits of a healthy online experience.

“Ofcom has been quick off the blocks but must act with greater ambition to ensure companies prioritise child safety in the comprehensive way that is so desperately needed.”

Susie Hargreaves OBE, Chief Executive of the Internet Watch Foundation, the UK’s front line against child sexual abuse imagery online, said: “This is a truly disturbing picture, and a reflection of the growing scale of the availability, and demand, for images and videos of children suffering sexual abuse.

“The people viewing and sharing and distributing this material need to know it is not a victimless crime. They are real children, suffering real abuse and sexual torture, the effects of which can linger a lifetime.

“That more and more people are trying to share and spread this material shows we should all be doing everything we can to stop this, building more, and innovative solutions to keep children safe.

“The IWF is ready to support technology companies and Ofcom in implementing the Online Safety Act to help make the UK the safest place in the world to be online.”

Scots mum celebrates Online Safety Act becoming law

  • The Online Safety Act became law last week
  • Abuse survivors, young people and bereaved families – including Scottish mum Ruth Moss – joined the NSPCC Chief Executive and the head of Ofcom to mark a momentous achievement
  • NSPCC says children and young people must be central to implementation and calls for bold and ambitious regulation
  • Pictures available of installation outside Parliament thanking 147,000 campaigners who backed the legislation

Young people, abuse survivors and bereaved families celebrated the Online Safety Act becoming law at an NSPCC reception in Parliament yesterday.

After years of campaigning, legislation that will put a legal duty on tech companies to protect children from sexual abuse and harmful material on social media sites, gaming apps and messaging services was given Royal Assent on Thursday.

Ruth Moss, a member of the Bereaved Families for Online Safety – who were integral in achieving stronger protection for children in the legislation – joined fellow campaigners who have worked tirelessly for new laws that will protect children online, alongside politicians, civil society and regulators, to welcome the legislation.

Ruth’s daughter Sophie Parkinson tragically took her own life in March 2014 when she was only 13 years old after she was exposed to harmful content online.

Sophie was self-harming and viewing self-harm and suicide content from the age of 12. She had also had online relationships with older men and looked at violent pornography.

Ruth, a nurse from Dalkeith near Edinburgh, said: “For at least two years, we struggled to keep Sophie safe online.

“In spite of removing devices, restricting internet use, implementing parental controls and having conversations about internet safety, these were not enough to prevent her from being exposed to content that promoted self-harm, suicide and dark, graphic, harmful material. She managed to view violent pornography and have online conversations with adult male strangers.

“Complaining to internet and social media companies was either impossible or futile. As parents, it felt like one step forward and two steps back, especially when balancing her need to use the internet for school and controlling her use. We quickly realised that parents alone can’t control what their child sees on the internet.

“The impact of Sophie viewing this harmful material was a deterioration in her existing mental health struggles, with devastating consequences. We will never truly recover from her death, and it is rightly every parent’s worst nightmare.

“This legislation is a good first step. It sends a message to tech companies that safety should not be compromised for the sake of profit and that tech companies can’t deny responsibility for keeping their service users safe on their websites.

“In my opinion, the enforcement of the bill is key. This will be challenging. It will require Ofcom going up against some of the most powerful and influential organisations in the world. Ofcom will have a difficult job.”

Young people who campaigned with the NSPCC and a mum whose daughter was sexually abused on social media spoke at the event. They joined NSPCC Chief Executive Sir Peter Wanless and Ofcom Chief Executive Dame Melanie Dawes, who will be responsible for the Act’s implementation.  

The event was chaired by Sajid Javid MP who was Home Secretary when regulation to help protect children online was first promised by the Government in 2018, following the launch of the NSPCC’s Wild West Web campaign.

At the reception there was a clear focus on ensuring young people’s voices and experiences are central to the implementation of the Online Safety Act, so it results in meaningful change for children as soon as possible.

The event came as Ofcom prepares to set out the rules which tech companies will have to follow to tackle child sexual abuse and protect children from harmful material.

The codes of practice will be consulted on before being implemented, but those at the event were united in their expectation that tech companies should not wait to begin putting concrete measures in place to make their sites safe by design for children.

Sir Peter Wanless, NSPCC Chief Executive, said: “It was an honour to bring together so many people from different walks of life whose campaigning and dedication has helped make the Online Safety Act possible.

“I want to particularly thank everyone who has been impacted by online abuse and unimaginable harm who have campaigned selflessly to help protect others.

“While we rightly celebrated today, it was also a catalyst towards working together to ensure the legislation results in the protections online children desperately need.

“We look forward to seeing the landmark Act implemented with bold ambition to ensure there is a rigorous focus on children as regulation comes into force.”

Dame Melanie Dawes, Ofcom’s Chief Executive, said: “We’re grateful for all the hard work that went into getting these new laws onto the statute books. It means we now have regulation that will make a real difference in creating a safer life online for people – particularly children – in the UK. This is a big job, and we’re ready, but we won’t be doing it alone.

“Young people’s voices have shaped the foundations, and now we want to hear from them again to make sure we get the technical detail right. Next week, we’ll be consulting on the specific measures that tech firms can take to protect their users from illegal harms online, including child sexual abuse and grooming, and pro-suicide content.”

Technology Secretary Michelle Donelan said: “I am immensely proud of the work that has gone into the Online Safety Act from its very inception to it becoming law.

“At the heart of this Act is the protection of children. I would like to thank the campaigners, parliamentarians, survivors of abuse and charities including the NSPCC, that have worked tirelessly, not only to get this Act over the finishing line, but to ensure that it will make the UK the safest place to be online in the world.”

Many volunteers with lived experience of abuse who have campaigned with the NSPCC for robust legislation also joined the reception.

The NSPCC set up an installation outside the Houses of Parliament thanking the over 147,000 campaigners who backed the legislation. The charity has released a video with young people welcoming the Online Safety Act.

UK children and adults safer online as ‘world-leading’ bill becomes law

Online Safety Act receives Royal Assent putting rules to make the UK the safest place in the world to be online into law

  • Online Safety Act receives Royal Assent in the Houses of Parliament, putting rules to make the UK the safest place in the world to be online into law
  • the Act makes social media companies keep the internet safe for children and give adults more choice over what they see online
  • Ofcom will immediately begin work on tackling illegal content and protecting children’s safety

The Online Safety Act has today (Thursday 26 October) received Royal Assent, heralding a new era of internet safety and choice by placing world-first legal duties on social media platforms.

The new laws take a zero-tolerance approach to protecting children from online harm, while empowering adults with more choices over what they see online. This follows rigorous scrutiny and extensive debate within both the House of Commons and the House of Lords.

The Act places legal responsibility on tech companies to prevent and rapidly remove illegal content, like terrorism and revenge pornography. They will also have to stop children seeing material that is harmful to them such as bullying, content promoting self-harm and eating disorders, and pornography.

If they fail to comply with the rules, they will face significant fines that could reach billions of pounds, and if they don’t take steps required by Ofcom to protect children, their bosses could even face prison.

Technology Secretary Michelle Donelan said: “Today will go down as an historic moment that ensures the online safety of British society not only now, but for decades to come.

“I am immensely proud of the work that has gone into the Online Safety Act from its very inception to it becoming law today. The Bill protects free speech, empowers adults and will ensure that platforms remove illegal content.

“At the heart of this Bill, however, is the protection of children. I would like to thank the campaigners, parliamentarians, survivors of abuse and charities that have worked tirelessly, not only to get this Act over the finishing line, but to ensure that it will make the UK the safest place to be online in the world.”

The Act takes a zero-tolerance approach to protecting children by making sure the buck stops with social media platforms for content they host. It does this by making sure they:

  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content including pornographic content, content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, content depicting or encouraging serious violence or bullying content
  • enforce age limits and use age-checking measures on platforms where content harmful to children is published
  • ensure social media platforms are more transparent about the risks and dangers posed to children on their sites, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise

Home Secretary Suella Braverman said: “This landmark law sends a clear message to criminals – whether it’s on our streets, behind closed doors or in far flung corners of the internet, there will be no hiding place for their vile crimes.

“The Online Safety Act’s strongest protections are for children. Social media companies will be held to account for the appalling scale of child sexual abuse occurring on their platforms and our children will be safer.

“We are determined to combat the evil of child sexual exploitation wherever it is found, and this Act is a big step forward.”

Lord Chancellor and Secretary of State for Justice, Alex Chalk said: “No-one should be afraid of what they or their children might see online so our reforms will make the internet a safer place for everyone.

“Trolls who encourage serious self-harm, cyberflash or share intimate images without consent now face the very real prospect of time behind bars, helping protect women and girls who are disproportionately impacted by these cowardly crimes.”

In addition to protecting children, the Act also empowers adults to have better control of what they see online. It provides 3 layers of protection for internet users which will:

  1. make sure illegal content is removed
  2. enforce the promises social media platforms make to users when they sign up, through terms and conditions
  3. offer users the option to filter out content, such as online abuse, that they do not want to see

If social media platforms do not comply with these rules, Ofcom could fine them up to £18 million or 10% of their global annual revenue, whichever is greater – meaning fines handed down to the biggest platforms could reach billions of pounds.

The government also strengthened provisions to address violence against women and girls. Through the Act, it will be easier to convict someone who shares intimate images without consent and new laws will further criminalise the non-consensual sharing of intimate deepfakes.

The changes in the law also make it easier to charge abusers who share intimate images and to put more offenders behind bars. Criminals found guilty of the base offence will face up to 6 months in prison, but those who threaten to share such images, or share them with the intent to cause distress, alarm or humiliation, or to obtain sexual gratification, could face up to two years behind bars.

NSPCC Chief Executive, Sir Peter Wanless said: “Having an Online Safety Act on the statute book is a watershed moment and will mean that children up and down the UK are fundamentally safer in their everyday lives.

“Thanks to the incredible campaigning of abuse survivors and young people and the dedicated hard work of Parliamentarians and Ministers, tech companies will be legally compelled to protect children from sexual abuse and avoidable harm.

“The NSPCC will continue to ensure there is a rigorous focus on children by everyone involved in regulation. Companies should be acting now, because the ultimate penalties for failure will be eye-watering fines and, crucially, criminal sanctions.”

Dame Melanie Dawes, Ofcom Chief Executive, said: “These new laws give Ofcom the power to start making a difference in creating a safer life online for children and adults in the UK. We’ve already trained and hired expert teams with experience across the online sector, and today we’re setting out a clear timeline for holding tech firms to account.   

“Ofcom is not a censor, and our new powers are not about taking content down. Our job is to tackle the root causes of harm. We will set new standards online, making sure sites and apps are safer by design. Importantly, we’ll also take full account of people’s rights to privacy and freedom of expression.

“We know a safer life online cannot be achieved overnight; but Ofcom is ready to meet the scale and urgency of the challenge.”

In anticipation of the Bill coming into force, many social media companies have already started making changes. TikTok has implemented stronger age verification on its platform, while Snapchat has started removing the accounts of underage users.

While the Bill has travelled through Parliament, the government has worked closely with Ofcom to ensure protections will be implemented as quickly as possible once the Act received Royal Assent.

From today, Ofcom will immediately begin work on tackling illegal content, with a consultation process launching on 9th November 2023. They will then take a phased approach to bringing the Online Safety Act into force, prioritising enforcing rules against the most harmful content as soon as possible.

The majority of the Act’s provisions will commence in two months’ time. However, the government has commenced key provisions early to establish Ofcom as the online safety regulator from today and allow them to begin key preparatory work such as consulting as quickly as possible to implement protections for the country.

Rocio Concha, Which? Director of Policy and Advocacy, said: “Which? led the campaign for consumers to have stronger protections against scam adverts on social media platforms and search engines that can have devastating financial and emotional consequences for victims.

“These new Online Safety laws are a major step forward in the fight back against fraud by forcing tech firms to step up and take more responsibility for stopping people being targeted by fraudulent online adverts.

“Ofcom must now develop codes of practice that will hold platforms to a high standard and be prepared to take strong enforcement action, including fines, against firms if they break the law.”