Secretary of State Liz Kendall’s statement after concerns over Grok AI

STATEMENT TO PARLIAMENT – 12 JANUARY 2026

With permission Madam Deputy Speaker, I would like to make a statement on AI, social media and online safety.  

No woman or child should live in fear of having their image sexually manipulated by technology.  

Yet in recent days, the Grok AI tool on the social media platform X has been used to create and share degrading, non-consensual intimate deepfakes.     

The content which has circulated on X is vile. It is not just an affront to decent society – it is illegal.   

The Internet Watch Foundation (IWF) reports “criminal imagery” of children as young as 11, including girls sexualised and topless.  

This is Child Sexual Abuse.  

We’ve seen reports of photos being shared of women in bikinis, tied up and gagged, with bruises, covered in blood. And much, much more. 

Lives can and have been devastated by this content, which is designed to harass, torment, and violate people’s dignity.   

They are not harmless images – they are weapons of abuse, disproportionately aimed at women and girls.  

And they are illegal.  

Last week, X limited the image creation function to paid subscribers.  

This does not go anywhere near far enough.  

It is insulting to victims to say you can still have this service if you are willing to pay.

And it is monetising abuse.  

So let me be crystal clear: sharing, or threatening to share, a deepfake intimate image without consent – including images of people in their underwear – is a criminal offence.    

Under the Online Safety Act, sharing such intimate images – or threatening to share them – is a criminal offence. For individuals, and for platforms.  

My predecessor – the Right Honourable Member for Hove and Portslade – made this a ‘priority offence’, so services have to take proactive action to stop this content from appearing in the first place.  

The Data Act, passed last year, made it a criminal offence to create – or request the creation of – non-consensual intimate images.  

And today, I can announce to the House that this offence will be brought into force this week and that I will make it a priority offence in the Online Safety Act too.  

This means individuals are committing a criminal offence if they create – or seek to create – such content – including on X – and anyone who does this should expect to face the full extent of the law.   

But the responsibilities do not just lie with individuals for their own behaviour.  

The platforms that host such material must be held accountable – including X.  

Madam Deputy Speaker, Ofcom this morning confirmed that they have opened a formal investigation into X and will assess their compliance with the Online Safety Act.     

The government expects Ofcom to set out a timeline for the investigation as soon as possible.  

The public – and most importantly, the victims of Grok’s activities – expect swift and decisive action. So this must not take months and months.  

But X doesn’t have to wait for the Ofcom investigation to conclude. They can choose to act sooner to ensure this abhorrent and illegal material cannot be shared on their platform.    

If they do not, Ofcom will have the backing of this government to use the full powers which Parliament has given them.  

And I would remind X – and all other platforms – that this includes the power to issue fines worth millions of pounds, or 10% of a company’s qualifying worldwide revenue.   

And in the most serious cases, Ofcom can apply for a court order to stop UK users accessing the site.  

Madam Deputy Speaker, this government will do everything in our power to keep women and especially children safe online.  

So I can today confirm that we will build on all the measures I have already outlined and legislate in the Crime and Policing Bill – which is currently going through Parliament – to criminalise nudification apps.  

This new criminal offence will make it illegal for companies to supply tools designed to create non-consensual intimate images, targeting the problem at its source.      

And in addition to all of these actions, we expect technology companies to introduce the steps recommended by Ofcom’s guidance on how to make platforms safer for women and girls without delay.  

And if they do not, I am prepared to go further.  

Because this government believes tackling violence against women and girls is as important online as it is in the real world.  

Madam Deputy Speaker, this is not – as some would claim – about restricting freedom of speech, something I and the whole government hold very dear.  

It is about tackling violence against women and girls.  

It’s about upholding basic British values of decency and respect, and ensuring the standards we expect offline are upheld online.  

And it is about exercising our sovereign power and responsibility to uphold the laws of the land.  

I hope this is a time when MPs on all sides of the House will stand up for British laws and British values and call out the platforms that allow explicit, degrading and illegal content.   

It is time to choose a side.  

If I may Madam Deputy Speaker, I would also like to address calls from MPs on all sides of this House for the government to end its participation on X.  

I understand why many colleagues have come to this conclusion when X seems so unwilling to clean up its act. The government will of course keep our participation under review.  

But our job is to protect women and girls from illegal and harmful content wherever it is found.  

It is also worth bearing in mind, with 19 million people on X in this country, and more than a quarter using it as their primary source of news, that our views – and often simply the facts – need to be heard.  

Madam Deputy Speaker, let me conclude by saying this.  

AI is a transformative technology which has the potential to bring about extraordinary and welcome change.  

Creating jobs and growth. Diagnosing and treating diseases. Helping children learn at school. Tackling climate change. And so much more besides.  

But in order to seize these opportunities, people must feel confident that they and their children are safe online and that AI is not used for destructive and abusive ends.  

Many tech companies want to and are acting responsibly. But when they do not, we must and we will act.  

Innovation should serve humanity; not degrade it.   

So we will leave no stone unturned in our determination to stamp out these demeaning, degrading and illegal images.   

If that means strengthening the existing laws, we are prepared to do so.   

Because this government stands on the side of decency.  

We stand on the side of the law.   

We stand for basic British values supported by the vast majority of people in this country.  

And I commend this statement to the House.

Ofcom launches investigation into X over Grok sexualised imagery

The UK’s independent online safety watchdog, Ofcom, has today opened a formal investigation into X under the UK’s Online Safety Act, to determine whether it has complied with its duties to protect people in the UK from content that is illegal in the UK.

Our initial assessment

There have been deeply concerning reports of the Grok AI chatbot account on X being used to create and share undressed images of people – which may amount to intimate image abuse or pornography – and sexualised images of children that may amount to child sexual abuse material (CSAM).

 As the UK’s independent online safety watchdog, we urgently made contact with X on Monday 5 January and set a firm deadline of Friday 9 January for it to explain what steps it has taken to comply with its duties to protect its users in the UK.

The company responded by the deadline, and we carried out an expedited assessment of available evidence as a matter of urgency.

What our investigation will examine

Ofcom has decided to open a formal investigation to establish whether X has failed to comply with its legal obligations under the Online Safety Act – in particular, to: 

  • assess the risk of people in the UK seeing content that is illegal in the UK, and to carry out an updated risk assessment before making any significant changes to their service;
  • take appropriate steps to prevent people in the UK from seeing ‘priority’ illegal content – including non-consensual intimate images and CSAM;
  • take down illegal content swiftly when they become aware of it;
  • have regard to protecting users from a breach of privacy laws;
  • assess the risk their service poses to UK children, and to carry out an updated risk assessment before making any significant changes to their service; and
  • use highly effective age assurance to protect UK children from seeing pornography.

Ofcom’s role

The legal responsibility is on platforms to decide whether content breaks UK laws, and they can use our Illegal Content Judgements Guidance when making these decisions. Ofcom is not a censor – we do not tell platforms which specific posts or accounts to take down.

Our job is to judge whether sites and apps have taken appropriate steps to protect people in the UK from content that is illegal in the UK, and protect UK children from other content that is harmful to them, such as pornography.

Ofcom’s investigation process

The Online Safety Act sets out the process Ofcom must follow when investigating a company and deciding whether it has failed to comply with its legal obligations.

Our first step is to gather and analyse evidence to determine whether a breach has occurred. If, based on that evidence, we consider that a compliance failure has taken place, we will issue a provisional decision to the company, who will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.

Enforcement powers

If our investigation finds that a company has broken the law, we can require platforms to take specific steps to come into compliance or to remedy harm caused by the breach. We can also impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
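The “whichever is greater” rule means the fixed £18 million figure only applies to smaller companies; for large platforms the 10% revenue cap dominates. A minimal illustrative sketch of that calculation (the function name and example revenue figures are hypothetical, not Ofcom data):

```python
def max_penalty(qualifying_worldwide_revenue_gbp: float) -> float:
    """Illustrative ceiling on an Online Safety Act fine:
    the greater of a fixed £18 million or 10% of qualifying
    worldwide revenue."""
    FIXED_CAP_GBP = 18_000_000
    return max(FIXED_CAP_GBP, 0.10 * qualifying_worldwide_revenue_gbp)

# For a company with £50m qualifying revenue, the fixed £18m cap is larger:
print(max_penalty(50_000_000))     # → 18000000
# For £1bn qualifying revenue, the 10% cap is larger:
print(max_penalty(1_000_000_000))  # → 100000000.0
```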

In the most serious cases of ongoing non-compliance, we can make an application to a court for ‘business disruption measures’, through which a court could impose an order, on an interim or full basis, requiring payment providers or advertisers to withdraw their services from a platform, or requiring internet service providers to block access to a site in the UK. The court may only impose such orders where appropriate and proportionate to prevent significant harm to individuals in the UK.

UK jurisdiction

In any industry, companies that want to provide a service to people in the UK must comply with UK laws. The UK’s Online Safety Act is concerned with protecting people in the UK. It does not require platforms to restrict what people in other countries can see.

There are ways platforms can protect people in the UK without stopping their users elsewhere in the world from continuing to see that content.

An Ofcom spokesperson said: “Reports of Grok being used to create and share illegal non-consensual intimate images and child sexual abuse material on X have been deeply concerning.

“Platforms must protect people in the UK from content that’s illegal in the UK, and we won’t hesitate to investigate where we suspect companies are failing in their duties, especially where there’s a risk of harm to children.

“We’ll progress this investigation as a matter of the highest priority, while ensuring we follow due process. As the UK’s independent online safety enforcement agency, it’s important we make sure our investigations are legally robust and fairly decided.”

Ofcom will provide an update on this investigation as soon as possible.

Keeping children safe online: Changes to the Online Safety Act explained

How new laws that keep children safe on the internet work

Keeping children safe

The way children experience the internet has fundamentally changed, as new laws under the Online Safety Act have come into force to protect under-18s from harmful online content they shouldn’t ever be seeing. This includes content relating to:

  • pornography
  • self-harm
  • suicide
  • eating disorders

Ofcom figures show that children as young as 8 have accessed pornography online, while 16% of teenagers have seen material that stigmatises body types or promotes disordered eating in the last 4 weeks.   

To protect the next generation from the devastating impact of this content, people now have to prove their age to access pornography or this other harmful material on social media and other sites.    

Platforms are required to use secure methods like facial scans, photo ID and credit card checks to verify the age of their users. This means it will be much harder for under-18s to accidentally or intentionally access harmful content. 

It’s clear in Ofcom’s codes that we expect platforms to ensure that strangers have no way of messaging children. This includes preventing children from receiving direct messages from strangers, and ensuring children are not recommended unknown accounts to connect with.  

Data privacy

While people might see more steps to prove their age when signing up or browsing age-restricted content, they won’t be compromising their privacy.    

The measures platforms have to put in place must confirm your age without collecting or storing personal data, unless absolutely necessary. For example, facial estimation tools can estimate your age from an image without saving that image or identifying who you are. Many third-party solutions have the ability to provide platforms with an answer to the question of whether a user is over 18, without sharing any additional data relating to the user’s identity. 
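The data-minimisation idea described above can be sketched as follows – a hypothetical checker (all names here are illustrative, not a real provider’s API) that derives an over-18 answer and hands the platform nothing else:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AgeCheckResult:
    """The only datum returned to the platform: an over-18 flag.
    No image, name or date of birth leaves the checker."""
    over_18: bool

def check_age(estimated_age: float, threshold: int = 18) -> AgeCheckResult:
    # A hypothetical third-party checker estimates age from an image,
    # discards the image, and returns only the boolean answer.
    return AgeCheckResult(over_18=estimated_age >= threshold)

print(check_age(24.3))  # → AgeCheckResult(over_18=True)
```

The design choice matters: because the result type holds only a boolean, the platform cannot accidentally log or store identity data it never received.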

 The government and the regulator, Ofcom, are clear that platforms must use safe, proportionate and secure methods, and any company that misuses personal data or doesn’t protect users could face heavy penalties.

Services must also comply with the UK’s data protection laws. The Information Commissioner’s Office (ICO) has set out the main data protection principles that services must take into account in the context of age assurance, including minimising personal data which is collected for these purposes.  

Virtual Private Networks

While Virtual Private Networks (VPNs) are legal in the UK, under this law platforms have a clear responsibility to prevent children from bypassing safety protections. This includes blocking content that promotes VPNs or other workarounds specifically aimed at young users.   

This means that where platforms deliberately target UK children and promote VPN use, they could face enforcement action, including significant financial penalties.  

The Age Verification Providers Association (AVPA) reports that there have been an additional 5 million age checks a day as UK-based internet users seek to access age-restricted sites.

Online Safety laws do not ban any legal adult content. Instead, the laws protect children from viewing material that causes real harm in the offline world, devastating young lives and families.    

Under the Act, platforms should not arbitrarily block or remove content and instead must take a risk-based, proportionate approach to child safety duties.

Protecting freedom of speech?

As well as legal duties to keep children safe, the very same law places clear and unequivocal duties on platforms to protect freedom of expression. Failure to meet either obligation can lead to severe penalties, including fines of up to 10% of global revenue or £18 million, whichever is greater.

The Act is not designed to censor political debate and does not require platforms to age-gate any content other than that which presents the most serious risks to children, such as pornography or suicide and self-harm content.

Technology Secretary Peter Kyle said: “This marks the most significant step forward in child safety since the internet was created.

“The reality is that most children aren’t actively seeking out harmful, dangerous, or pornographic content – unfortunately it finds them. That’s why we’ve taken decisive action.

“Age verification keeps children safe. Rather than looking for ways around it, let’s help make the internet a safer, more positive space for children – and a better experience for everyone. That’s something we should all aspire to.”

Support for the Online Safety Act

NSPCC Chief Executive, Chris Sherwood: “We regularly hear from children who have suffered sexual and emotional abuse online, or who have been exposed to harmful and dangerous content.

“These experiences can have devastating impacts both immediately and long into the future. While the Online Safety Act can’t erase this pain and anger, it can be a vehicle for significant and lasting change.

“Thanks to this piece of ground-breaking regulation, algorithms are now being redesigned. Age checks are now in place. Harmful material that promotes eating disorders and suicide should no longer proliferate on social media platforms.

“This will – without a doubt – create safer, more age-appropriate online experiences for young users across the UK.”

Barnardo’s CEO, Lynne Perry: “These new protections are an important stepping stone towards making sure that children are safer online.

“They must be robustly enforced.”

Internet Matters: “Today marks an important milestone for children’s online safety […] towards ensuring that online services are designed with children’s safety in mind – from limiting children’s exposure to harmful content to creating age-appropriate experiences. 

“This milestone matters because the risks children face online remain high. Our latest Internet Matters Pulse shows that 3 in 4 children aged 9-17 experience harm online, from exposure to violent content to unwanted contact from strangers.

“With the Codes now enforceable, Ofcom must hold platforms accountable for meeting their obligations under the law.”

Deepfakes: What you should know

What parents need to know about Deepfakes

Police Scotland in Edinburgh and The City of Edinburgh Council’s Christmas wish is to #KeepXmasSafe for young people while online, and to keep parents and carers better informed.

Chief Constable commends charity’s efforts to tackle teen porn use

The UK charity Naked Truth Project has launched new dates for a series of workshops for parents and carers to better understand pornography in a digital age, enabling them to talk to their children and young people about the subject.

Chief Constable Simon Bailey, National Police Chiefs’ Council (NPCC) lead on child protection, has highlighted the need for resources such as ‘The PG Workshops’, having recognised porn consumption as a leading factor in the growing number of cases of sexual violence and abuse amongst teens.

The PG Workshops are live-streamed online sessions and follow the launch of the ‘Everyone’s Invited’ website, which provides an anonymous forum for children and young people to share experiences of aggressive or abusive sexual behaviour.

Chief Constable Bailey has previously warned of the links between young people’s access to and consumption of pornography and the kind of sexually aggressive/abusive behaviour highlighted in many of the 15,000+ accounts documented on the website.

Chief Constable Simon Bailey says, “There is a real issue in children’s perception of healthy relationships, healthy sexual relationships, what is permissible and what is acceptable. Unfortunately, I think the ready and easy access to pornography is a driver to that.”

The Chief Constable acknowledges that organisations like Naked Truth Project are doing important work to combat the damaging impacts of porn and urges parents and schools to engage with the issue.

Chief Constable Bailey continued: “Parents have a responsibility to ensure that children, both sons and daughters, recognise and understand what good values are, what respect and trust and honesty are, and how to treat people.

“The difficult conversations around the likelihood of children viewing pornography need to be had, explaining that it is not a relationship, and schools should be reinforcing this point as well. This is why the kind of schools work and parent workshops that Naked Truth offers can be such a useful resource to this end.”

The workshops aim to provide insight into the pressures young people are facing, and practical tips for parents guiding them through those pressures.

Ian Henderson, Founder and CEO of the Naked Truth Project says: “We believe there is a growing need amongst parents, carers and teachers to talk about and tackle the issue of pornography, especially in light of recent revelations about the scale of sexual violence amongst young people, yet many feel overwhelmed and under-resourced to engage in this conversation.

“We hope our workshops will be an effective way of teaching parents and carers how to talk to their young people about the dangers of pornography, as well as offer some practical tips in setting up parental controls and safety features on devices.”

The swell of stories shared on ‘Everyone’s Invited’ has highlighted recent government research, such as the Equalities Office report, which acknowledges the links between teenage porn consumption and toxic relationships, harassment in schools and abusive behaviour.

Other national leaders, such as Baroness Benjamin, are also calling for the education of both pupils and parents around the damaging impact of porn on healthy relationships.

Ian Henderson continues: “As an organisation, whilst we recognise that porn use won’t lead to sexual harassment or violence for all individuals, and is certainly not the only contributing factor, it’s vital that we begin to recognise the part that it does play and call it out.

“Given that porn often contains high levels of blatant verbal and physical abuse, as well as the sexualisation of coercion, harassment and outright lack of consent, we must consider the impact this is having on us, and the importance it places on educating our young people well.”

The Naked Truth Project has already delivered these workshops to over 4,000 parents and carers, as well as presenting specialised school lessons in which 20,000 students have participated.

For more information, or to book a place on the workshops, please visit: 

www.thepgworkshop.com

Children see pornography as young as seven, new report finds

●   Research commissioned by the BBFC shows children and teens are stumbling across pornography from an early age

●   The majority of young people’s first time watching pornography was accidental, with over 60% of children aged 11-13 who had seen pornography saying their viewing of porn was unintentional

●   83% of parents agreed that age-verification controls should be in place for online pornography