First Minister calls for sustainability of STV regional news

Response to Ofcom consultation

The First Minister has expressed deep concerns over proposals to end the STV North tea-time news bulletin.

The Scottish Government’s response to Ofcom’s consultation on STV’s request to change its regional programming commitments strongly urges consideration of the long-term implications this would have on local public service broadcasting in Scotland.

There has been significant public and political pressure to reverse the plans to end the STV North news bulletin.

First Minister John Swinney said that removing public service obligations to deliver regional news would set a ‘damaging precedent’.

Mr Swinney commented: “The Scottish Government strongly believes that these proposals would not achieve the vital aim of ensuring audiences are well-served with high-quality news across Scotland.

“Regional news coverage and bulletins are essential for democratic accountability and local representation, for maintaining audience trust and engagement, and for supporting regional journalism and sustaining jobs.

“It is vital that high-quality, independent local bulletins are preserved, editorial centres outside Glasgow are maintained, and Scottish-based jobs and equitable news access across Scotland are safeguarded.

“We strongly urge Ofcom to consider the long-term implications for plurality, local democracy, and the health of Scotland’s media landscape before removing current public service obligations. Weakening these obligations would set a damaging precedent and accelerate the decline of public interest journalism in Scotland.” 

The Scottish Government’s response to Ofcom’s consultation can be found here: 

STV regional programming commitments: Ofcom consultation – gov.scot 

Secretary of State Liz Kendall’s statement after concerns over Grok AI

STATEMENT TO PARLIAMENT – 12 JANUARY 2026

With permission Madam Deputy Speaker, I would like to make a statement on AI, social media and online safety.  

No woman or child should live in fear of having their image sexually manipulated by technology.  

Yet in recent days, the Grok AI tool on the social media platform X has been used to create and share degrading, non-consensual intimate deepfakes.     

The content which has circulated on X is vile. It is not just an affront to decent society – it is illegal.   

The Internet Watch Foundation (IWF) reports “criminal imagery” of children as young as 11, including girls sexualised and topless.  

This is Child Sexual Abuse.  

We’ve seen reports of photos being shared of women in bikinis, tied up and gagged, with bruises, covered in blood. And much, much more. 

Lives can and have been devastated by this content, which is designed to harass, torment, and violate people’s dignity.   

They are not harmless images – they are weapons of abuse, disproportionately aimed at women and girls.  

And they are illegal.  

Last week, X limited the image creation function to paid subscribers.  

This does not go anywhere near far enough.  

It is insulting to victims to say you can still have this service if you are willing to pay.

And it is monetising abuse.  

So let me be crystal clear: sharing, or threatening to share, a deepfake intimate image without consent – including images of people in their underwear – is a criminal offence.    

Under the Online Safety Act, sharing images – or threatening to share them – is a criminal offence. For individuals, and for platforms.  

My predecessor – the Right Honourable Member for Hove and Portslade – made this a ‘priority offence’, so services have to take proactive action to stop this content from appearing in the first place.  

The Data Act, passed last year, made it a criminal offence to create – or request the creation of – non-consensual intimate images.  

And today, I can announce to the House that this offence will be brought into force this week and that I will make it a priority offence in the Online Safety Act too.  

This means individuals are committing a criminal offence if they create – or seek to create – such content – including on X – and anyone who does this should expect to face the full extent of the law.   

But the responsibilities do not just lie with individuals for their own behaviour.  

The platforms that host such material must be held accountable – including X.  

Madam Deputy Speaker, Ofcom this morning confirmed that they have opened a formal investigation into X and will assess their compliance with the Online Safety Act.     

The government expects Ofcom to set out a timeline for the investigation as soon as possible.  

The public – and most importantly, the victims of Grok’s activities – expect swift and decisive action. So this must not take months and months.  

But X doesn’t have to wait for the Ofcom investigation to conclude. They can choose to act sooner to ensure this abhorrent and illegal material cannot be shared on their platform.    

If they do not, Ofcom will have the backing of this government to use the full powers which Parliament has given them.  

And I would remind X – and all other platforms – that this includes the power to issue fines worth millions of pounds, or 10% of a company’s qualifying worldwide revenue.

And in the most serious cases, Ofcom can apply for a court order to stop UK users accessing the site.  

Madam Deputy Speaker, this government will do everything in our power to keep women and especially children safe online.  

So I can today confirm that we will build on all the measures I have already outlined and legislate in the Crime and Policing Bill – which is currently going through Parliament – to criminalise nudification apps.  

This new criminal offence will make it illegal for companies to supply tools designed to create non-consensual intimate images, targeting the problem at its source.      

And in addition to all of these actions, we expect technology companies to introduce the steps recommended by Ofcom’s guidance on how to make platforms safer for women and girls without delay.  

And if they do not, I am prepared to go further.  

Because this government believes tackling violence against women and girls is as important online as it is in the real world.  

Madam Deputy Speaker, this is not – as some would claim – about restricting freedom of speech, something I and the whole government hold very dear.  

It is about tackling violence against women and girls.  

It’s about upholding basic British values of decency and respect, and ensuring the standards we expect offline are upheld online.  

And it is about exercising our sovereign power and responsibility to uphold the laws of the land.  

I hope this is a time when MPs on all sides of the House will stand up for British laws and British values and call out the platforms that allow explicit, degrading and illegal content.   

It is time to choose a side.  

If I may Madam Deputy Speaker, I would also like to address calls from MPs on all sides of this House for the government to end its participation on X.  

I understand why many colleagues have come to this conclusion when X seems so unwilling to clean up its act. The government will of course keep our participation under review.  

But our job is to protect women and girls from illegal and harmful content wherever it is found.  

It is also worth bearing in mind, with 19 million people on X in this country, and more than a quarter using it as their primary source of news, that our views – and often simply the facts – need to be heard.  

Madam Deputy Speaker, let me conclude by saying this.  

AI is a transformative technology which has the potential to bring about extraordinary and welcome change.  

Creating jobs and growth. Diagnosing and treating diseases. Helping children learn at school. Tackling climate change. And so much more besides.  

But in order to seize these opportunities, people must feel confident that they and their children are safe online and that AI is not used for destructive and abusive ends.  

Many tech companies want to act responsibly – and are doing so. But when they do not, we must and we will act.

Innovation should serve humanity, not degrade it.

So we will leave no stone unturned in our determination to stamp out these demeaning, degrading and illegal images.   

If that means strengthening the existing laws, we are prepared to do so.   

Because this government stands on the side of decency.  

We stand on the side of the law.   

We stand for basic British values supported by the vast majority of people in this country.  

And I commend this statement to the House.

Ofcom launches investigation into X over Grok sexualised imagery

The UK’s independent online safety watchdog, Ofcom, has today opened a formal investigation into X under the UK’s Online Safety Act, to determine whether it has complied with its duties to protect people in the UK from content that is illegal in the UK.

Our initial assessment

There have been deeply concerning reports of the Grok AI chatbot account on X being used to create and share undressed images of people – which may amount to intimate image abuse or pornography – and sexualised images of children that may amount to child sexual abuse material (CSAM).

 As the UK’s independent online safety watchdog, we urgently made contact with X on Monday 5 January and set a firm deadline of Friday 9 January for it to explain what steps it has taken to comply with its duties to protect its users in the UK.

The company responded by the deadline, and we carried out an expedited assessment of available evidence as a matter of urgency.

What our investigation will examine

Ofcom has decided to open a formal investigation to establish whether X has failed to comply with its legal obligations under the Online Safety Act – in particular, to: 

  • assess the risk of people in the UK seeing content that is illegal in the UK, and to carry out an updated risk assessment before making any significant changes to their service;
  • take appropriate steps to prevent people in the UK from seeing ‘priority’ illegal content – including non-consensual intimate images and CSAM;
  • take down illegal content swiftly when they become aware of it;
  • have regard to protecting users from a breach of privacy laws;
  • assess the risk their service poses to UK children, and to carry out an updated risk assessment before making any significant changes to their service; and
  • use highly effective age assurance to protect UK children from seeing pornography.

Ofcom’s role

The legal responsibility is on platforms to decide whether content breaks UK laws, and they can use our Illegal Content Judgements Guidance when making these decisions. Ofcom is not a censor – we do not tell platforms which specific posts or accounts to take down.

Our job is to judge whether sites and apps have taken appropriate steps to protect people in the UK from content that is illegal in the UK, and protect UK children from other content that is harmful to them, such as pornography.

Ofcom’s investigation process

The Online Safety Act sets out the process Ofcom must follow when investigating a company and deciding whether it has failed to comply with its legal obligations.

Our first step is to gather and analyse evidence to determine whether a breach has occurred. If, based on that evidence, we consider that a compliance failure has taken place, we will issue a provisional decision to the company, who will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.

Enforcement powers

If our investigation finds that a company has broken the law, we can require platforms to take specific steps to come into compliance or to remedy harm caused by the breach. We can also impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.

In the most serious cases of ongoing non-compliance, we can make an application to a court for ‘business disruption measures’, through which a court could impose an order, on an interim or full basis, requiring payment providers or advertisers to withdraw their services from a platform, or requiring internet service providers to block access to a site in the UK. The court may only impose such orders where appropriate and proportionate to prevent significant harm to individuals in the UK.

UK jurisdiction

In any industry, companies that want to provide a service to people in the UK must comply with UK laws. The UK’s Online Safety Act is concerned with protecting people in the UK. It does not require platforms to restrict what people in other countries can see.

There are ways platforms can protect people in the UK without stopping their users elsewhere in the world from continuing to see that content.

An Ofcom spokesperson said: “Reports of Grok being used to create and share illegal non-consensual intimate images and child sexual abuse material on X have been deeply concerning.

“Platforms must protect people in the UK from content that’s illegal in the UK, and we won’t hesitate to investigate where we suspect companies are failing in their duties, especially where there’s a risk of harm to children.

“We’ll progress this investigation as a matter of the highest priority, while ensuring we follow due process. As the UK’s independent online safety enforcement agency, it’s important we make sure our investigations are legally robust and fairly decided.”

Ofcom will provide an update on this investigation as soon as possible.

NO EXCUSES: Statement on xAI’s Grok image generation and editing tool

Technology Secretary Liz Kendall calls for swift action after reports xAI’s Grok tool continues to allow generation of intimate deepfake images

The Technology Secretary has commented on the changes xAI has implemented to its chatbot overnight, and government action to stamp out this form of abuse.

Technology Secretary Liz Kendall said: “Sexually manipulating images of women and children is despicable and abhorrent. It is an insult and totally unacceptable for Grok to still allow this if you’re willing to pay for it. I expect Ofcom to use the full legal powers Parliament has given them.

“I – and more importantly the public – would expect to see Ofcom update on next steps in days, not weeks.

“I would remind xAI that the Online Safety Act includes the power to block services from being accessed in the UK, if they refuse to comply with UK law. If Ofcom decide to use those powers they will have our full support.

“We will be banning nudification apps in the Crime and Policing Bill which is in parliament now.

“We are in the coming weeks bringing into force powers to criminalise the creation of intimate images without consent.

“I expect all platforms to abide by Ofcom’s new Violence Against Women and Girls (VAWG) guidance and if they do not, I am prepared to go further.

“We are as determined to ensure women and girls are safe online as we are to ensure they are safe in the real world. No excuses.”

From apps to AI search: how the UK goes online in 2025

Ofcom’s latest Online Nation report explores how adults and children in the UK experience life online – from the sites and apps we use every day to how people feel about what they do and what they encounter online.

We’re spending even more time online as a nation

Adults now spend an average of four and a half hours online a day – up by 10 minutes on last year. Women spend 26 minutes a day longer online than men, with a daily average of 4 hours 43 minutes.

Most time online is spent on a smartphone, where adults use an average of 41 apps a month. WhatsApp, Facebook and Google Maps are the three most commonly used apps among adults.

Half of all time online is now spent on Alphabet and Meta-owned services

Two major tech firms now account for more than half of the time people in the UK spend online.

YouTube is the most used Alphabet-owned service, used by 94% of adults. Time on YouTube is increasing, reaching an average of 51 minutes a day, not including the TV set. The combination of Facebook and Messenger (93% adults) is the most widely used Meta service, followed by WhatsApp (90% adults).

AI is shaking up search

Google Search is used by four in five (82%) adults. It is by far the most used search service in the UK, with 3 billion searches a month.

AI is changing the UK’s search experience. About 30% of searches now show AI overviews, and more than half (53%) of adults say they see these summaries often. In most cases, people aren’t seeking these summaries out – they are now included by their search services.

Generative AI services are gaining traction, with more people actively seeking them out. ChatGPT had 1.8 billion UK visits in the first eight months of 2025, up from 368 million in the same period of 2024.

Adults are less positive about the impact of the internet

This year, only a third of adults (33%) said they feel the internet is good for society – down from 40% last year. And while nearly two-thirds (65%) of adults believe the personal benefits of being online outweigh the risks, this figure has declined steadily from 71% two years ago.

Fewer adults feel freer to be themselves online than offline this year (25%, down from 30% last year), and only 35% feel they can share opinions more easily online than offline.

What the UK’s children are doing online: social media, schoolwork and spending regrets

Younger Gen Z and the eldest Gen Alphas are mobile-first, video-native internet users. Children aged 8–14 spend almost three hours online daily, ranging from about two hours for 8–9-year-olds to four hours for 13–14-year-olds. This only counts time on smartphones, tablets, laptops and computers – not games consoles.

YouTube and Snapchat lead the way when it comes to screen time. Across 8–14s, children spend about 48 minutes a day on YouTube and 45 minutes on Snapchat – together making up around half of their total online time. Almost all 8-14-year-olds use YouTube (96%) and Google Search (95%). WhatsApp (63%) and TikTok (58%) also rank highly.

Late-night scrolling is common. Across four of the main services used by children – YouTube, Snapchat, TikTok and WhatsApp – 15-24% of the time spent for the whole 8-14 age range happens between 9pm and 5am. 4–10% of usage happens after 11pm, depending on the platform.

Most children are happy with their online lives

Overall, nine in ten (91%) children aged 8-17 say they are happy with the things that they do online.

Teenagers use social media and messaging apps to stay connected. Almost three-quarters (72%) of 13-17s who use these platforms say they help them feel closer to friends. Girls aged 13-17 are more likely than boys of the same age to see being online as good for helping to build and maintain friendships (71% vs 60%).

Overall, seven in ten (69%) 13–17-year-olds go online to support their wellbeing, mainly to relax (45%) or lift their mood (32%). Nearly eight in ten (78%) say the internet helps with schoolwork, and more than half (55%) use it to learn new skills.

But they’re mindful of doomscrolling – and ‘brain-rot’

Some of the children we spoke to reflected on the negative impacts of spending too long scrolling on their smartphone. They used the term “brain rot” to describe both the type of content and the feeling it leaves behind. This content is fast-paced, chaotic, and often nonsensical and can leave viewers overstimulated and disoriented.

Reclaiming their online space – Gen Z are more likely to act on harmful content

While we found that seven in ten 11-17 year olds had seen or heard harmful content online in the last four weeks, we also found that nearly two-thirds (64%) of them had taken action after encountering such content.

Actions included using platform tools like the ‘dislike’ button (15%), reporting content (11%), blocking the person who posted the content (10%), or telling an adult (10%).

Importantly, we spoke to children about this before our Protection of Children Codes of Practice came into force in July 2025. Under our new rules, sites and apps must take steps to prevent children from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. These steps include age checks and ensuring this content doesn’t appear in children’s ‘for you’ feeds, and they must also have improved reporting tools for children to use. They must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.

Online retail remorse – younger users regretting their online purchases

Almost six in ten (58%) children aged 8-17 said they had spent money online in the past month, whether on social media sites, video-sharing platforms, or while they were gaming.

Children tell us that they were encouraged to spend money in various ways online, including character customisation (30%), adverts (27%), recommendations from friends or family (23%) and influencer content (22%).

But a third of children (32%) regretted the purchases they’d made in-game, and 43% regretted purchases made on social media – while 42% were unclear about what they were even buying in games.

Keeping children safe online: Changes to the Online Safety Act explained

How new laws that keep children safe on the internet work

Keeping children safe

The way children experience the internet has fundamentally changed, as new laws under the Online Safety Act have come into force to protect under-18s from harmful online content they shouldn’t ever be seeing. This includes content relating to:

  • pornography
  • self-harm
  • suicide
  • eating disorders

Ofcom figures show that children as young as 8 have accessed pornography online, while 16% of teenagers have seen material that stigmatises body types or promotes disordered eating in the last 4 weeks.   

To protect the next generation from the devastating impact of this content, people now have to prove their age to access pornography or this other harmful material on social media and other sites.    

Platforms are required to use secure methods like facial scans, photo ID and credit card checks to verify the age of their users. This means it will be much harder for under-18s to accidentally or intentionally access harmful content. 

It’s clear in Ofcom’s codes that we expect platforms to ensure that strangers have no way of messaging children. This includes preventing children from receiving DMs from strangers and ensuring that children are not recommended accounts to connect with.  

Data privacy

While people might see more steps to prove their age when signing up or browsing age-restricted content, they won’t be compromising their privacy.    

The measures platforms have to put in place must confirm your age without collecting or storing personal data, unless absolutely necessary. For example, facial estimation tools can estimate your age from an image without saving that image or identifying who you are. Many third-party solutions can simply confirm to a platform whether a user is over 18, without sharing any additional data relating to the user’s identity. 

 The government and the regulator, Ofcom, are clear that platforms must use safe, proportionate and secure methods, and any company that misuses personal data or doesn’t protect users could face heavy penalties.

Services must also comply with the UK’s data protection laws. The Information Commissioner’s Office (ICO) has set out the main data protection principles that services must take into account in the context of age assurance, including minimising personal data which is collected for these purposes.  

Virtual Private Networks

While Virtual Private Networks (VPNs) are legal in the UK, platforms have a clear responsibility under this law to prevent children from bypassing safety protections. This includes blocking content that promotes VPNs or other workarounds specifically aimed at young users.   

This means that where platforms deliberately target UK children and promote VPN use, they could face enforcement action, including significant financial penalties.  

The Age Verification Providers Association (AVPA) reports that an additional five million age checks are now taking place daily as UK-based internet users seek to access age-restricted sites.

Online Safety laws do not ban any legal adult content. Instead, the laws protect children from viewing material that causes real harm in the offline world, devastating young lives and families.    

Under the Act, platforms should not arbitrarily block or remove content and instead must take a risk-based, proportionate approach to child safety duties.

Protecting freedom of speech?

As well as legal duties to keep children safe, the very same law places clear and unequivocal duties on platforms to protect freedom of expression. Failure to meet either obligation can lead to severe penalties, including fines of up to 10% of global revenue or £18 million, whichever is greater.

The Act is not designed to censor political debate and does not require platforms to age-gate any content other than that which presents the most serious risks to children, such as pornography or suicide and self-harm content.

Technology Secretary Peter Kyle said: “This marks the most significant step forward in child safety since the internet was created.

“The reality is that most children aren’t actively seeking out harmful, dangerous, or pornographic content – unfortunately it finds them. That’s why we’ve taken decisive action.

“Age verification keeps children safe. Rather than looking for ways around it, let’s help make the internet a safer, more positive space for children – and a better experience for everyone. That’s something we should all aspire to.”

Support for the Online Safety Act

NSPCC Chief Executive, Chris Sherwood: “We regularly hear from children who have suffered sexual and emotional abuse online, or who have been exposed to harmful and dangerous content.

“These experiences can have devastating impacts both immediately and long into the future. While the Online Safety Act can’t erase this pain and anger, it can be a vehicle for significant and lasting change.

“Thanks to this piece of ground-breaking regulation, algorithms are now being redesigned. Age checks are now in place. Harmful material that promotes eating disorders and suicide should no longer proliferate on social media platforms.

“This will – without a doubt – create safer, more age-appropriate online experiences for young users across the UK.”

Barnardo’s CEO, Lynne Perry: “These new protections are an important stepping stone towards making sure that children are safer online.

“They must be robustly enforced.”

Internet Matters: “Today marks an important milestone for children’s online safety […] towards ensuring that online services are designed with children’s safety in mind – from limiting children’s exposure to harmful content to creating age-appropriate experiences. 

“This milestone matters because the risks children face online remain high. Our latest Internet Matters Pulse shows that 3 in 4 children aged 9-17 experience harm online, from exposure to violent content to unwanted contact from strangers.

“With the Codes now enforceable, Ofcom must hold platforms accountable for meeting their obligations under the law.”

Tuning into YouTube: UK’s media habits revealed

  • Gen Alpha turn to YouTube first on their TV set at home, while over 55s double their time on the service
  • Fewer than half of 16-24-year-olds watch broadcast TV weekly
  • Despite declines, traditional broadcasters’ content still makes up the majority of in-home viewing

YouTube is leading the charge in the streaming takeover of TV sets, with the service now the first place younger viewers go as soon as they switch on, according to Ofcom’s annual report on the nation’s media habits.

Overall, people spent an average of 4 hours 30 minutes per day watching TV and video content at home in 2024. And while broadcast TV still accounts for the majority of in-home viewing (56%), audiences are increasingly turning to YouTube. The platform is now the second most-watched service in the UK, behind the BBC and ahead of ITV.

At home, people spent 39 minutes on YouTube per day in 2024, with 16 minutes of this via the household’s TV set. Younger adults aged 16-34 are driving this trend, watching 18 minutes of YouTube a day on TV, while one in five (20%) children aged 4-15 head straight to the app as soon as they turn the set on.

But it’s not just Gen Z and Alpha driving this trend. Over 55s are now watching nearly double the amount of YouTube content on their TVs compared to the previous year (11 minutes per day in December 2024, up from just 6 minutes in January 2023).  Last year, 42% of all YouTube viewing by this age group was on a TV set (up from 33% in 2023).

YouTube content evolving

The content audiences are watching on YouTube has evolved too. Half of the platform’s top-trending videos now more closely resemble traditional TV, including long-form interviews and game shows. This shift positions YouTube as a direct competitor to ad-supported TV services, while offering broadcasters a way to reach wider and younger audiences.  

Some broadcasters are increasingly offering their own programmes on YouTube – for example, ITV and Channel 4 make full-length programming available on their channels, retaining control over adverts. Ofcom has identified these sorts of partnerships, making public service content available and prominent on online platforms, as critical to sustaining the future of public service media in its recent report, Transmission Critical.

Public service broadcasters (PSBs) are seeing success with their online services, especially the BBC. For the first time, people are watching more online programmes from broadcasters than they are recorded programmes.

Ed Leighton, Ofcom’s Interim Group Director for Strategy and Research, said: “Scheduled TV is increasingly alien to younger viewers, with YouTube the first port of call for many when they pick up the TV remote. But we’re also seeing signs that older adults are turning to the platform as part of their daily media diet too.

“Public service broadcasters are recognising this shift – moving to meet audiences in the online spaces where they increasingly spend their time. But we need to see even more ambition in this respect to ensure that public service media that audiences value survives long into the future.”

Generational divide

Overall, people spent 4% less time watching broadcast TV in 2024 than the previous year, with average viewing dropping to 2 hours 24 minutes a day on TV sets. This trend was particularly driven by young adults (16-24), who watched just 17 minutes of live TV daily. Only 45% of this age group tuned into broadcast TV weekly, down from 48% in 2023.

Less than a quarter of 16-24s’ in-home video viewing is now of broadcaster content, versus 90% for those aged 75 and over.

Overall, people watched content from video-on-demand platforms for an average of 40 minutes per day. Netflix continues to be the most popular service, watched for an average of 22 minutes per day, and accounting for more than half of all viewing on streaming platforms.

Festive favourites top the list of most-watched moments

But broadcasters proved they can still bring the nation together for shared major TV moments, with the BBC and ITV boasting the top three most-watched shows of 2024.  

Gavin and Stacey: The Finale (18.6 million) was the most watched programme last year, followed by Wallace and Gromit: Vengeance Most Fowl (16.9 million), with the fourth episode of Mr Bates vs The Post Office (14.7 million) coming in third. The top two most-watched programmes both aired first on Christmas Day.

The Spain v England Euro 2024 final was the most-watched live sports event of the year across the BBC, ITV and STV, with 19.8 million people tuning in on the day.

Netflix’s Adolescence was the most-watched TV event of the first quarter of 2025, with 12.2 million viewers by the end of March. This marked the first time a streaming title topped the weekly TV ratings [2].

Podcasts eat up audio diets

Our Media Nations Report also has its ear to the ground on how the nation’s listening habits are evolving. More than nine in ten UK adults (93%) listen to some form of audio content each week, increasing to 98% of 16-34-year-olds. YouTube (47%) and Spotify (36%) are the most popular online audio services, while BBC Sounds is the most popular from a radio broadcaster (24%).

Music streaming and podcasts continue to be an important part of our audio diets, particularly for younger people. People aged 15-34 now spend more than half of their weekly listening time with streamed music and podcasts (58%, up from 40% in 2019), which is close to double the amount for the average listener (30%) [3].

Podcasts are also increasingly available in video as well as audio form. Platforms such as YouTube, Spotify, and Global Player now regularly host video versions of UK podcasts, helping creators engage with broader audiences.

More top trends from our Media Nations reports are available on our news centre.

Ofcom: Reforming the postal service so it delivers what people need

  • Ofcom sets Royal Mail new backstop delivery targets to protect people from long delays
  • Changes made to Second Class letter deliveries to protect the universal service
  • Ofcom to review affordability of post amid concerns over stamp prices

UK postal users will have extra protections against long delivery delays, under reforms to the universal service announced today by Ofcom. The changes will enable Royal Mail to improve reliability and support a sustainable service.

Why reform is needed

Since 2011, Royal Mail has been required under the universal service obligation to deliver First and Second Class letters six days a week. But in that time, the number of letters sent each year has more than halved. With fewer letters being delivered to each house on a given round, the cost of delivering each letter has increased, and Royal Mail has lost hundreds of millions of pounds in recent years.[1]

Urgent reform is needed for the universal service to survive. To put the service on a more sustainable footing, to prevent people from paying higher prices than necessary, and to push Royal Mail to improve reliability, Ofcom has today made changes to the obligations imposed on the company.

This follows public consultation with thousands of people and organisations – including consumer groups, unions, small businesses, public services, Royal Mail and the wider postal industry, as well as postal users directly – from right across the UK.

We have also launched a review of pricing and affordability, which will consider concerns that many people and organisations have raised about stamp prices. We plan to consult on this next year.

Natalie Black, Ofcom’s Group Director for Networks and Communications, said: “These changes are in the best interests of consumers and businesses, as urgent reform of the postal service is necessary to give it the best chance of survival.

“But changing Royal Mail’s obligations alone won’t guarantee a better service – the company now has to play its part and implement this effectively. We’ll be making sure Royal Mail is clear with its customers about what’s happening, and passes the benefits of these changes on to them.

“As part of this process, we’ve been listening to concerns about increases in stamp prices. So we’ve launched a review of affordability and plan to publicly consult on this next year.”

What reform will deliver

Our research suggests that affordability and reliability are more important to people than speed of delivery, but they value having a next-day service available for when they need to send the occasional urgent item. Royal Mail will therefore continue to be required to deliver First Class letters the next working day, Monday to Saturday, and there will continue to be a cap on the price of a Second Class stamp.

However, people have told us that most letters are not urgent, and they do not need six days a week delivery for the majority of letters. So, from 28 July, we will allow Royal Mail to deliver Second Class letters on alternate weekdays – still within three working days of collection – Monday to Friday.[2]

We estimate Royal Mail could realise annual net cost savings of between £250m and £425m with successful implementation of this change, enabling it to invest more in improving its delivery performance. We have told Royal Mail to hold regular meetings with consumer bodies and industry groups to hear about the experiences of people and businesses as it implements these changes.[3]

Our research also shows that small reductions in Royal Mail’s delivery targets would continue to meet people’s needs. Maintaining the current targets – which are more stretching than those of comparable European countries – would carry higher costs, which would need to be recovered through higher prices.

So, we are making small changes to Royal Mail’s existing delivery targets – for First Class mail from 93% to 90% delivered next-day, and for Second Class mail from 98.5% to 95% delivered within three days. These new targets are high by international standards.[4]

However, many people have experienced long delays where letters have taken weeks to arrive.

To address this issue, we have set Royal Mail new enforceable backstop targets so that 99% of mail has to be delivered no more than two days late.

MPs to investigate children’s TV and video content

A new inquiry will explore the provision of children’s TV and video content in the UK and what can be done to ensure future generations continue to have access to high-quality British-made programming. 

Research from Ofcom shows a structural shift in the viewing habits of young people, with television viewing by children dropping and YouTube now the most used app or site by children of all ages, with 88% of 3 to 17-year-olds using it last year.

The changing ways in which audiences consume TV and video have made it more challenging for public service broadcasters to make original TV content for children, and for that content to be found. This has a knock-on effect for those in our creative industries who want to make quality UK TV and video for children.

The Culture, Media and Sport Committee inquiry will therefore examine how those making original, high-quality content can be supported to continue, and how their content can be made easier to find online.

It will also explore issues relating to parental control of online content, the potential positive and negative effects of how children watch TV and video content on their health and development, and wider issues relating to the sector’s contribution to the economy and its importance to the UK’s cultural identity. 

Chair of the CMS Committee, Dame Caroline Dinenage MP, said: “Children’s viewing habits have come a long way, but whether they watch through a smart TV or a tablet, there is still demand for good quality TV and video for children.

“We all want young people to have access to a range of programming, so in addition to cartoons, they also see drama and factual programmes. We want them to be able to be educated and inspired, as well as entertained. 

“Changes to the media landscape, particularly the shift in viewing to YouTube, pose huge challenges for the future of children’s programming and the continued production of original content by our public service broadcasters.

“We want to know what prominence means for programmes made for children in the future world of smart TVs, streaming, video sharing platforms and endless choice.   

“We have a proud history of high-quality children’s television in the UK. Our inquiry will be showcasing the contribution the sector makes to both our culture and economy and how we can best ensure that content designed for children in all its forms continues to both educate and entertain.”

Terms of reference 

The Committee is inviting written submissions in response to the following questions: 

Children’s TV and video content in the UK 

  1. Who is commissioning and making original, high-quality TV and video content for children and young audiences in the UK?
    • How can they be best supported to continue to make more?
  2. How does the range of content and genres for children vary between that provided by public service media, subscription channels, and both short- and long-form video sharing platforms?
    • Which audiences, by age or other characteristic, are currently being underserved?
    • How can we increase the amount of news and factual programming made for children on TV and online?

Finding children’s TV and video content online 

  1. How can it be made easier to find original, high-quality, TV and video content for children online?
    •  How can the attribution of public service children’s content on video sharing platforms be improved?
  2. How effective are the tools available for parents to control what children are watching on public service media, subscription channels and video sharing platforms?

Health and child development 

  1. What evidence is there that the TV and video content that children watch, and how they watch it, can contribute:
    • Positively to their health, learning and development?
    •  Negatively to their health, learning and development?

Wider benefits of children’s TV 

  1. How does children’s TV made in the UK contribute to:
    • The UK’s culture and identity?
    • Our cultural and economic exports?

Thousands of online grooming crimes in Scotland during past five years

  • More than 3,000 Communicating Indecently with a Child offences have been recorded by Police Scotland during the past five years
  • NSPCC urges Ofcom to significantly strengthen its approach to child sexual abuse and for the UK Government to ensure the regulator can tackle grooming in private messaging

Over 3,000 online grooming crimes across Scotland have been recorded by Police Scotland during the past five years, new data compiled by the NSPCC has revealed.   

The figures provided by Police Scotland show that 3,234 Communicating Indecently with a Child offences have been recorded since 2019, with 672 offences recorded last year (2023/24) – an increase of 13% on the previous year.

The NSPCC has issued these findings a year on from the Online Safety Act being passed.

The charity is urging Ofcom to significantly strengthen the rules social media platforms must follow to tackle child sexual abuse on their products.

They say the regulator currently puts too much focus on acting after harm has taken place rather than being proactive to ensure the design features of social media apps are not contributing to abuse.

The NSPCC is also calling on the Government to strengthen legislation to ensure child sexual abuse is disrupted in private messages such as on Snapchat and WhatsApp.

The charity’s Voice of Online Youth young people’s group was not surprised at the prevalence of Snapchat in these offences.

Liidia, 13, from Glasgow, said: “Snapchat has disappearing messages, and that makes it easier for people to hide things they shouldn’t be doing.

“Another problem is that Snapchat has this feature where you can show your location to everyone. If you’re not careful, you might end up showing where you are to people you don’t know, which is super risky.

“And honestly, not all the rules in Snapchat are strict, so some people take advantage of that to do bad things. Apps should have better ways for us to report bad things, and they should always get updated to protect us better with the latest security tech.”

Sir Peter Wanless, NSPCC Chief Executive, said: “One year since the Online Safety Act became law and we are still waiting for tech companies to make their platforms safe for children.

“We need ambitious regulation by Ofcom who must significantly strengthen their current approach to make companies address how their products are being exploited by offenders.

“It is clear that much of this abuse is taking place in private messaging which is why we also need the UK Government to strengthen the Online Safety Act to give Ofcom more legal certainty to tackle child sexual abuse on the likes of Snapchat and WhatsApp.”

National Police Chiefs’ Council Lead for Child Protection and Abuse Investigations (CPAI) Becky Riggs said: “The numbers in this NSPCC data are shocking, and policing joins partners in urging tech companies and Ofcom to fulfil their legal and moral obligations to keep children safe from harm within the online communities they have created.

“A year on from the Online Safety Act being passed, it is imperative that the responsibility of safeguarding children online is placed with the companies who create spaces for them, and the regulator strengthens rules that social media platforms must follow.

“Policing will not stop in its fight against those who commit these horrific crimes. We cannot do this alone, so while we continue to pursue and prosecute those who abuse and exploit children, we repeat our call for more to be done by companies in this space.”