Could Ofcom ban social media for under-18s?


A quick guide to Ofcom's social media rules

People under 18 could be banned from using social media apps including TikTok, Instagram, and Snapchat if the tech firms don’t follow new Ofcom rules to protect children.

But what is Ofcom and why does it want to change what people see online? Here’s a quick guide to get you up to speed.

Why does Ofcom want to ban social media for under-18s?

Ofcom’s draft guidelines follow the government’s Online Safety Act, which aims to make technology companies take more responsibility for children’s safety online.

Social media users have to be over 13, but Ofcom says children regularly see content promoting self-harm, pornography or violence on social media. This content is banned by the biggest social networks, such as TikTok and Instagram, but in recent years an increasing number of children have been hurt, or have died, after seeing harmful content online.

If tech firms don’t change what children see, Ofcom says it will ban children under 18 from using social media altogether.

What does Ofcom stand for?

Ofcom is short for the Office of Communications. It makes sure companies and businesses such as TV channels, radio stations, postal services, social media apps and online platforms are treating people fairly, legally and safely.

What has Ofcom said about social media?

Ofcom has published more than 40 guidelines it says sites and apps should follow to keep children safe online.

These include bringing in new ways to check how old users are, changing what young users see, removing harmful content more effectively, and helping users report it.

"Young people are fed harmful content on their feed again and again and this has become normalised but it needs to change,” said Ofcom boss Dame Melanie Dawes, taking aim at “toxic” social media algorithms.

What does a social media algorithm do?


People post billions of messages, pictures and videos on social media every day, so companies like TikTok and Instagram use software called an algorithm to decide what users see.

It recommends content based on users’ past behaviour on the app, their age and gender, or other information.
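The idea behind this kind of ranking can be shown with a toy sketch. This is a hypothetical illustration only, not the system any real platform uses; all the names, fields and weights below are made up for the example.

```python
# Toy illustration of how a feed-ranking algorithm might score posts.
# Everything here (field names, weights) is hypothetical.

def score_post(post, user):
    """Score one post for one user based on their past behaviour."""
    score = 0.0
    # Topics the user has engaged with before rank higher.
    if post["topic"] in user["liked_topics"]:
        score += 2.0
    # Popularity with similar users also boosts a post.
    score += 0.1 * post["likes_from_similar_users"]
    return score

def build_feed(posts, user, limit=3):
    """Return the highest-scoring posts first, like a recommendation feed."""
    return sorted(posts, key=lambda p: score_post(p, user), reverse=True)[:limit]
```

The point critics make is visible even in this toy version: whatever a user has engaged with before is shown again, which is how harmful content can be recommended "again and again".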

Could the ban really happen?

Ofcom says the new rules will be a reality in the second half of 2025. But experts believe it will be very difficult to make sure both users and social media companies are following the rules, and some people are concerned the new rules could affect users’ privacy or limit free speech.

There are also questions about how verifying someone’s age will work, such as whether users will need to give their photo ID.

What have social media sites said?


Snapchat and Meta, which owns Facebook, Instagram and WhatsApp, have both released statements saying they have extra protections for users under 18 and tools to help parents control what their children see.

But many companies have not responded to the BBC’s requests for comment.

What have campaigners said?

The families of 12 children whose deaths have been linked to harmful online content spoke to the BBC after Ofcom published the rules.

They said the rules didn’t go far enough to protect children and that change was happening too slowly, while another group of parents complained that Ofcom had not done enough to listen to them.

More than anything, these campaigners are worried that more children will die if social media companies and the government don’t do more.