What the Online Safety Act is - and how to keep children safe online


Technology companies will have to take more action to keep children in the UK safe on the internet, following the introduction of the Online Safety Act.

The new rules come into force during 2025 - but critics say they do not go far enough.

How much time do UK children spend online?

Children aged eight to 17 spend between two and five hours online per day, research by the communications regulator Ofcom suggests. Time spent online increases with age.

Nearly every child over 12 has a mobile phone and almost all of them watch videos on platforms such as YouTube or TikTok.

Four in five teenagers who go online say they have used artificial intelligence (AI) tools such as ChatGPT or Snapchat's MyAI.

About half of children over 12 think being online is good for their mental health, according to Ofcom.

But one in eight children aged eight to 17 has said someone had been nasty or hurtful to them on social media or messaging apps.

The Children's Commissioner said that half of the 13-year-olds her team surveyed reported seeing "hardcore, misogynistic" pornographic material on social media sites.

What online parental controls are available?

Two-thirds of parents say they use controls to limit what their children see online, according to Internet Matters, a safety organisation set up by some of the big UK-based internet companies.

It has a list of the parental controls available and step-by-step guides on how to use them.

On YouTube - the most popular platform for young people in the UK - parents who want to try to prevent their children seeing unsuitable material can set up the "kids" version, which filters out adult content.

Parents can also set up supervised accounts to review the content that older children using YouTube's main site can find and watch.

Supervision can also be set up on Facebook Messenger, via its Family Centre.

Its parent company, Meta, provides parental controls across its social media apps, such as daily time limits, scheduled break times and information about the content their child interacts with.

Instagram, also owned by Meta, has introduced teen accounts for under-18s, which are set to private by default.

Instagram does not let 13- to 15-year-old users make their account public unless they add a parent or guardian to their Teen Account.

Similarly, under-13s on Roblox must get parental consent in order to have private, in-game conversations.

TikTok says its Family Pairing tool lets parents decide whether to make a teenager's account private.

But such controls are not fool-proof. Ofcom data has suggested that about one in 20 children uses workarounds.

What controls are there on mobile phones and consoles?

Phone networks may block some explicit websites until a user has demonstrated they are over 18.

Some also have parental controls that can limit the websites children can visit on their phones.

Android and Apple phones and tablets also have controls for parents.


These can block or limit access to specific apps, restrict explicit content, prevent purchases and monitor browsing.

Apple has Child Accounts and Google has Family Link. There are similar apps available from third-party developers.

Later in 2025, Apple says it will let parents share the age range linked to their child's account - rather than their date of birth - with app developers, to help them provide age-appropriate experiences.

Broadband services also have parental controls to filter certain types of content.

Game console controls also let parents ensure age-appropriate gaming and control in-game purchases.

How should you talk to your children about online safety?

Talking to children about online safety and taking an interest in what they do online is important, according to the NSPCC.

It recommends making discussions about it part of daily conversation, just like a chat about their day at school, which can make it easier for children to share any concerns they have.

What are the UK's child safety rules for tech companies?

The Online Safety Act aims to make social media firms and search engines protect children and adults in the UK from illegal and harmful material.

It became law in 2023, with duties for platforms coming into effect in 2025.

The Act will require platforms to show they are committed to removing illegal content, including:

  • Child sexual abuse

  • Controlling or coercive behaviour

  • Extreme sexual violence

  • Promoting or facilitating suicide or self-harm

  • Animal cruelty

  • Selling illegal drugs or weapons

  • Terrorism

Pornography sites will have to stop children viewing content by checking users' ages.

Duties to protect children from harmful content also include addressing harms that disproportionately affect women and girls, such as intimate image abuse and harassment.

The Act has also created new offences, such as:

  • Cyber-flashing - sending unsolicited sexual imagery online

  • Sharing "deepfake" pornography, where artificial intelligence is used to insert someone's likeness into pornographic content

It also makes it easier for bereaved parents to obtain information about their children from technology companies.

Ofcom, the regulator tasked with enforcing the Act, has been given additional powers to ensure companies comply with the rules.

It requires online platforms to assess if and where users - particularly children - may be exposed to certain types of illegal or harmful content on their services.

Platforms must then detail measures to prevent this, in accordance with Ofcom's codes and guidance.

Ofcom's chief executive Dame Melanie Dawes has warned that services failing to follow the rules could have their minimum user age raised to 18.

Why has the Online Safety Act been criticised?

Some parents of children who died after exposure to harmful online content have called the Online Safety Act "insufficient" and criticised its delays.

Bereaved parents including Ian Russell, father of Molly, and Esther Ghey, mother of Brianna, have said the legislation should impose tougher rules and a duty of care on tech firms.

Esther Ghey wants technology companies to make it harder for young people to access potentially harmful material online

In August 2024, following riots across the UK, London Mayor Sadiq Khan said the Act did not adequately tackle misinformation.

Some civil liberties groups meanwhile believe its content measures risk stifling free expression online.

Requirements for porn sites to use technology for "robust age checks" have also raised privacy and security concerns.