Stop suggesting children as friends, social media firms told
Social media platforms should fight online grooming by not suggesting children as "friends" by default, the communications watchdog says.
The warning is contained in Ofcom's first guidance for tech platforms on complying with the Online Safety Act.
This covers how they should tackle illegal content, including child abuse online.
Ofcom revealed figures suggesting that more than one in ten 11- to 18-year-olds have been sent naked or semi-naked images.
This first draft code of practice, published by Ofcom in its role enforcing the Online Safety Act, covers activity such as child sexual abuse material (CSAM), grooming and fraud.
It wants to hear what tech platforms think of its plans.
Much of the guidance targets grooming. The largest platforms are expected to change default settings so children aren't added to suggested friends lists, something that can be exploited by groomers.
They should also ensure children's location information cannot be revealed in their profile or posts and prevent them receiving messages from people not in their contacts list.
Depending on their size, type and the risk they present, platforms should also:
- Make sure moderation teams have the resources they need
- Make it easy for people to make a complaint to the platform
- Remove accounts run by or on behalf of terrorist organisations
- Use keyword searches to find content linked to fraud, such as stolen passwords
- Identify content containing the addresses of child abuse websites
- Search engines should not index websites previously identified as hosting child abuse material
- Search engine users should have a way to report "search suggestions" they believe are pointing them to illegal content
Ofcom will also require some platforms to use a technology called hash-matching to detect CSAM.
This converts an image into a string of numbers called a "hash", and compares it with a database of hashes generated from known CSAM images. If there is a match, a known CSAM image has been found.
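By way of illustration, here is a minimal Python sketch of that lookup step. The KNOWN_HASHES set is a hypothetical stand-in for the real databases maintained by bodies such as the Internet Watch Foundation, and production systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-compression, rather than the exact cryptographic hash shown here.

```python
import hashlib

# Hypothetical database of hashes of known images (placeholder value only).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(image_bytes: bytes) -> str:
    # Convert the image into a fixed-length string of numbers (its "hash").
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    # A match against the database means a known image has been found.
    return image_hash(image_bytes) in KNOWN_HASHES
```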
The method is already widely used by social media and search engines, according to Professor Alan Woodward of Surrey University.
"I fear Ofcom are simply codifying mechanisms that are already in use. It's not surprising as research to date has found nothing more effective than what is in use already", he told the BBC.
Private messages
But this hashing will not apply to private or encrypted messages. Ofcom stresses it is not - in this guidance - making any proposals that would break encryption.
Powers in the Act that could, if certain conditions are met, be used to force private messaging apps such as iMessage, WhatsApp and Signal to scan messages for CSAM have been deeply controversial.
These apps use end-to-end encryption, which means even the tech firm cannot read the contents of a message.
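As a rough sketch of why, here is a toy example using the PyNaCl library: the sender encrypts for the recipient's public key, so a platform relaying the message sees only ciphertext. Real apps layer much more on top (the Signal protocol adds forward secrecy, for example), so this conveys only the basic idea.

```python
from nacl.public import PrivateKey, Box

# Each user generates a key pair; only public keys are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"hello")

# The platform relays only this ciphertext; without a private key,
# it cannot recover the plaintext.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"hello"
```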
Some major apps have said they will not comply if asked to scan encrypted messages, arguing it would require them to weaken the privacy of their systems globally and undermine the security that protects all users, including children.
Ofcom says those powers will not be consulted on until 2024 and are unlikely to come into force until around 2025.
Some question whether it will ever be possible to enforce these powers in a way that preserves the privacy of encrypted communications.
Asked in a BBC interview if those powers would ever be used, Ofcom's chief executive Dame Melanie Dawes said: "It's hard to say right now, but there isn't a solution yet - a technology solution - that allows scanning to take place in encrypted environments without breaking the encryption."
But she encouraged encrypted messaging companies to find ways to combat child abuse on their platforms.
Great expectations
The challenge facing Ofcom is significant. This first guidance is over 1,500 pages long. Over 100,000 services, many based outside the UK, may be subject to regulation.
And government figures have suggested that 20,000 small businesses could need to comply.
Asked if Ofcom had the resources it needed, Dame Melanie admitted it was a "really big job" but added: "We're absolutely up for the task. And we're really excited that we're launching today."
It faces another challenge: managing expectations from the public and from campaigners. Whatever Ofcom announces, it may be criticised for being too hard on tech platforms or not hard enough, said Dame Melanie.
"It isn't the job of a regulator to be loved by everybody. That's impossible.
"And it's not what we ever aim for, but it is our job to be proportionate. And to make sure that what we require is evidenced and has been backed up by proper facts", Dame Melanie added.
And one expectation Ofcom is keen to dismiss is that harmful content should be reported directly to it. Instead, its task is to make sure tech firms have good systems for users to report illegal or harmful content to them.
"So this isn't like TV [complaints] where you can submit a complaint to Ofcom, and we will consider it as the regulator", Dame Melanie said.