Ofcom: Clear link between online posts and violent disorder
There was a "clear connection" between the violent disorder in England and Northern Ireland in the summer and posts on social media and messaging apps, Ofcom has concluded.
The government had asked the media regulator to consider how illegal content and disinformation spread during the unrest.
In an open letter setting out its findings, Ofcom boss Dame Melanie Dawes said such content spread "widely and quickly" online following the stabbings in Southport in July, which preceded the disorder.
She added most online services took "rapid action", but said the responses of some firms were "uneven".
"Posts about the Southport incident and subsequent events from high-profile accounts reached millions of users, demonstrating the role that virality and algorithmic recommendations can play in driving divisive narratives in a crisis period," Dame Melanie wrote.
The BBC approached major tech platforms for their response to the letter.
X, formerly Twitter, told BBC News some accounts were suspended and other content was removed from the platform following the riots.
A spokesman for the messaging app Telegram said the platform "immediately removed UK channels that called for violence as they were discovered in August".
None of the other major tech platforms responded to the BBC's request for comment.
Experts say the unrest showed the power - and responsibility - social media platforms have.
"Ofcom is saying that social media posts inciting riots are not just words - they play a big part in fanning the flames of disorder," said Rashik Parmar, from BCS, the Chartered Institute for IT.
"There should be accountability where platforms allow dangerously divisive content to go unchecked," he added.
Media analyst Hanna Kahlert, at Midia Research, said Ofcom's findings amounted to a "call for social platforms to take greater ownership of the impact of content".
Enhanced powers
At the time of the unrest, Ofcom faced criticism for not doing more to rein in the spread of untrue and inflammatory content.
It urged tech firms to take action - but also pointed out the enhanced powers it is due to get under the Online Safety Act had not yet come into force.
The act will see the creation of codes of practice for big tech firms which will place new responsibilities on them for tackling disinformation.
"I am confident that, had the draft codes been in force at the time, they would have provided a firm basis for urgent engagement with services on the steps they were taking to protect UK users from harm," Dame Melanie wrote.
She said the new powers set "clear standards" for what Ofcom would expect to see in future from big tech firms, such as:
- Specifying in their terms of service provisions how individuals are to be protected from priority illegal content
- Having systems designed to swiftly take down illegal content and having "adequately resourced" content moderation teams
- Providing effective and accessible mechanisms for users to complain about illegal content, including on messaging platforms
The unrest, which broke out in August this year, was the worst seen in the UK for a decade.
It was followed by waves of arrests and prosecutions, some for online offences.
The role that big tech played was subject to much scrutiny - though the platforms themselves remained largely silent at the time.
The prime minister was also drawn into a war of words with one of the highest-profile people in tech - X owner Elon Musk.
The tech billionaire suggested that "civil war is inevitable" following the disorder.
Sir Keir Starmer hit back, saying there was "no justification" for Mr Musk's comments and adding there was more that social media companies "can and should be doing".
A spokesman from X told the BBC: "X monitored the platform and actioned thousands of pieces of content as part of our internal incident response protocols."
The spokesman said some accounts had been suspended and other content removed, but did not give further details.
The spokesman also highlighted the role of community notes, saying the feature "played an important role in addressing misleading content across the platform in relation to the incidents and these have been viewed millions of times".