One in three spot hate speech in online videos
One in three people said they had seen hate speech on online video platforms in the previous three months, a report by Ofcom has found.
Racist content was seen most frequently, but religiously discriminatory, transphobic and homophobic content was also common, it said.
The media regulator is responsible for policing rules about harmful content on video sites.
Ofcom's guidance said it expects platforms to “take active measures” to fix the issue.
That includes making sure the rules banning harmful content are clear, and having better and more transparent reporting systems.
The rules apply to all sorts of video-sharing sites, not just mainstream giants such as YouTube.
Ofcom also said that video sites hosting pornography “should put in place effective age-verification systems to restrict under-18s’ access to these sites and apps.”
A previous attempt by the government to introduce mandatory age-verification checks for pornography collapsed in 2019 after repeated delays and obstacles.
Breaches of the rules could lead to fines or even the suspension of the service in the UK, Ofcom warned.
But the guidance remains a proposed draft, with changes still to be made. And the rules governing video-sharing sites are likely to be superseded by the long-delayed Online Harms Bill when it comes into effect.
Ofcom report
In its research into the prevalence of hateful content online, Ofcom found 32% of respondents had seen or been subjected to hate speech. Of those:
59% said they had seen it directed towards a racial group
28% against a religious group
25% against transgender people
23% against a specific sexual orientation
Those figures sum to more than 100% because one person may have seen several different types of abuse.
Bill Howe from Stop Hate UK said that Ofcom's findings are "unsurprising", based on anecdotes the charity hears - and that the split of what kinds of hateful speech were reported "also feels accurate and to some extent mirrors" offline hate incidents reported to the charity.
"While mindful of the need to preserve freedoms of speech and expression, in general we welcome attempts to address the issue of the proliferation of hateful material online," he said.
But he added that regulation and enforcement "can only be considered" as part of a wider education or citizens' programme to address under-reporting of hate speech and other issues.
Users were also unlikely to know about the safety systems a video site had in place: six in 10 said they did not know about them, and only a quarter had ever used them.
'Potentially harmful'
The research was carried out between September and October last year, asking more than 2,000 respondents about the previous three months.
Kevin Bakhurst, Ofcom's head of online content, said that the explosion in popularity of video content does not mean it is "without risk".
“Many people report coming across hateful and potentially harmful material,” he warned.
“Although video services are making progress in protecting users, there’s much further to go. We’re setting out how companies should work with us to get their houses in order – giving children and other users the protection they need, while maintaining freedom of expression.”