Rape posts every half-hour found on online incel forum
Discussions on a major forum for incels are growing more violent, a new study by the Centre for Countering Digital Hate (CCDH) has warned.
Incels - short for Involuntary Celibate - hold misogynistic beliefs, and some have launched violent attacks.
The CCDH research suggests that, on average, a post about rape was published to the forum every 29 minutes.
The report's authors want tech firms to do more to counter the problem.
For 18 months, researchers from the CCDH's new Quant Lab used machine learning to help analyse more than a million posts on the forum.
The report does not name the forum - to avoid giving it publicity - but says it is the largest such forum online, with more than 17,000 members and 2.6 million visits a month. Women are barred.
The forum is one of a number of online incel communities "promoting a hateful and violent ideology linked to the murder or injury of 100 people in the last 10 years, mostly women", the researchers write.
According to one definition, incels are men who are "unable to find sexual partners despite wanting them, and who express hate towards people whom they blame for this".
To the CCDH researchers' alarm, violent rhetoric on the forum increased significantly during the study, with posts mentioning incel mass murders rising by 59% between 2021 and 2022.
While the security service MI5 has said most incels are not violent, the movement has been linked to a number of attacks.
In Isla Vista, California, in 2014, Elliot Rodger murdered six people and injured 14 before killing himself.
Rodger left a 137-page manifesto detailing his wish "to punish everyone who is sexually active", and to bring about the "second phase" of a "War on Women".
'Coherent movement'
In 2021, a gunman killed five people in a suburb of Plymouth in Devon.
The alleged perpetrator, 22-year-old Jake Davison, is thought to have been active on social media discussing the incel movement. Bereaved families urged the government to take action against incel culture.
CCDH chief executive Imran Ahmed told the BBC: "This is a coherent movement of men who not just pose a threat of mass attacks, but also a significant threat to any woman in their lives."
Mr Ahmed said their research showed acts of incel violence should not be treated as the work of lone wolves.
"They are not socially isolated. They're embedded in a community of fellow travellers. And there is a well-developed, constantly negotiated, constantly discussed ideology that underpins their belief system and underpins their actions."
UK agencies do not currently define incel beliefs as a terrorist ideology, but say it is possible that some individual acts may cross the threshold and be regarded as single-issue terrorism.
But a spokesperson for Counter Terrorism Policing said they took the threat from incels seriously, adding: "We have resources dedicated to analysis and assessment of the threat this ideology poses to the public."
The report also considers users' support for acts of sexual violence.
Its analysis suggests that the clear majority of posts about rape were supportive, while "discussions of paedophilia show 53% of posters are supportive".
Keywords associated with paedophilia occurred in posts made by "1,143 unique users, or 28% of the active users in our dataset", the researchers said.
Mr Ahmed fears there may be many unreported acts of violence that can be linked to the incel movement.
"It's almost certain that offences have been committed, that women have been harmed, and we haven't attributed that to incels in the past because we focused on the mass attacks.
"We've also got a focus on the wider threat to women and girls we discovered in this country."
Analysis
By Marianna Spring, disinformation and social media correspondent
Misogyny online sits along a spectrum - and some of these posts on incel forums would be considered the most extreme. This report tells us how the worst appear to be getting worse.
It's in keeping with a broader escalation in violent rhetoric we've seen from groups on social media, whether linked to conspiracies or far-right ideologies.
More specifically, it seems to fit into a wider pattern of increasing anti-women hate online.
This was a topic I investigated for BBC Panorama, exposing how social media algorithms can push anti-women hate to accounts that have started to express an interest in this kind of content.
The dummy account I set up was driven towards content that discussed sexual violence and condoned rape. The experiment suggested that, if you're already susceptible to misogyny online, algorithms can make it worse.
Anecdotally, I've noticed how abuse sent to me online more and more frequently includes threats of sexual violence. Many women I've interviewed and spoken to about trolling have expressed concerns over how the same kind of rhetoric they're seeing in messages online could translate into action offline.
Teenage boys are being drawn into the movement, with the researchers finding young forum members who expressed an interest in violence.
A user claiming to be 15 discussed wanting to "go ER" in one post - a phrase that refers to emulating the Isla Vista killer by committing a mass shooting.
Another user, who described themselves as a school student, said he had also considered going "ER". He asked for help from other forum members after claiming to have been flagged to Prevent - the UK counter-extremism programme - for carrying a knife in his school bag.
Other users offered him advice on using a virtual private network to avoid surveillance, and congratulated him on a decision to stop taking anti-psychotic medication, researchers found.
Online safety
The report makes a number of recommendations, several of which urge big tech to deplatform incel sites, including:
Google should make sure incel sites are deranked in its search results
Incel YouTube channels should be deplatformed
Infrastructure companies such as Cloudflare should not provide services to incel sites
Authorities should design and implement anti-extremism interventions for incels
According to Google, its search ranking systems are designed to avoid exposing people to hateful and harmful content if they are not explicitly looking for it.
YouTube's policies prohibit hate speech and harassing and sexual content, and on review it said it had removed or age-restricted several videos on two of the channels cited in the CCDH report.
The CCDH has in the past suggested that the Online Safety Bill could be an important tool in tackling harmful incel content. However, ministers have recently said rules that would require tech firms to address material that is harmful but legal on their platforms will be changed.
Mr Ahmed urged the government not to water down these provisions in the bill.
"We've got to be not just tough on terrorists, but tough on the causes of terrorism," he said.
"Here we can see a community that is very cautious about never tripping the line into actual criminal behaviour online, but nevertheless, collectively pose an incredibly serious threat.
"So it's absolutely crucial that government stick to their guns in a piece of legislation that's already had more scrutiny than almost any bill normally does."
A DCMS spokesperson told the BBC that the main aim of the Online Safety Bill is to protect children and tackle abhorrent criminal activity online.
"Hate crime, child sexual exploitation and encouraging violence or rape are illegal and all companies in scope - including small websites and chatrooms - will have to remove and limit people's exposure to this content or face massive fines from Ofcom".