Researchers warn of rise in extremism online after Covid


Researchers across the world are increasingly finding ways in which the coronavirus pandemic changed us. As more data comes in, what impact have lockdowns and their aftermath had on the spread of extremism?

In a report last year, the United Nations painted a sobering picture of how the pandemic has fuelled terrorism and violent extremism.

Heavy-handed enforcement of Covid restrictions, growing economic inequalities and an "erosion of trust in government", coupled with a diversion of resources away from fighting terrorism, were among the factors driving the change, according to the UN.

Jacob Davey, a researcher at the Institute for Strategic Dialogue (ISD), a UK counter-extremism think tank, also identified "quite significant spikes in extremist activity and also conspiracy theories" during the pandemic.

Media caption: What's behind Telegram's growing popularity?

"It might be people spending more time on their computer," he says, but there was also "a heightened sense of anxiety". Conspiratorial views and talking points "provide easy answers" to people who are worried, he argues.

He singles out messaging app and social network Telegram - which grew rapidly during the pandemic and claims to have more than 500 million users around the world - as a hub for "disinformation through to conspiracy theories, through to terrorist activity".

Easily searchable groups on Telegram mean more extreme networks can feed into seemingly innocent groups.

"There's still a large amount of extremist content, still a large amount of disinformation content, enforcement is not great, but we have seen them over the past 10 years start to take a bit more action," he says.

"High-profile influencers are slowly but surely being kicked off Telegram."

"Telegram's very unmoderated approach has meant that it functions as a centralising hub for a lot of different groups and individuals," says Dr Tim Squirrell, who also works at the ISD, "you can get kicked off Facebook or the mainstream social and video sharing platforms," but "you can still maintain your audience on Telegram".

Dr Squirrell says accounts which were removed over the QAnon conspiracy theory on Twitter rapidly "flooded over to Telegram".

Media caption: US rapper Nick Nittoli shares his thoughts on QAnon and the US Capitol riots: 'So embarrassed'

But, he warns, those who use Telegram believing it to be a secure, encrypted app are mistaken - it isn't.

When Telegram decides to take action, such as against Islamic State group accounts, "it's actually pretty effective", he says, and those accounts have "found it really difficult to re-establish a foothold".

Mr Davey says there are "hundreds" of right-wing extremist groups operating in English language channels, with users from Australia, the US and Britain.

One group, he says, saw itself "as essentially resistance against Covid measures," including vaccines. It has 247 affiliated regional groups on Telegram, 45 of which focus on the UK, and its main channel has more than 62,000 subscribers.

"At last count," Mr Davey says, "my list of white supremacist channels was at something like 650, reaching hundreds of thousands of people".

"We've seen pandemic related spikes," Mr Davey says, most of these extremist groups saw a sudden rise during the pandemic.

Telegram itself, however, has become a battleground in the Russian invasion of Ukraine: the app is heavily used in both countries, and given its lax moderation standards, disinformation is rife.

A spokesperson for the app told us "Telegram is a platform for free speech, including that we do not agree with. It is for this reason that Telegram is used to organize pro-democracy protests in places like Hong Kong, Belarus and Iran.

"However, calls to violence are explicitly forbidden by Telegram's terms of service. Our moderators proactively monitor public parts of the app as well as accepting user reports in order to remove content that breaches of our terms."

The pandemic might have eased up, but the online conspiracy movement that boomed during it - including here in the UK - has far from disappeared.

Instead, those who thought Covid-19 was a hoax and that the vaccine was part of a genocidal plot - false claims different from legitimate concerns and political criticisms - have turned their energy elsewhere.

Now, climate change and the war are a hoax, too - and terror attacks and tragedies never really happened. They all employed crisis actors. Violent rhetoric about treason and hangings continues to accompany these conspiracies, directed at public figures, experts and anyone who disagrees with the new narrative.

Far-right groups have sought to exploit this sizeable, captive online audience - sharing their beliefs on what were once the most prominent Covid-19 conspiracy channels on Telegram.

Protests linked to these groups haven't disappeared, either. Rallies this summer have attracted many of the same conspiracy influencers who were present at anti-lockdown marches. This time they're promoting disinformation about child grooming, while being applauded by far-right figures online.

The UK's conspiracy movement isn't going anywhere - and it has a real-world impact. It harms those who are victims of this disinformation and hate, but it also poses a threat to its followers, now deep down a rabbit hole and vulnerable to extreme, false ideas.

Image caption: Social networks like Twitter have become an "outrage farm", Jacob Davey says

"More than 97%" of news traffic during the pandemic "went to trustworthy sites," Rasmus Nielsen, the Director of the Reuters Institute for the Study of Journalism, says the BBC alone benefited from a "30% increase in web visits".

Sometimes, he says, people think there is a "tsunami of misinformation, and there's no question that it's out there, there is plenty of misinformation. But when we look at what people actually engage with, our data suggests that the vast majority of it is from trustworthy sources".

"There are people who publish things that are at best misleading and tenuous, and at worse, outright fabrications or false," he continues, "anyone can do it if they put their minds to it".

Even in America, he says, where the problem is more pronounced, "average Americans spend about 100 times more time with actual news than they do with identified false or fake news".

Image caption: Children are becoming increasingly likely to be victims of extremist grooming

The latest counter-terrorism policing data seems to show an upward trend in the number of children being exposed to extremist grooming.

This is happening, they say, at the same time as referrals to the Prevent programme have fallen, largely because schools were closed during the pandemic.

In 2020, 25 children were arrested in relation to terrorism offences - the highest number ever recorded in a 12-month period.

Counter Terrorism Policing, an organisation headed by the Metropolitan Police that allows police forces to co-operate on the threat of terrorism, said it was "working around the clock to tackle the proliferation of content on the internet... every day, investigators from the Counter Terrorism Internet Referral Unit trawl the internet, identifying terrorist material and working with service providers to ensure it's removed."

They said the pandemic "has undoubtedly changed the landscape when it comes to radicalisation, particularly in the online world, as well as exacerbating challenging circumstances and grievances within society that terrorists use to promote their brands of hatred, or extremism - it has made us all more isolated."