EU struggles over law to tackle spread of terror online

Image caption: The logo of the so-called Islamic State is seen on a mobile phone (Getty Images)

EU officials are struggling to agree on a law aimed at preventing the spread of "terrorist content" online.

On Wednesday evening, the European Parliament approved a draft version of the law, which would impose a one-hour deadline for removing offending content.

But a European Commission official told the BBC that changes made to the text by parliament had rendered the law ineffective.

The Commission now plans to agree a version closer to the original text with the new parliament after the elections in May.

"Given the importance, we have to come back and work on this again with them," the official said.

The law would affect social media platforms including Facebook, Twitter and YouTube, which could face fines of up to 4% of their annual global turnover.

What does the law say?

The legislation, proposed by the Commission last year, gives internet companies one hour to remove offending content after receiving an order from a "competent authority" in an EU country.

"Terrorist content" includes material that incites or advocates for terrorist offences, promotes the activities of a terrorist group or teaches terrorist techniques, according to the draft text.

Under the original text, companies would also be expected to take "proactive measures" to stop the spread of terrorist content.

This includes using automated tools to prevent content that has previously been removed from being re-uploaded.

Under the rules, companies operating in the EU could face hefty financial penalties if there is a "systematic failure" to comply.


What were the changes?

In its amendments, the European Parliament said websites would not be forced to "monitor the information they transmit or store, nor have to actively seek facts indicating illegal activity".

It said that, the first time an order is issued, the "competent authority" should give the website information on the procedures and applicable deadlines 12 hours before the one-hour removal deadline takes effect.

It also said that authorities dealing with "terrorist content" posted in another EU member state should contact officials in that country, rather than dealing directly with the website.

"This is a strong position from the parliament which ensures that there will be a one-hour deadline to remove content. It also ensures safeguards for smaller platforms, ensures that there are no upload filters and preserves freedom of speech," MEP Daniel Dalton, the rapporteur for the proposal, told the BBC.

'Christchurch test'

The Commission official, speaking on condition of anonymity, told the BBC that the latest version did not pass the "Christchurch test" of whether it would have been effective at stopping the spread of footage of last month's New Zealand mosque attacks, which was live-streamed on Facebook.

"We don't think the amendments provide for effective measures," he said.

Mr Dalton, however, told the BBC it was "the only possible compromise agreement which could have got through the parliament".

"If the Commission thinks they can make significant changes to this proposal and get it through the parliament then they don't know this parliament very well," he said.

"The Commission proposal simply didn't have a majority in the parliament as it raised too many questions. It also didn't properly address the legality of cross border removal orders. The modifications we have made ensured that it could pass, whilst ensuring the key elements of the Commission proposal remain".

Image caption: Tributes in memory of the mosque attack victims in Christchurch, New Zealand (Getty Images)

What about free speech?

Others said the amendments did not go far enough in protecting free speech.

In February, German MEP Julia Reda of the European Pirate Party said the legislation risked the "surrender of our fundamental freedoms [and] undermines our liberal democracy".

Ms Reda welcomed the changes brought by the European Parliament but said the one-hour deadline was "unworkable for platforms run by individual or small providers".

She argued that pressure to keep such content offline would result in companies using automated filters that were "bound to lead to the deletion of legal uploads".

Does terrorism spread online?

David Ibsen, executive director of the Counter Extremism Project, said "the easy availability of terrorist content online continues to have a huge impact on radicalisation, recruitment, and incitement to violence.

"Police investigations have repeatedly found a critical link between radicalising content online and terror attacks. Nice, France, the Bataclan concert hall attack in Paris, and the Manchester arena bombing are but a few examples of how individuals can be radicalised online."

European Commissioner for the Security Union Sir Julian King described the spread of terrorist content online as "a clear and present danger that needs to be stamped out".

"It has had a role to play in every single attack on European soil in the last few years, whether through incitement to commit an attack, instruction on how to carry it out or glorification of the deadly results," he told the BBC.

"The potential damage caused by terrorist content online rises dramatically with every hour it remains online, spreading from platform to platform. And it's not only Da'esh [IS]: other jihadist groups such as al-Qaeda are still a threat, as are violent right-wing extremists."

He said the Commission "look forward to continuing our work with MEPs and with the Council in the coming months to find an effective way forward on this critical security file".