Google launches UK 'anti-terror fund'
Google has announced it will give a total of £1m ($1.3m) to fund projects that help counter extremism in the UK.
The UK fund was part of a $5m global commitment, the search giant said.
Prime Minister Theresa May wants technology companies to be more proactive in shutting down spaces where extremist ideology is shared.
In a speech at the United Nations General Assembly, she will challenge tech firms to take down terrorist material within two hours.
She is expected to urge Google, Facebook and Twitter to go "further and faster" in developing artificial intelligence tools that can spot terror propaganda.
On the same day, Twitter announced that it had taken down 300,000 accounts promoting terrorism in the first six months of the year as its own AI tools improved.
Google's funding will be handed out in partnership with the Institute for Strategic Dialogue (ISD), a UK-based counter-extremism organisation.
An independent advisory board, including academics, policymakers, educators, and representatives from creative agencies, civil society and the technology sector, will accept a first round of applications in November. Grants of between £2,000 and £200,000 will be awarded to successful proposals.
ISD chief executive Sasha Havlicek said: "We are eager to work with a wide range of innovators on developing their ideas in the coming months."
Over the next two years, the wider Google funding pot would support "technology-driven solutions, as well as grassroots efforts such as community youth projects that help build communities and promote resistance to radicalisation", the company said.
"By funding experts like ISD, we hope to support sustainable solutions to extremism both online and offline. We don't have all the answers, but we're committed to playing our part. We're looking forward to helping bring new ideas and technologies to life," said Kent Walker, general counsel at Google.
In March, the UK government suspended its adverts on YouTube after concerns they were appearing next to inappropriate content.
In June, YouTube announced four new steps it was taking to combat extremist content:
- Improving its use of machine learning to remove controversial videos
- Working with 15 new expert groups, including the Anti-Defamation League, the No Hate Speech Movement and the Institute for Strategic Dialogue
- Applying tougher treatment to videos that are not illegal but have been flagged by users as potential violations of its policies on hate speech and violent extremism
- Redirecting people who search for certain keywords towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages