Google search changes tackle fake news and hate speech

Image caption: Google said the changes should stop fake news and hate speech dominating results (image: Google logo on a sign; source: AFP)

Google is changing the way its core search engine works to help stop the spread of fake news and hate speech.

The changes involve new measures for ranking sites and updated guidance for the people who check that results are accurate.

In a blog post, Google said the changes should thwart attempts to abuse its algorithms that have let extremists promote their content.

Google was criticised last year for giving prominence to groups seeking to deny that the Holocaust took place.

Rate and replace

Ben Gomes, a vice-president of engineering at Google's search division, said it was making "structural" changes to tackle the new ways people had found to trick its algorithms.

In particular, he said, many groups and organisations were using "fake news" to help spread "blatantly misleading, low quality, offensive or downright false information".

To combat this, he said, Google had added new metrics to its ranking systems that should help to stop false information entering the top results for particular search terms.

In addition, he said, it had updated the guidelines given to the thousands of human raters it employs to give feedback on whether results are accurate.

The guidelines included examples of low-quality and fake news websites, said Mr Gomes, to help the raters pick out "misleading information, unexpected offensive results, hoaxes and unsupported conspiracy theories".

Analysis: Rory Cellan-Jones, BBC technology correspondent

Google has done its best to play down the extent of fake news and hateful material - or what it prefers to call "low quality content" - in search results.

The company keeps repeating that this affects only 0.25% of queries.

But the fact that searches such as "Is Obama planning a coup?" - or even "Who invented stairs?" - produced such questionable results meant it had to act.

These searches threw up a prominent "snippets" box telling you that, yes, President Obama was planning a coup, or that stairs had been invented in 1948.

Now both boxes have gone, and Google's almighty algorithm has been tweaked so that such content is less likely to rise to the top.

What's interesting is that a company that has put such faith in technology solutions is turning to 10,000 humans to try to make search a better experience.

This giant focus group, which tests out changes in the search algorithm, has been told to pay more attention to the source of any pages rated highly in results, looking round the web to see whether they seem authoritative and trustworthy.

Questions are bound to be raised about whether this panel, which Google says is representative of its users, is impartial and objective.

Google's Ben Gomes, a veteran who's been wrestling with the intricacies of search since arriving as one of the earliest employees, believes it is now on the path to getting this right.

But with so many people trying to game the system, the battle to make search true and fair will never be over.

Google also planned to change its "autocomplete" tool, which suggests search terms, to let users flag up troubling content more easily, Mr Gomes said.

Danny Sullivan, founder of the Search Engine Land news site, said the changes made sense and should not be taken to suggest that Google's algorithms were failing to correctly index what they found online.

"It's sort of like saying that a restaurant is a failure if it asks for people to rate the food it makes," he said.

"The raters don't rank results," said Mr Sullivan.

"They simply give feedback about whether the results are good.

"That feedback is then used to reshape the algorithms - the recipes, if you will -that Google uses."