Facebook's fake news experiment backfires

Image caption (source: Facebook): A variety of stories, from some questioning the monarchy to others about the Great British Bake Off, had comments questioning their truthfulness promoted to the top

A Facebook test that promoted comments containing the word "fake" to the top of news feeds has been criticised by users.

The trial, which Facebook says has now concluded, aimed to prioritise "comments that indicate disbelief".

It meant posts from the BBC, the Economist, the New York Times and the Guardian all began with a comment mentioning the word "fake".

The test, which was visible only to some users, left many frustrated.

The comments appeared on a wide range of stories, from ones that could plausibly be fake to ones that were clearly legitimate. The remarks, which appeared at the top of the comments section, came from a variety of people, but the one thing they had in common was the word "fake".

"Clearly Facebook is under enormous pressure to tackle the problem of fake news, but to question the veracity of every single story is preposterous," said Jen Roberts, a freelance PR consultant.

"Quite the reverse of combating misinformation online, it is compounding the issue by blurring the lines between what is real and what isn't. My Facebook feed has become like some awful Orwellian doublethink experiment."

Many on Twitter also expressed annoyance.

In a statement, Facebook told the BBC: "We're always working on ways to curb the spread of misinformation on our platform, and sometimes run tests to find new ways to do this. This was a small test which has now concluded.

"We wanted to see if prioritising comments that indicate disbelief would help. We're going to keep working to find new ways to help our community make more informed decisions about what they read and share."

Facebook has been under enormous pressure to deal with the issue of fake news since it was singled out as one of the main distribution points for hoax stories during the US presidential election.

In August it promised to step up its efforts to fight fake news by sending more suspected false stories to fact-checkers.

It also launched a new feature that published alternative news links beneath suspect articles.