Facebook employs UK fact-checkers to combat fake news
Facebook has employed a UK fact-checking service to help it deal with the spread of fake news.
Full Fact, a charity founded in 2010, will review stories, images and videos and rate them based on accuracy.
It said that it will focus on misinformation that could damage people's health or safety or undermine democratic processes.
Facebook said it was working "continuously" to reduce the spread of misinformation.
Sarah Brown, training and news literacy manager at Facebook, said: "People don't want to see false news on Facebook, and nor do we. We're delighted to be working with an organisation as reputable and respected as Full Fact to tackle this issue."
In a blog post, Full Fact explained that users can flag content they think may be false, and its team will rate the stories as true, false or a mixture of accurate and inaccurate content.
It will only be checking images, videos and articles presented as fact-based reporting. Other content, such as satire and opinion, will be exempt.
If something is found to be fake, it will appear lower in the news feed but will not be deleted.
It will be tackling "everything from dangerous cancer 'cures' to false stories spreading after terrorist attacks, to fake content about voting processes ahead of elections", the charity said.
"This isn't a magic pill. Fact-checking is slow, careful, pretty unglamorous work and realistically we know we can't possibly review all the potentially false claims that appear on Facebook every day. But it is a step in the right direction."
Facebook has faced accusations from politicians in the US and the UK that it is helping spread misinformation that can have an effect on the way people vote.
The Brexit referendum and the 2017 general election were both found to have been tarnished by fake news, and social media firms have been threatened with regulation if they fail to address the issue.
Four million views
Chief executive Mark Zuckerberg appeared before the US Congress in April to talk about how Facebook is tackling false reports, but has so far failed to respond to requests from a UK parliamentary inquiry to answer its questions face to face.
The social network now works with fact-checkers in more than 20 countries.
To illustrate the problem that fact-checking services and social networks face, the BBC has learned that a video which went live last weekend falsely suggested that smart meters emit levels of radiation harmful to health.
The original video had more than four million views in four days before being taken down.
Sacha Deshmukh, chief executive at campaigners Smart Energy GB, said: "Smart Energy GB welcomes the news today from Facebook to take action on the phenomenon of misleading videos on its platform in the UK.
"But for this to work effectively, Facebook must guarantee that the speed of action will match the speed with which such misleading stories spread.
"Facebook provided an environment in which, over 72 hours, the video attracted more than four million views.
"This is clearly unacceptable and, moving forward, the challenge for Facebook is whether their new system will be nimble enough to swiftly fact-check misleading content and protect their users."