Facebook gives users trustworthiness score

Image caption: Facebook disputes the idea that the score should be considered a reputation rating (PA/Getty)

Facebook has confirmed that it has started scoring some of its members on a trustworthiness scale.

The Washington Post revealed that the social network had developed the system over the past year.

The tech firm says it has been developed to help handle reports of false news on its platform, but it has declined to reveal how the score is calculated or the limits of its use.

Critics are concerned that users have no apparent way to obtain their rating.

The BBC understands that at present only Facebook's misinformation team makes use of the measurement.

The tech firm has, however, objected to the tool being described as a reputation rating.

"The idea that we have a centralised 'reputation' score for people that use Facebook is just plain wrong and the headline in the Washington Post is misleading," said a spokeswoman.

"What we're actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system.

"The reason we do this is to make sure that our fight against misinformation is as effective as possible."

Fact-checking

The Washington Post's report was based on an interview with Facebook executive Tessa Lyons about the platform's battle against "fake news".

It reported that she said the score had been developed to improve a fact-checking scheme begun in 2016, in which posts that Facebook users flag as false are sent to third-party organisations, which decide whether they should appear lower in people's news feeds.

To make the system more efficient, she explained, her team wanted to know which flaggers were themselves trustworthy.

"People often report things that they just disagree with," Ms Lyons explained, adding that the system gave users a score between zero and one.

The BBC understands that this is calculated in part by correlating the false news reports with the ultimate decisions of the independent fact-checkers.

So, someone who makes a single complaint that is substantiated gets a higher score than someone who makes lots of complaints, only some of which are determined to be warranted.
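Facebook has not disclosed how the number is derived, but the behaviour described above - a single substantiated flag counting for more than a long run of mostly rejected ones - can be sketched as a smoothed ratio of upheld reports to total reports. The Python function below is a hypothetical illustration only, not Facebook's algorithm; the smoothing constants and function name are assumptions made for the example.

```python
# Purely illustrative sketch - Facebook has not published its method.
# It scores a flagger between 0 and 1 by the share of their reports that
# independent fact-checkers later upheld, with light smoothing so that a
# single lucky report does not earn a perfect score.

def flagger_score(confirmed_reports: int, total_reports: int,
                  prior_hits: float = 1.0, prior_misses: float = 1.0) -> float:
    """Return a 0-1 trust estimate for a user who flags posts as false.

    confirmed_reports: flags that fact-checkers upheld.
    total_reports: all flags the user has made.
    prior_hits / prior_misses: smoothing constants (assumed, not Facebook's).
    """
    if total_reports < 0 or confirmed_reports > total_reports:
        raise ValueError("invalid report counts")
    # Laplace-style smoothing pulls sparse histories towards 0.5.
    return (confirmed_reports + prior_hits) / (total_reports + prior_hits + prior_misses)


# One substantiated complaint outranks many mostly unsubstantiated ones,
# matching the example in the article.
print(flagger_score(1, 1))    # ~0.67
print(flagger_score(3, 10))   # ~0.33
```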

But the Washington Post said Ms Lyons had declined to say what other signals were being used to avoid tipping off bad actors about how to game the system.

'Automated and opaque'

Facebook is far from being the first tech firm to privately score its users.

Uber has long rated its customers according to the marks each driver gives them, but originally riders had to email the company to find out their rating, before it eventually made the figure available via its app.

Twitter's co-founder Ev Williams said in 2010 that the service gave users a secret "reputation score" to help it recommend which members to follow.

The Chinese state is also piloting a system in which citizens are given a "social credit" score based on a mix of their online and offline behaviour.

Image caption: Facebook used to mark "disputed stories" with a red icon but said the effort had backfired (Facebook)

But one expert said that Facebook's use of an "automated and opaque" scoring system raised particular concerns.

"It's unsurprising that Facebook would want to assess the credibility of its users, considering how some of them are highly suspicious, gullible or out to deliberately misinform others," said Dr Bernie Hogan, from the Oxford Internet Institute.

"But consider the analogy of one's credit score.

"You can check your credit score for free in many countries - by contrast, Facebook's trustworthiness is unregulated and we have no way to know either what our score is or how to dispute it.

"Facebook is not a neutral actor and despite any diplomatic press materials to the contrary, it is intent on managing a population for profit."

Civil rights campaigners have also been critical.

"This is yet another example of Facebook using people's data in ways they would not expect their data to be used, which further undermines people's trust in Facebook," said Ailidh Callander, a solicitor at Privacy International.

"Facebook simply must learn some hard lessons, and start being transparent and accountable about how they use people's data to profile and take decisions."

It is not clear if the scoring system is applied to EU citizens.

But Facebook needs to abide by the recently introduced General Data Protection Regulation (GDPR).

"Under new data protection laws, organisations providing services to people in the UK and the EU must be transparent with customers about how personal information is used and what is held about them," said a spokeswoman for the UK's Information Commissioner's Office.

"This would include the way data is processed behind the scenes, how they are profiled and the way algorithms work to drive that processing."
