
Facebook is secretly ranking how trustworthy people are

Users are being judged on their behaviour – without knowing it, or how they are being ranked

Andrew Griffin
Tuesday 21 August 2018 15:33 BST
Protesters from the pressure group Avaaz demonstrate against Facebook outside Portcullis House in Westminster (Reuters)

Facebook is secretly ranking people according to how untrustworthy they are, it has revealed.

The site is using an array of information to decide whether its users should be believed when they report that something wrong is happening on the site.

Users will not know that their behaviour is being ranked, the company appeared to suggest. But the rankings themselves are important: they decide whether an account is reliable enough to be believed when it reports a page or a story, and so help determine whether that content should be taken down straight away.

Facebook has been looking to improve how it reacts to problem accounts and to pages that publish fake stories or otherwise break its rules. But it still largely relies on its users to notify it of those problems, and then decides itself whether to take the accounts down.

It has run into problems because users often report pages they disagree with, or otherwise want taken down, even when those pages have not actually broken its rules. That is why it is ranking users, to decide whether they should be listened to, according to a report in the Washington Post.

The company said that it could not reveal details of the tool for fear that those with an understanding of its workings would try to game the system. But it is running in the background and ranking users from zero to one, reports claimed.

Facebook did not reveal what markers it looks for to decide that people are untrustworthy. But it did say that accurately reporting content might give a person's reports more clout, for instance.

“One of the signals we use is how people interact with articles,” Tessa Lyons, a Facebook product manager, told the Washington Post. “For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true.”
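Facebook has not published how the score is calculated. As a rough illustration of the weighting idea Lyons describes, a minimal sketch might look like the following, where the trust score, the neutral starting value and the review threshold are all hypothetical, not Facebook's actual numbers:

```python
# Hypothetical sketch only: Facebook has not published its formula.
# Assumes each reporter gets a 0-1 score based on how often their past
# "this is false" reports were confirmed by fact-checkers, and that an
# article is flagged for review once the weighted reports pass a
# made-up threshold.

def trust_score(confirmed, total):
    """Fraction of a user's past reports that fact-checkers confirmed (0 to 1)."""
    return 0.5 if total == 0 else confirmed / total  # assumed neutral start

def should_review(reporter_histories, threshold=2.0):
    """Sum each reporter's trust score; review once the total passes the threshold."""
    weighted = sum(trust_score(c, t) for c, t in reporter_histories)
    return weighted >= threshold

# Three reporters with good track records outweigh five indiscriminate ones.
print(should_review([(9, 10), (8, 10), (7, 10)]))  # True  (weight ~2.4)
print(should_review([(1, 10)] * 5))                # False (weight 0.5)
```

In a scheme like this, a reporter whose flags keep being confirmed counts for more than one who flags everything they dislike, which matches the behaviour Lyons describes.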
