How Facebook decides what appears on your news feed

Are the site’s algorithms really as neutral as they're supposed to be?

Andrew Griffin
Thursday 12 May 2016 10:27 BST
‘I was raised Jewish and then I went through a period where I questioned things, but now I believe religion is very important,’ the Facebook founder said recently

It seemed at once a confirmation of everything you knew and a terrifying revelation that you could trust nothing: not everything you see on Facebook is there because it is true, or good.

This week, a report alleged that the humans who worked on Facebook’s “Trending Topics” feature had been instructed to bury news from certain conservative outlets and promote “worthy stories” even if people weren’t talking about them. Facebook denied the reports and said there was no evidence that such behaviour was happening – but some people’s most terrifying worry had already come true, whether or not the report itself was accurate.

But perhaps the bigger misconception was ever believing that there was an unmediated version of Facebook, free of any biases. Some people appeared to believe that human intervention would inevitably bring bias, and that those decisions should instead be made by disinterested algorithms – but algorithms can be just as biased, largely because they are made by people.

Facebook’s algorithms are a huge and somewhat mysterious machine that rules much of our lives. The news this week should be a reminder that they are shaping the way we see the world in unprecedented ways.

But it still isn’t really clear how to actually get seen on the service – and if it were too clear, Facebook would probably change the algorithm so it wasn’t. The site’s algorithm does seem, or has seemed in the past, to give priority to personal life events over news of any kind. In 2014, for instance, BuzzFeed reported that one user had pretended to get married – and then used that to push up posts about the Ferguson protests. Since marriages are prioritised by Facebook’s algorithm, that post was reported to have gone to the top of the feed and stayed there.

“I noticed that while my posts usually get a billion likes and comments on education stuff, for some reason, my feed is relatively silent about Ferguson Missouri,” the post read. “I have a suspicion that Facebook is hiding these posts from feeds, pretty much in line with how Facebook operates.”

There’s no reason to think that those posts were actually being actively or knowingly hidden by Facebook or people working for it. Instead, they are a consequence of the inevitable bias of an algorithm. Promoting personal life events might seem a good thing, in itself, since they are most likely to be of interest to a person’s friends; but accidentally doing that too much might lead to the news feed leaving out important events in the world.


Facebook’s algorithms nowadays also promote the kinds of posts that the company wants to see on its network. So at the moment many organisations are running Facebook Live broadcasts – streaming video represents an area that Facebook wants to break into, and not by coincidence those broadcasts tend to appear at the top of people’s news feeds.

Any quirk with Facebook’s algorithms – even the tiniest one – will become huge. The site is now the biggest distributor of media in history, and what people see there has a unique effect on the way the world works.

The really important decisions that Facebook makes are the smallest tweaks. It regularly alters its algorithms to privilege certain things – looking out for links that people tend not to spend any time on once they’ve clicked, for instance. That change – which was intended to privilege quality – at the same time cut off an entire industry of sites that lured people in with headlines that didn’t actually match what they saw.


In one sense, Facebook’s problems with breaking and important news are an inevitable part of having an algorithm based on engagement. (Twitter, for all its troubles, is usually held up as the example of a good social network for breaking news – it has an algorithmic timeline, but it can be turned off.) Algorithms like those used by Facebook watch for posts that are gaining particular traction among users, measured by signals such as how many people are liking, sharing and reading an article.
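The engagement-based ranking described above can be sketched in a few lines. This is a hypothetical illustration only – the signal names, weights and dwell-time cutoff are assumptions made up for the example, not Facebook’s actual algorithm; it also folds in the dwell-time penalty for clickbait mentioned earlier:

```python
# Illustrative sketch of engagement-based ranking. All weights and
# thresholds below are invented for the example, not Facebook's real ones.

def engagement_score(likes, shares, reads, seconds_read):
    """Combine simple engagement signals into a single ranking score."""
    # Shares are weighted most heavily: they spread a post to new feeds.
    score = 1.0 * likes + 3.0 * shares + 0.5 * reads
    # Penalise posts people click but abandon almost immediately,
    # mirroring the dwell-time tweak aimed at misleading headlines.
    avg_dwell = seconds_read / reads if reads else 0.0
    if avg_dwell < 5.0:
        score *= 0.5
    return score

posts = {
    "wedding photo": engagement_score(likes=200, shares=40,
                                      reads=300, seconds_read=6000),
    "clickbait headline": engagement_score(likes=50, shares=10,
                                           reads=1000, seconds_read=2000),
}
# Feed order: highest score first.
ranked = sorted(posts, key=posts.get, reverse=True)
```

In this toy example the clickbait post gets far more clicks, but its two-second average dwell time halves its score, so the personal life event still wins the top of the feed – the same trade-off the article describes.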

But that process will always be slow. It requires a certain number of people to start giving an article a push – at which point it might start being spread around, but only after some time. Facebook is aware of this: it is continually testing new things, in secret or in the open, to make itself more of a platform for breaking news and everything else in the world. Trending Topics, which set off this week’s controversy, was one such attempt: surfacing news stories before they took off and highlighting posts about them.

Ultimately, Facebook’s likely aims are simple: ensure that people keep using and engaging with the site. It makes money by gathering data about people and then showing them ads based on that data. None of that means the huge amounts of scepticism that regularly meet its changes and announcements aren’t justified. It just means the biases might be much smaller, and less secret, than we’d realised.
