To control a nation, you need to control its news. George Orwell knew it, Joseph Stalin knew it and now Facebook knows it.
The social network secretly carried out a psychological experiment on hundreds of thousands of its users in 2012, attempting to manipulate their moods. It found that it was able to make users happy or sad by altering the amount of positive or negative content that appeared in their News Feed.
Outrage over the results played out, ironically, on social media when they were released. One of the researchers tried to justify the experiment, saying they wanted to investigate “the common worry that seeing friends post positive content leads to people feeling negative or left out.” And there was self-interest at stake too: Facebook were concerned that friends' negativity might lead people to leave the site.
There can be nothing ethical about an experiment that tries to make people unhappy without their consent, and without any real way of monitoring their mental health. Yet putting the ethics of the research to one side, consider how utterly terrifying it would be to live in a world where social media controls how you feel.
Well, you already do.
Facebook filters its News Feed ruthlessly. It says this is to stop you feeling overwhelmed by the thousands of updates you would otherwise receive. It claims to prioritise posts depending on what social media managers term 'engagement'. Essentially, you are more likely to see something that your friends have commented on, liked or shared. The popular stuff rises to the top of the News Feed and the less engaging content is silently buried. If your posts regularly get no likes or comments, then Facebook will stop showing them.
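The ranking described above can be illustrated with a toy sketch. Facebook's real algorithm is proprietary and uses far more signals; the weights and field names here are purely hypothetical, chosen only to show how engagement-based sorting buries quiet posts and promotes provocative ones:

```python
# Illustrative sketch only: the real News Feed algorithm is proprietary.
# A post's "engagement" is modelled as a weighted count of interactions,
# and the feed is simply sorted by that score, highest first.

def engagement_score(post):
    # Hypothetical weights: shares count more than comments, comments
    # more than likes. The real system weighs many more signals.
    return 3 * post["shares"] + 2 * post["comments"] + post["likes"]

def rank_feed(posts):
    # Popular posts rise to the top; unengaging posts sink out of sight.
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    {"id": "holiday-photo", "likes": 40, "comments": 5, "shares": 1},
    {"id": "quiet-update", "likes": 0, "comments": 0, "shares": 0},
    {"id": "outrage-bait", "likes": 10, "comments": 30, "shares": 20},
]

for post in rank_feed(feed):
    print(post["id"], engagement_score(post))
```

Note that the scorer is indifferent to *why* people interacted: the angry comment and the appreciative one count the same, which is exactly the flaw the next paragraph describes.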
Lots of Facebook users don’t know this. The tiresome pedants who comment 'not news' under articles don’t realise they are unwittingly increasing its reach, allowing more and more people to see the item which has annoyed them. This is just one of the problems of a News Feed filtered in this way: engagement does not mean content is worthy. It merely means it provoked a response. Just look at Katie Hopkins.
Once you know this, it’s not surprising that Facebook can alter the moods of its users. It’s a stream of social stimuli, filtered to favour the provocative. Like emotional junkies, we ping from outrage at perceived slights to tears over the video of a dog standing sentry at its owner’s grave. But this is not the real problem; showing only the most engaging posts can be defended: after all, it’s crowdsourced, it’s democratic.
The real problem with the secret experiment is that it undermines Facebook’s assertion that it is the most popular posts that are given priority, not the posts Facebook wants us to see. I run a Facebook page, and there are constant algorithmic anomalies that can’t be explained. Sometimes it feels as if someone is sitting in Facebook HQ arbitrarily turning a knob to decide how many people see my posts. I also harbour suspicions that the news Facebook lists in its trending module is heavily editorialised: it’s what Facebook wants to trend.
Facebook is the second most visited site in the world after Google. That’s a lot of power to have, and a lot of minds to own. It is time we acknowledged how influential these sites are and held them to account, before we are all typing in newspeak.