Facebook manipulated users' moods in secret experiment
Facebook can use its algorithm to make users happy or sad, and even the scientist who edited the study said she was 'creeped out' by it
Sunday 29 June 2014
Facebook manipulated the emotions of hundreds of thousands of its users, and found that they would pass on happy or sad emotions to others, it has said. The experiment, for which researchers did not gain specific consent, has provoked criticism from users with privacy and ethical concerns.
For one week in 2012, Facebook skewed nearly 700,000 users’ news feeds to be either happier or sadder than normal. Researchers found that, once the week was over, users tended to post positive or negative content in line with the skew that had been applied to their news feed.
The research has provoked distress because of the manipulation involved.
Studies of real-world networks show that what the researchers call ‘emotional contagion’ can be transferred through networks. But the researchers say the study is the first evidence that the effect can happen without direct interaction or non-verbal cues.
Anyone who used the English version of Facebook automatically qualified for the experiment, the results of which were published earlier this month. Researchers analysed the words used in posts to automatically decide whether they were likely to be positive or negative, and promoted or demoted them in the feed according to which group users fell into.
The study found that emotions spread across the network, and that friends tended to respond more to negative posts. Users who were exposed to fewer emotional posts of either type tended to withdraw from posting themselves.
The research drew criticism from campaigners over the weekend, who said that it could be used by Facebook to encourage users to post more, or by other bodies such as governments to manipulate the feelings of users in certain countries.
Even the scientist who edited the study had ethical concerns about its methods, she said. "I think it's an open question," Susan Fiske, professor of psychology at Princeton University, told the Atlantic. "It's ethically okay from the regulations perspective, but ethics are kind of social decisions. There's not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn't have been done... I'm still thinking about it and I'm a little creeped out too."
Facebook’s ‘Data Use Policy’ — part of the Terms Of Service that every user signs up to when creating an account — reserves the right for Facebook to use information “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
The researchers said that this policy constituted the informed consent required to conduct the research and made it legal. The study does not say that users were told of their participation in the experiment, which the researchers said was conducted automatically by computers, so that they themselves saw none of the posts.
Facebook has said that there are on average 1,500 possible stories that could show up on a user’s news feed at any one time. It uses an algorithm that, it says, analyses users’ behaviour on the site to determine which of those stories to show.