Facebook manipulated posts on 689,000 profiles to change people's moods
Facebook has revealed it messed with people’s emotions in a top secret experiment to see how social networks influence our moods.
The site manipulated information on 689,003 Facebook news feeds to see if it could make people feel happier or sadder through a process called “emotional contagion”.
Working with academics from Cornell and the University of California, Facebook used an automated system to control the flow of comments, pictures, videos and links in an attempt to alter people’s moods.
In one test, Facebook users were shown less “positive emotional content” in their news feeds, causing their own updates to be similarly sad. Another test did the opposite, with less exposure to “negative” updates resulting in people posting more positive things.
Researchers heralded the results as the first proof that emotions expressed by friends on social networks can “influence our own moods”.
But critics and privacy advocates have claimed the experiment breached ethical guidelines by failing to obtain “informed consent” as required under US law. Facebook carried out the experiment on a random selection of people using the site in English.
British MP Jim Sheridan, a member of the Commons media select committee, said people needed “protection” from Facebook:
“This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people.
“They are manipulating material from people’s personal lives and I am worried about the ability of Facebook and others to manipulate people’s thoughts in politics or other areas.”
Facebook has said that its own terms and conditions allow it to conduct such experiments, that at no point were researchers able to see what people were posting, and that the experiment used automated software.
Despite Facebook’s claims, some researchers have said the site needed to get direct consent to play with people’s minds in this way.