Benefits Of Facebook Experiment 'May Be Huge'
Updated: 10:24am UK, Monday 30 June 2014
By Tom Cheshire, Technology Correspondent
Facebook has been manipulating your emotions for years.
It's why you like a friend's post, upload an image or update your status - every action is designed to make you engage with the site.
And it's why so many people use the platform.
This time, though, it has manipulated emotions in an unusually transparent way; it was for scientific benefit, yet it caused an uproar. Why?
The study - a collaboration between Facebook, Cornell University and the University of California at San Francisco - affected 689,003 users without their knowledge.
According to the study's abstract, the experiment "manipulated the extent to which people were exposed to emotional expressions in their news feed".
It went on to talk of "massive scale emotional contagion".
This sort of language led MP Jim Sheridan to voice concerns about people "being thought-controlled".
It sounds very ominous.
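Stripped of the jargon, though, the mechanism was simple filtering: posts judged emotionally positive or negative - the published paper counted emotion words to decide - had some chance of being left out of a user's feed. Here is a toy sketch in Python; the word lists, omission rate and names are invented for illustration, not taken from the study:

```python
import random

# Invented mini-lexicons; the actual study used a large word-count
# dictionary (LIWC) to label posts as positive or negative.
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def emotional_tone(post: str) -> str:
    """Crudely classify a post by the emotion words it contains."""
    words = set(post.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def build_feed(posts, suppress="negative", omit_chance=0.3, seed=1):
    """Randomly omit a fraction of posts with the targeted tone -
    the exposure manipulation the abstract describes, in miniature."""
    rng = random.Random(seed)
    return [p for p in posts
            if emotional_tone(p) != suppress or rng.random() >= omit_chance]

feed = ["I love this wonderful day",
        "Traffic was awful and I hate Mondays",
        "Meeting moved to 3pm"]
print(build_feed(feed, suppress="negative"))
```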
However, Facebook has been manipulating emotions for a while. So has pretty much every web service.
Facebook is constantly adjusting the algorithm behind its news feed so that people spend more time on the platform.
One of the co-authors of the study, Adam Kramer, posted to Facebook to explain the company's own interest in carrying out the research.
He said: "We were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
In the same way, YouTube offers different suggested videos to different people, to get them to watch more videos.
Amazon manipulates its store design - offering different deals and layouts - so that people spend more money.
Emotional manipulation has been important to every mass medium, from the novel to newspapers, radio, TV and film. It's why we keep reading and watching.
The difference now is that it's easy to measure that emotional engagement quite precisely: you can track every aspect of a person's behaviour online, compared with the far less informative metrics of books sold or the estimated viewers of a television programme.
If you operate a website that a lot of people use and make changes based on how they behave, you're already conducting a kind of psychological experiment.
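Here, in outline, is the kind of A/B test behind such changes - a minimal sketch, assuming a made-up experiment name, user IDs and engagement metric rather than any real service's code:

```python
import hashlib
import random
import statistics

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into 'control' or 'treatment'.

    Hashing the user ID with the experiment name keeps each user's
    assignment stable across visits - a standard A/B-testing trick.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 == 0 else "control"

# Simulated engagement logs: minutes spent on the site per user.
# A real service would read these from its behavioural tracking.
random.seed(42)
minutes_by_variant = {"control": [], "treatment": []}
for i in range(10_000):
    variant = assign_variant(f"user-{i}", "feed-layout-test")
    base = random.gauss(12.0, 4.0)                 # baseline minutes on site
    lift = 0.5 if variant == "treatment" else 0.0  # hypothesised effect
    minutes_by_variant[variant].append(max(0.0, base + lift))

for variant, minutes in minutes_by_variant.items():
    print(f"{variant}: mean {statistics.mean(minutes):.2f} minutes "
          f"across {len(minutes)} users")
```

Ship whichever variant keeps people around longer, repeat indefinitely, and you are running behavioural experiments on your users - just without publishing the results.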
Still, although the experiment was probably legal, people are rightly uncomfortable at the idea that their emotional responses were secretly manipulated.
Even if they agreed to (or even read) Facebook's 9,000-word data use policy, they didn't give their informed consent to the experiment. Nor did Facebook explain exactly why it was interested.
But we shouldn't be worried about this particular study in emotional manipulation: it was relatively (and eventually) transparent, and it perhaps helped our scientific understanding.
The terms used aren't particularly helpful: "emotional contagion" sounds like harmful mind control, but it's a well-defined term in social psychology.
And it may be on a "massive scale", but that's because Facebook's billion-person reach is massive.
But we should be worried about the constant, less accountable experiments that all the web giants are running.
Many aspects of our lives are affected by these black-box algorithms - we have no real idea how they work, or how they affect us.
Mr Kramer also wrote: "In hindsight, the research benefits of the paper may not have justified all of this anxiety."
He's wrong. If his paper makes us consider our relationships with these ubiquitous platforms, the benefits will be huge.