Babbage | Facebook

Emotional issues

Facebook's secret manipulation of users' emotions sparks controversy

By M.G. | SAN FRANCISCO

YOUR correspondent just took a look at his News Feed on Facebook, where the social network displays news, videos and pictures from users and their friends. There were some beautiful photos posted by pals vacationing in Moscow; wedding news from a friend who had just got hitched in Switzerland; a heartwarming story about a friend’s grandmother’s 70th birthday; and a tale from a fellow Economist journalist in Asia who was having fun listening to a cab driver sing along to a mini karaoke machine he’d installed in his cab. The only negative note was sounded by a friend bemoaning Mexico’s defeat in its World Cup soccer match against Holland.

Normally, he wouldn’t think twice about why he was seeing this particular mix of posts, which was largely full of upbeat news. But reports that Facebook has in the past manipulated what some users see in their News Feeds, in order to test whether it can influence their emotions, have him wondering whether he is being deliberately toyed with.

The furore has blown up around a paper recently published online in the Proceedings of the National Academy of Sciences. It reports on an experiment in which Facebook altered the News Feed content that nearly 690,000 users saw during one week in January 2012. The researchers sought to learn whether the content of those feeds influenced users’ emotional states. The test’s subjects were chosen at random but were not notified.

During the week in question all of the News Feed posts were genuine, but some people in the sample group saw posts that were largely positive, while others saw posts that were largely negative; Facebook’s software simply influenced which posts appeared. The experiment, which was conducted by a Facebook data scientist and two outside researchers, covered some 3m posts containing 122m words.

After the week was up and the deliberate manipulation of feeds ended, the researchers found that those who had seen a preponderance of upbeat news over the period were more likely to produce upbeat posts of their own in the days that followed. Those who had been exposed to more negative news tended to produce posts that were more downbeat in tone. According to the researchers, this showed “experimental evidence for massive-scale contagion via social networks”.

The news that friends can influence how we feel is hardly earth-shattering. Nor has Facebook made any secret of the fact that it manipulates what we see in News Feed, including the videos that are served up. Still, the news that the social network is willing to support broad-based research that deliberately manipulates people’s emotions is disturbing.

First, it raises questions about just how far Mark Zuckerberg, Facebook’s boss, and his colleagues are willing to let such experiments go. The fact that Facebook now has well over a billion users puts it in a powerful position to influence vast numbers of people. The research paper notes that the experiment had only a modest influence on user behaviour. But it also points out that if the same technique had been applied to the entire universe of Facebook users at the start of 2013, it could have influenced “hundreds of thousands of emotion expressions in updates per day”.

Another reason this experiment crossed the creepy line is that it was conducted without the knowledge of the users whose feeds were being manipulated. When people sign up to Facebook, they agree to terms of service that specifically allow the company to conduct research using their data without seeking their permission. But most people would probably object if those terms made clear that some of that research would be aimed at manipulating the way they feel. Facebook could have advertised for users willing to take part in this kind of research, but it chose not to.

Even the Facebook researcher involved has conceded that the fallout from the research has been very damaging. In a post on Facebook about the experiment, Adam Kramer says:

I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefit of the paper may not have justified all this anxiety.

Note that Mr Kramer doesn't admit that the entire approach to the research was wrong-headed. He says only that the way the paper described the research was a mistake, which is rather startling given the scale of the backlash. He goes on to say in his post that Facebook has since been “working on improving” its internal review practices and that it has “come a long way” since the experiment was conducted. But he fails to spell out what Facebook’s current policy is and how it would affect a decision to conduct a similar experiment today. Until the social network clarifies its stance, emotions on this issue are likely to run high.

(Photo credit: JOSH EDELSON / AFP)
