Backlash at Facebook over mood study

Hot Stock Minute

Think those status updates organically appear on your Facebook (FB) news feed?

Think again.

The company is facing backlash after the revelation that it was selective about which updates appeared in certain users' news feeds in order to see how they would react. Back in January 2012, the company manipulated 689,003 users' news feeds for an academic study to “test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed.” The altered news feeds showed either more positive or more negative posts, to see what kind of updates the users then wrote themselves.

The users, who were chosen at random, did not give their consent to participate in this particular study and weren't made aware of the experiment. Facebook said users give consent to have their data used in analysis and research when they agree to the site's terms of service. The company also said it conducts research to give users a better experience.

In response to the uproar, the lead Facebook researcher on the experiment, Adam D.I. Kramer, explained its motivation in (what else?) a Facebook post. He also said, “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.”

“In hindsight, the research benefits of the paper may not have justified all of this anxiety,” he said.

The results of the study, by the way, found that users wrote more positive updates after seeing positive ones, and wrote more negative updates after seeing negative ones.

Yahoo Finance Senior Columnist Michael Santoli said that this study is a message to Facebook users that their news feed isn’t a pure, unadulterated flow of what comes through from friends and advertisers. “Their entire news feed is an engineered thing,” he said.

However, Santoli doesn’t think this uproar will change anything for Facebook. In the past, when the company appeared to make a major misstep by changing settings or disclosures for users, the controversy eventually faded. “It’ll go away this time,” he said.

We want to know what you think. Will you leave Facebook after hearing about the company's emotion study? Vote in our poll, or leave us a comment below or on Twitter.