Facebook continues to give users reasons to dislike the tech giant. Over the weekend news broke that the social media site had manipulated the news feeds of nearly 700,000 users without their knowledge.
Facebook undertook an experiment in 2012 to test the notion that users feel worse when they see lots of positive news from their "friends." The experiment involved reducing the number of positive posts in some users' news feeds and reducing the number of negative posts in others'. The study found that the more positive posts a user saw, the more positive the user's own postings became, and vice versa.
The results were published in the latest issue of the Proceedings of the National Academy of Sciences of the United States of America. Then on Sunday, the experiment's lead researcher, Adam Kramer, posted an apology on his Facebook page:
"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."
He added that Facebook, "has been working on improving our internal review practices" and "will incorporate what we’ve learned from the reaction to this paper" in that review.
Judging by the comments circulating on the Internet, Facebook may want to do that sooner rather than later.
"If Facebook was willing to allow this experiment ... what else might it allow in the future?" asks Alex Wilhelm at TechCrunch.
"This research may tell us something about online behavior, but it's undoubtedly more useful for, and more revealing of, Facebook's own practices," Jacob Silverman, author of Terms of Service: Social Media, Surveillance, and the Price of Constant Connection, told Wired magazine.
Yahoo Finance senior columnist Rick Newman says, "There's nothing new about Facebook manipulating its users... Some people are overreacting but this was an interesting case study in how they should disclose things to users a little differently. That might be [a] lesson learned for Facebook."
He suggests in the video above that Facebook broadcast its experiments beforehand. "If Facebook had said ahead of time, 'By the way, we’re going to conduct an experiment; it will involve about this many people... randomly selected... no one will be targets... and we'll tell you what we found,' I think there would be no problem there and that's probably what's going to happen in the future."