Here’s what you need to know about that Facebook experiment that manipulated your emotions

In case you missed it, a storm of controversy blew up around Facebook over the weekend, after news emerged of a psychological study conducted on hundreds of thousands of users of the social network without their knowledge, in which researchers manipulated the emotional cues those users saw in their Facebook streams and then tracked their subsequent behavior. Reactions to the news ranged from resignation, and even acceptance, to disgust at the company’s decision and at the unethical practice of altering the emotional state of hundreds of thousands of users without their permission.

What did the experiment involve?

As The Guardian explains, the study was done by a team of researchers that included a data scientist working for Facebook. It blocked specific kinds of emotional content from the news feeds of 689,003 people, or about 0.04 percent of Facebook’s total user base, for a week in January of 2012. The study hid “a small percentage” of emotional words from people’s streams — without their knowledge — in order to see if doing so had any effect on the statuses they posted or the content they “liked” or shared.

The research was published in the June issue of a prominent scientific journal (the Proceedings of the National Academy of Sciences, or PNAS) and was written up in New Scientist magazine. The journal’s editor said the data analysis was approved by a review board at Cornell University, but that the actual collection of the data was only approved by an internal Facebook review. There were early reports that the study was partially funded by the research office of the U.S. Army, but that appears not to be the case.

Does everyone who works at Facebook just have the "this is creepy as hell" part of their brain missing?
sarah jeong (@sarahjeong) June 28, 2014

What do the results of the study show?

According to the researchers, including Facebook data scientist Adam Kramer, the experiment showed that users’ emotions were in fact reinforced, at least to some extent, by what they saw in their Facebook news feed, a phenomenon the research team called “emotional contagion.” They said this provides support for the idea that emotional states can be transferred to others without their awareness, and that this process can occur “without direct interaction between people — exposure to a friend expressing an emotion is sufficient” and without non-verbal cues:

“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

Why is this controversial?

Scientists and other experts have criticized the fact that Facebook conducted the research, which appears to have successfully manipulated the emotions of users, without informing those users that they were being used in an experiment and without giving them the ability to opt out. Law professor James Grimmelmann says that this lack of “informed consent” is a breach of the ethical standards that typically govern such research, since the study arguably harmed participants (even if only in a minor way) by altering their mood, and did so without their permission. Grimmelmann added that “this is bad, even for Facebook.”

Impressive achievement by Facebook to snatch back the title of most dystopian nightmarish tech company.
Tom Gara (@tomgara) June 29, 2014

Sociologist Zeynep Tufekci of the University of North Carolina, who specializes in the effects of social media, wrote in a post on Medium — and in a related research paper that was just accepted for publication — that what is at stake isn’t just the status of a single piece of research involving the Facebook news feed, but the potential for much more invasive and disturbing uses of the data that users are providing to such networks.

“These large corporations (and governments and political campaigns) now have new tools and stealth methods to quietly model our personality, our vulnerabilities, identify our networks, and effectively nudge and shape our ideas, desires and dreams. That is one of the biggest shifts in power between people and big institutions, perhaps the biggest one yet of 21st century. We should care that this data is proprietary, with little access to it by the user, little knowledge of who gets to purchase, use and manipulate us with this kind of data.”

In his post, entitled “As Flies to Wanton Boys…,” James Grimmelmann says that the argument that “Facebook already advertises, personalizes, and manipulates is at heart a claim that our moral expectations for Facebook are already so debased that they can sink no lower. I beg to differ.” British journalist Laurie Penny said in a post at The New Statesman:

“Nobody has ever had this sort of power before. No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively. There are no precedents for what Facebook is doing here. Facebook itself is the precedent.”

What is Facebook’s defense?

In a nutshell, Facebook has argued that it was entitled to conduct the study because its usage policies include a line that refers to the data supplied by users potentially being used for research (Note: According to Kashmir Hill at Forbes magazine, Facebook didn’t add the line about research until after the emotional contagion study was completed). The editor of the scientific journal in which it appeared said that the review board believed it was justified “on the grounds that Facebook apparently manipulates people’s News Feeds all the time.”

The FB furore ought to surprise no one. It manipulates its news algorithm all the time – part of a study or not: slate.com/articles/healt…
emily bell (@emilybell) June 29, 2014

Not everyone buys this explanation, however: Max Masnick, a researcher with a doctorate in epidemiology, says in a post of his own on Facebook that the structure of the experiment means there was no informed consent, something that is a crucial element of any study that involves research on humans. “As a researcher, you don’t get an ethical free pass because a user checked a box next to a link to a website’s terms of use,” he said.

A statement from a Facebook spokesperson said the research “was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process.”

What do the researchers say?

Kramer, the lead Facebook scientist involved in the research, posted a lengthy defense of the study on Facebook, saying it was done in part because “we care about the emotional impact of Facebook and the people that use our product.” He said that he and his colleagues believed it was important to investigate the theory that “seeing friends post positive content leads to people feeling negative or left out.” Kramer did say that he and his co-authors were sorry for any anxiety their paper may have caused, and admitted that “in hindsight, the research benefits of the paper may not have justified all of this anxiety.”

What do supporters of Facebook think?

Some argue that the research shouldn’t be that controversial, since Facebook manipulates the news feed of its users all the time — by tweaking the algorithms that highlight certain kinds of content, including trying to de-emphasize “low quality” content from viral-sharing mills and promote “high quality” content from news outlets. Others, including venture capitalist Marc Andreessen, say what Facebook did isn’t really that different from the kind of A/B testing that software companies and even media companies engage in all the time.

Run a web site, measure anything, make any changes based on measurements? Congratulations, you're running a psychology experiment!
Marc Andreessen (@pmarca) June 28, 2014

Tal Yarkoni, a researcher in psychology at the University of Texas, noted in a post that the amount of fiddling that the Facebook research team engaged in was extremely small: “These effects, while highly statistically significant, are tiny. The largest effect size reported had a Cohen’s d of 0.02 — meaning that eliminating a substantial proportion of emotional content from a user’s feed had the monumental effect of shifting that user’s own emotional word use by two hundredths of a standard deviation. In other words, the manipulation had a negligible real-world impact on users’ behavior.”
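
For context on what an effect that small looks like in practice, here is a minimal sketch of how a standardized effect size like Cohen’s d is computed from two groups of scores. The variable names and numbers below are illustrative assumptions for a made-up dataset, not the study’s actual data.

```python
# Minimal sketch: Cohen's d is the difference in group means divided by the
# pooled standard deviation. All numbers here are made up for illustration;
# they are NOT the Facebook study's data.
import numpy as np

def cohens_d(group_a, group_b):
    """Standardized mean difference between two independent groups."""
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)
    n_a, n_b = len(a), len(b)
    pooled_var = ((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1)) / (n_a + n_b - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical "percentage of positive words per post" for two groups,
# with the treated group's mean shifted by 0.02 standard deviations.
rng = np.random.default_rng(0)
control = rng.normal(loc=5.0, scale=2.0, size=100_000)
treated = rng.normal(loc=5.04, scale=2.0, size=100_000)
print(round(cohens_d(treated, control), 3))  # roughly 0.02
```

Under those assumptions, a d of about 0.02 means the two groups’ averages differ by far less than the ordinary spread within either group, which is the point Yarkoni is making about real-world impact.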

Yarkoni also argues that just because some change in behavior was seen after the manipulations doesn’t necessarily mean that the emotional state of those users was actually altered: “The fact that users in the experimental conditions produced content with very slightly more positive or negative emotional content doesn’t mean that those users actually felt any differently. It’s entirely possible — and I would argue, even probable — that much of the effect was driven by changes in the expression of ideas or feelings that were already on users’ minds.”

@Asher_Wolf @mathewi I don't really accept the premise that words posted to Facebook equal someone's mood, is all I'm saying
Rusty Foster (@rustyk5) June 29, 2014

Michelle Meyer, a fellow at the Health Law Policy Center at Harvard Law School, argues in a post that — contrary to what many academics have said — the Facebook study was not necessarily unethical, since it involved minimal harm, if any, to users, and that even if it had been reviewed by an academic research board before it was conducted, it might very well have been approved.

What should Facebook do?

Kashmir Hill, who writes about privacy for Forbes magazine, argues that in addition to possibly apologizing to users for not asking their permission, the social network should have some kind of explicit opt-in process for such research, the way other services do:

“When I signed up for 23andMe — a genetic testing service — it asked if I was willing to be part of 23andWe, which would allow my genetic material to be part of research studies. I had to affirmatively check a box to say I was okay with that. I think Facebook should have something similar. While many users may already expect and be willing to have their behavior studied… they don’t expect that Facebook will actively manipulate their environment in order to see how they react. That’s a new level of experimentation, turning Facebook from a fishbowl into a petri dish.”

Paul Bernal, a lecturer in intellectual property and media law, suggests in a tongue-in-cheek post that Facebook should change its user policies to include language that requires users to agree that “by using Facebook, you consent to having your emotions and feelings manipulated, and those of all your friends (as defined by Facebook) and relatives, and those people that Facebook deems to be connected to you in any way. The feelings to be manipulated may include happiness, sadness, depression, fear, anger, hatred, lust and any other feelings that Facebook finds itself able to manipulate. Facebook confirms that it will only manipulate those emotions in order to benefit Facebook, its commercial or governmental partners and others.”

If Facebook does choose to apologize for its behavior, it will join a long list of apologies the social network has made during its relatively brief history, a list that Mike Elgan has helpfully compiled.

Post and thumbnail images courtesy of Shutterstock / Dmitry Shiror
