The Facebook data scientist who served as lead author of a controversial study that set out to manipulate the emotional states of 689,003 Facebook users, sparking outrage and a federal complaint last week, lists his current city as San Francisco on his Facebook page. He also calls himself “Danger Muffin.”
Adam D.I. Kramer, who works for Facebook’s core data science team, conducted the research in partnership with two co-authors he publicly described as friends – Jeffrey Hancock, a communication and information science professor at Cornell University, and Jamie Guillory, a postdoctoral scholar previously at Cornell and now affiliated with the University of California San Francisco.
The trio’s research, conducted over one week in January 2012, sought to determine whether Facebook users would be emotionally affected by exposure to positive or negative content in their news feeds. They published their findings, edited by a psychology researcher from Princeton University, in the journal Proceedings of the National Academy of Sciences under the title “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks.”
“Emotional states can be transferred to others via emotional contagion,” the study notes, “leading people to experience the same emotions without their awareness.” A press release issued by Cornell in mid-June hailed the experiment as “the first to suggest that emotions expressed via online social networks influence the moods of others.”
People exposed to more negative content “used more negative words in their status updates,” Hancock explained in the Cornell press statement, while “significantly more positive words were used” by users who saw an increase in positive content.
In a 2011 video on internal experimentation using Facebook data, Kramer gives an in-depth presentation on how users’ total word sets – everything they’ve ever posted on Facebook – can be digitally analyzed with a matrix that can ultimately show “how users differ from each other.”
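The video doesn’t detail the underlying code, but the general idea Kramer describes – treating each user’s cumulative posts as a vector of word counts and comparing those vectors – can be sketched in a few lines of Python. Everything below (the users, the posts, the cosine comparison) is a hypothetical illustration, not Facebook’s actual pipeline.

```python
# Hypothetical illustration: a toy user-by-word count matrix, not Facebook's code.
from collections import Counter

import numpy as np

# Made-up users and posts standing in for each user's "total word set."
posts = {
    "user_a": "feeling great today love this sunny day",
    "user_b": "terrible news again feeling sad and tired",
}

# Shared vocabulary and a (users x words) count matrix.
vocab = sorted({word for text in posts.values() for word in text.split()})
counts = np.array(
    [[Counter(text.split())[word] for word in vocab] for text in posts.values()],
    dtype=float,
)

# Cosine similarity between users' word vectors: lower values mean
# the users' posting vocabularies differ more from each other.
unit = counts / np.linalg.norm(counts, axis=1, keepdims=True)
print(unit @ unit.T)
```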
The emotional contagion study has sparked a major backlash, prompting the Electronic Privacy Information Center to file a formal complaint with the Federal Trade Commission accusing Facebook of engaging in deceptive trade practices. As EPIC put it, “the company purposefully messed with people’s minds.”
Julia Horwitz, EPIC’s consumer protection counsel, noted that Facebook signed a consent order at the direction of the FTC in 2012 following “a several-year investigation about other data sharing charges.”
As a result, “Facebook is now under this consent order that requires it to comply with various data protection provisions,” meant to safeguard the information that its users provide. “Facebook’s use of the information submitted into the data feeds, that was then processed through the psychological manipulation algorithm, is a violation of the consent order,” Horwitz explained.
To comply with the federal order, the company should have solicited express consent from users granting permission to be subjected to experimentation, she noted.
“One of the things we are hoping to gain from this complaint,” Horwitz added, “is to have Facebook publicize the news feed algorithms, so that users can understand the basis by which they’re given information.”
Jaron Lanier, author of Who Owns the Future?, railed against Facebook in a New York Times op-ed for its recklessness in experimenting on people’s emotional states, saying:
“The manipulation of emotion is no small thing. An estimated 60 percent of suicides are preceded by a mood disorder. Even mild depression has been shown to increase the risk of heart failure by 5 percent; moderate to severe depression increases it by 40 percent.
“Research with human subjects is generally governed by strict ethical standards, including the informed consent of the people who are studied. … The subjects in the study still, to this day, have not been informed that they were in the study. If there had been federal funding, such a complacent notion of informed consent would probably have been considered a crime. Subjects would most likely have been screened so that those at special risk would be excluded or handled with extra care.”
While Facebook seems to be bearing the brunt of public outrage over the study, the social media giant’s partnership with the academic sector has also raised questions. Guillory became affiliated with UCSF only after her involvement with the study, and in the angry aftermath of the experiment’s publication, Cornell has sought to distance its researchers from the controversy.
In an official statement, Cornell noted, “Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. … Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any individual, identifiable data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.”
Syl Kacapyr, a Cornell spokesperson, forwarded the canned statements to the Bay Guardian and said none of the study’s authors would be granting media interviews. Nevertheless, we reached out to Kramer and Guillory individually to request interviews. If we hear back, we’ll update this post.
Kramer, a.k.a. Danger Muffin, did publicly address the study on his Facebook page.
“The reason we did this research,” he wrote, “is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
He went on to note that the research “very minimally” deprioritized News Feed content, adding that “we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses [it].”
“The goal of all of our research at Facebook is to learn how to provide a better service,” Kramer concluded. “Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”