Special Report: Social Media Study Faces Storm
Firestorm erupts over Facebook study’s social media manipulation
Should an IRB have reviewed the study?
When the Proceedings of the National Academy of Sciences (PNAS) published a study about social contagion on Facebook in June 2014, reaction was fast and furious. By the end of July, an online search for "criticism of Facebook’s social contagion study" returned 6.3 million results.
Some scientists, IRB members, ethicists, and others blasted Facebook, PNAS, and Cornell University for what they characterized as the study’s deception, lack of IRB review and informed consent, and flawed methodology. Some called for PNAS to retract the article. Meetings and discussions were held.
PNAS published an "Editorial Expression of Concern," stating that the paper represents "an important and emerging area of social science research that needs to be approached with sensitivity and with vigilance regarding personal privacy issues."1
Still, questions from the human subjects protection perspective remained unanswered: Did the study violate human subjects protection regulations and, specifically, the requirement for informed consent? And was the study unethical? (See story about the Facebook study and the Common Rule, page 99.)
A reaction to what some called "vitriolic criticism" arrived by July. Ethicist Michelle N. Meyer, joined by 33 professors and researchers, wrote in Nature that such a strong reaction to the study could have a chilling effect on valuable research and perpetuate the myth that research is dangerous.2
"I’ve seen someone say that this is the information equivalent of Tuskegee, and I do worry about making more of the issue than is necessary," says Eric Meslin, PhD, director of the Indiana University Center for Bioethics in Indianapolis. Meslin was one of the signatories on Meyer’s paper.
"I would imagine that virtually every social platform is doing something very similar," Meslin says. "We should be careful about poking a finger into Facebook’s eye because it could drive research underground and make it more secretive."
Since Facebook is a private entity that receives no federal research funding and does not plan to apply for a new drug, biologic, or medical device application, it’s not required to follow federal regulations regarding human subjects research. However, the fact that two of the study’s researchers are from Cornell University, which is subject to federal rules, creates a gray area with regard to whether the study should have had a full IRB review and used informed consent, experts say.
The Facebook study found that emotional states expressed on social media can be transferred to others through emotional contagion. For one week in 2012, Facebook manipulated the news feeds of a random group of English-speaking users, reducing the amount of positive emotional content some users saw and reducing the amount of negative emotional content others saw. The manipulation did not affect direct messages sent between users. Investigators then used Linguistic Inquiry and Word Count (LIWC) software to identify correlations between the emotional content of a user’s news feed and the positive or negative words that user subsequently posted.3
The study concluded that people who had a greater proportion of negative content used fewer positive words, and people with more positive content used more positive words.3
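For readers unfamiliar with this kind of analysis, the sketch below illustrates dictionary-based word counting in the general spirit of LIWC. The word lists and example posts are invented for illustration only; LIWC’s actual dictionaries are proprietary and far more extensive, and this is not the code the researchers used.

```python
# Simplified, illustrative example of dictionary-based word counting in the
# general spirit of LIWC-style analysis. The word lists below are tiny,
# made-up stand-ins for LIWC's proprietary categories.

POSITIVE_WORDS = {"happy", "great", "love", "excited", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "lonely", "awful"}


def emotion_word_rates(post):
    """Return the fraction of positive and negative words in a status update."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return pos / len(words), neg / len(words)


# Hypothetical posts from two groups of users: one whose feeds had positive
# content reduced, one whose feeds had negative content reduced.
groups = {
    "positive content reduced": ["Feeling lonely and sad today",
                                 "What a terrible week"],
    "negative content reduced": ["So excited, what a wonderful day",
                                 "Love this great news"],
}

for label, posts in groups.items():
    rates = [emotion_word_rates(p) for p in posts]
    avg_pos = sum(r[0] for r in rates) / len(rates)
    avg_neg = sum(r[1] for r in rates) / len(rates)
    print(f"{label}: positive word rate {avg_pos:.2f}, negative word rate {avg_neg:.2f}")
```

The published study compared word rates of this kind across the experimental groups; any real analysis would require validated dictionaries and access to Facebook’s data.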
Facebook researcher Adam D. I. Kramer posted a response to questions about the study on his Facebook page.
"The goal of all of our research at Facebook is to learn how to provide a better service," Kramer wrote. "Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone."
While some people believe Facebook’s study represented a gross violation, others believe its manipulation of users’ news feeds was relatively minor and probably something the company has been doing all along as a non-research activity, says John R. Baumann, PhD, assistant vice president for research compliance at Indiana University.
"They probably look at patterns where negative comments lead to more negative comments and what sorts of ads get clicked after comments," Baumann says. "Facebook and advertisers place ads strategically on the Internet."
The average social media user might not be aware of how much data Facebook is collecting at any given moment, says Elizabeth A. Buchanan, PhD, endowed chair in ethics and director of the Center for Applied Ethics at the University of Wisconsin-Stout in Menomonie. Buchanan is also associate editor of the Journal of Empirical Research on Human Research Ethics.
"That’s part of their platform to get research and understand users, so I’m surprised that we’re surprised in a way," she adds.
The controversy over the Facebook study is the first time the research community has had a broad discussion about social media, informed consent, and research ethics, she notes.
"This is the new normal: seeing that collision and collapse of boundaries in research, and that’s what is so interesting about this particular case," Buchanan says.
"While we have the age-old ethics issues of consent and privacy, this represents a new way of research and new way of thinking about our participation in research," she explains. "Much of it involves an unintentional, nonconsensual environment in which we are operating."
As a new medium, social media has already raised many questions about what constitutes informed consent, notes Miguel Roig, PhD, professor of psychology at St. John’s University in Staten Island, NY.
"I don’t know if you could have done this study with traditional informed consent, but there are certain things a researcher can do," Roig says. "If the need for the study is such that it’s inappropriate to get informed consent, then what you do is a debriefing with subjects."
"What are people’s expectations of privacy as these things evolve?" Roig says. "This case will sensitize IRBs to these kinds of situations and more specifically to situations where researchers submit protocols in which they only analyze data."
Roig refers to the study’s three authors: Kramer, a member of Facebook’s core data science team, and two investigators from Cornell University.
"Professor [Jeffrey] Hancock and Dr. [Jamie] Guillory did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper," according to a media statement from Cornell published June 30, 2014.
One lesson might be that when research institutions are approached to work with existing data, they might ask more questions about how those data were collected, Roig says.
"We don’t want to be involved with data collected unethically," he adds. "I think if anything, this study will re-sensitize IRBs to similar situations."
Current human subjects research regulations were created 40 years ago — long before electronic social media existed, and their age is showing, says John H. Noble, Jr., PhD, MA, MSW, a retired researcher who attended meetings of the Belmont Report group and has been a registered member of the IRB Forum for over four decades.
"We’re at the cusp of where to go as a system that has been developed and worked out on a 1970s model," Noble says. "It’s showing signs of wear and unwieldiness — even in traditional research."
1. Verma IM. Editorial expression of concern: Experimental evidence of massive-scale emotional contagion through social networks. PNAS. 2014;111(29):10779. Available at: http://www.pnas.org/content/111/29/10779.1.short.
2. Meyer MN, Lantos J, London AJ, et al. Misjudgements will drive social trials underground. Nature. 2014;511:265.
3. Kramer AD, Guillory JE, Hancock JT. Experimental evidence of massive-scale emotional contagion through social networks. PNAS. 2014;111(24):8788-8790.