Updated at 5:45 p.m. PST to note that Professor Michelle N. Meyer agrees with Cornell that its IRB was not required by OHRP policy to review the study.
Cornell University claims it’s not “it” in the controversy over Facebook’s mood-manipulation experiments.
Today, the Ivy League school tried to distance itself from the ordeal, saying in a statement that the university didn’t review the mood-manipulation experiment that Facebook is taking heat over.
“Because the research was conducted independently by Facebook and Professor Hancock had access only to results—and not to any data at any time—Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required,” the statement said. Cornell’s Institutional Review Board approves, monitors, and reviews “human subjects research.”
The media quickly jumped to scrutinize the research following initial reports over the weekend. It stands out because it’s one of the few such studies on Facebook to receive any criticism (of course, that’s what happens when you set out to intentionally make people sad on your social network).
In 2012, for instance, Facebook published a paper after running “an experiment involving 61 million users of the social network.” In the research, the company sent out “a single election-day Facebook message” and encouraged about 340,000 more people to vote in the 2010 U.S. Congressional elections.
Meanwhile, as VentureBeat reported earlier this month, Facebook is planning to hold an academics-only conference in advance of the American Sociological Association 2014 annual meeting this August in San Francisco. One of the event organizers, University of Chicago sociologist Michael Corey, works with Facebook’s growth research team, which focuses on expanding Facebook’s reach into developing countries and helping it sell more ads.
Facebook data scientist Adam Kramer, an author of the paper about the mood-manipulation research — along with Cornell professor Jeff Hancock and former Cornell doctoral student Jamie Guillory — released a statement on his own Facebook page yesterday indicating that internal review practices were in place at Facebook.
Now Cornell is attempting to minimize its role in the tussle.
However, Cornell’s claim that “the research was conducted independently by Facebook and Professor Hancock had access only to results” is questionable.
“I disagree with the Cornell IRB’s statement that its affiliates analyzed results ‘from previously conducted research by Facebook,’” Michelle N. Meyer, an assistant professor and director of bioethics policy at Union Graduate College, wrote in an email to VentureBeat.
“They participated in ‘initial discussions,’ as the IRB put it, and helped design the study, as the PNAS paper put it, that helped shape the new dataset that emerged. That should be viewed as part of research, not something separate that preceded it.”
But according to Professor Meyer, “the Cornell affiliates’ particular contributions to the study did not ‘engage’ Cornell, the institution, in research, as that term has been interpreted in non-binding guidance by OHRP—an interpretation that Cornell’s IRB had previously adopted as institutional policy—and, hence, that IRB review was not required at Cornell.”
And there still could be issues even if the Cornell IRB was right.
“If the researchers from Cornell were involved in designing the study, even if they did not personally collect that data, they are partly responsible for the way the data were collected. This involved deception without debriefing,” Laura Nelson, a Ph.D. candidate in sociology at the University of California, Berkeley, wrote in an email to VentureBeat.
VentureBeat reached out to Cornell about Prof. Hancock’s role in the design of the research and whether that role constitutes “data collection.” But the university had “no additional information to share at this time beyond what is contained in the statement,” a spokesman wrote in an email to VentureBeat.