Facebook angered many people last week when news broke that the social media giant performed an extensive mood manipulation study on its News Feed two years ago — without consent from 700,000 users. Now, it looks like Facebook didn't have approval from anyone: A Cornell University ethics board didn't approve the Facebook mood study until after it was conducted, leading many to question the ethical grounds of the research.
The social media company also allegedly lied about having "implied" permission from its users, as the platform's user agreement contract was different at the time of the study.
Conducted in January 2012, the mood manipulation study tweaked the Facebook news feeds of roughly 700,000 users over a one-week period. The researchers, including Facebook data scientist Adam Kramer, Cornell professor Jeffrey Hancock and Cornell doctoral student Jamie Guillory, adjusted the balance of positive or negative posts in users' news feeds to see how the altered feeds would affect users' emotions.
According to The Washington Post, Cornell University's Institutional Review Board — an independent ethics board — initially said it approved the Facebook mood study. However, the Cornell ethics board released a statement on Monday with a slightly different take:
Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.
But that's not all that was revealed this week. Facebook claimed users gave their consent through the Data Use Policy, which states the company may use information from its users for "data analysis, testing and research." However, Forbes pointed out that this policy hasn't always been in place. In fact, the policy was put in place in May 2012 — four months after the mood manipulation study occurred.
According to Forbes, the new Data Use Policy differed greatly from the previous policy introduced in September 2011; that policy never stated that user information would be used for research.
On June 29, Adam Kramer, the Facebook researcher who designed the experiment, defended his study and the company's position in a post on his Facebook page. However, he also issued an apology to Facebook users who were distressed over the unknown manipulation of their news feeds:
The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.
Kramer added that research policies at Facebook have changed since January 2012, and the social media company is improving its review practices. "Those review practices will also incorporate what we’ve learned from the reaction to this paper," Kramer said.