You Consented to Facebook’s Social Experiment

Why Facebook was able to toy with users' emotions.

June 30, 2014 -- “Like” it or not, you consented to Facebook’s sociological experiment.

You’ve probably already heard about the controversial study that manipulated the Facebook feeds of nearly 700,000 users to study the social network’s emotional impact. But as far as the researchers were concerned, everyone agreed to participate by virtue of having a Facebook account.

The researchers from Facebook and Cornell University never read any posts themselves, but they did alter the news feed algorithm for 689,003 users to skew how many positive or negative posts each person saw. They then tracked those users’ subsequent posts by scanning them for positive and negative keywords.
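The tracking itself amounts to simple word counting; the published study used the Linguistic Inquiry and Word Count (LIWC) software for this step. The sketch below, with made-up word lists, illustrates the general keyword-tallying technique rather than Facebook’s actual code:

```python
# Rough sketch of keyword-based sentiment tallying, the general technique
# the study describes. These word lists are invented for illustration; the
# actual study relied on LIWC's dictionaries of positive and negative words.

POSITIVE = {"happy", "love", "great", "nice", "fun"}
NEGATIVE = {"sad", "angry", "hate", "awful", "hurt"}

def score_post(text: str) -> tuple[int, int]:
    """Return (positive_count, negative_count) for one post."""
    words = text.lower().split()
    pos = sum(1 for w in words if w in POSITIVE)
    neg = sum(1 for w in words if w in NEGATIVE)
    return pos, neg

if __name__ == "__main__":
    pos, neg = score_post("had a great day so happy to see everyone")
    print(f"positive words: {pos}, negative words: {neg}")
```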

“As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research,” read the study, which was published in the Proceedings of the National Academy of Sciences.

Here’s the line in the policy that Facebook says allowed it to manipulate the feeds without telling users before, during or after the study, which ran for one week in January 2012:

“[I]n addition to helping people see and find things that you do and share, we may use the information we receive about you…for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

This is where the “ethical conundrum” begins, said bioethicist and lawyer Leslie Meltzer Henry, who works at the Johns Hopkins Berman Institute of Bioethics and the University of Maryland Carey School of Law.

“The type of one-click consent that Facebook users provide when they agree to the site’s data use policy is very different from the type of informed consent that’s ethically and legally required” of most biomedical or behavioral studies, said Meltzer Henry, adding that one-click-consent “is inadequate to cover the potential harm” that can come from participation in a study that involves emotional manipulation.

Institutions that receive federal funding are required to abide by a federal policy called the “Common Rule,” which protects human experiment subjects by ensuring that they know about the study and that they understand the risks involved. It also requires institutional review boards at universities and hospitals to approve the way subjects of biomedical or behavioral studies are treated.

Although Facebook is a private company that receives no federal research funding, and is therefore not subject to the Common Rule, two of the three researchers who designed the study worked at Cornell University, which is subject to it. And the journal that published the study, the Proceedings of the National Academy of Sciences, also requires studies to go through institutional review boards.

Susan Fiske, the Princeton University psychology professor who edited the study for the journal, said she too was concerned about the ethics of the study, but she put her faith in Cornell’s and Facebook’s institutional review boards.

“According to the authors’ revision cover letter to me, the Cornell [institutional review board] classified the research as exempt because it was a ‘pre-existing dataset’ (presumably, Facebook’s anonymized data),” Fiske wrote in an email to ABC News.

“Having chaired an [institutional review board] for a decade and having written on human subjects research ethics, I judged that [Proceedings of the National Academy of Sciences] should not second-guess the relevant [institutional review board],” she wrote.

ABC News asked Fiske for a copy of the Cornell cover letter but received an out-of-office reply.

Calls to Cornell were not immediately returned.

Meltzer Henry isn’t so sure the study should have been exempt from informed consent requirements.

“The problem here is that I don’t think this could possibly be a [pre-existing] dataset,” she said. “According to the [Proceedings of the National Academy of Sciences] article, all three authors were involved in designing the study and the news feed algorithm, which strongly suggests that the dataset did not exist before the research study, and therefore required informed consent.”

Lead researcher and Facebook data scientist Adam Kramer took to Facebook to defend the study over the weekend.

“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook,” Kramer wrote.

He went on to explain that the “actual impact on people” was the minimum needed to conclude that Facebook feeds influence users’ emotions. And though the researchers worried that friends’ happy news might make people feel worse, they found the opposite: users shown slightly more positive posts in their feeds went on to use more positive words in their own posts.

“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone,” he wrote in the post. “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”