The scientific journal that published Facebook’s controversial social experiment has issued a statement expressing “concern” over the lack of informed consent from Facebook users who unknowingly participated.
The study by researchers from Facebook and Cornell University sparked outrage this week after word spread that the Facebook feeds of nearly 700,000 users had been secretly manipulated to study the social network’s emotional impact. The researchers claimed the Facebook users consented to the study by, well, being Facebook users.
According to statements from the journal and Cornell, Facebook did not need the type of informed consent typically required for studies because it’s a private company and is therefore held to different standards than universities and scientific journals.
“It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out,” the journal’s editor-in-chief, Inder Verma, said in the statement.
According to a footnote in the study, researchers from Cornell joined a researcher from Facebook to design the study but sat out the controversial step of collecting and analyzing data from Facebook users. They then returned to write the paper.
“As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research,” read the study.
This is where the “ethical conundrum” begins, said bioethicist and lawyer Leslie Meltzer Henry, who works at the Johns Hopkins Berman Institute of Bioethics and the University of Maryland Carey School of Law.
“The type of one-click consent that Facebook users provide when they agree to the site’s data use policy is very different from the type of informed consent that’s ethically and legally required” of most biomedical or behavioral studies, said Meltzer Henry, adding that one-click-consent “is inadequate to cover the potential harm” that can come from participation in a study that involves emotional manipulation.
According to a Cornell statement, its researchers “analyzed results from previously conducted research by Facebook into emotional contagion among its users” and “did not participate in data collection and did not have access to user data.”
But Meltzer Henry isn't so sure the study should have been exempt from informed consent.
“The problem here is that I don’t think this could possibly be a [pre-existing] dataset,” she said before Cornell issued its statement. “According to the [Proceedings of the National Academy of Sciences] article, all three authors were involved in designing the study and the news feed algorithm, which strongly suggests that the dataset did not exist before the research study, and therefore required informed consent.”
Lead researcher and Facebook data scientist Adam Kramer took to Facebook to defend the study last weekend.
“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook,” Kramer wrote.
He went on to explain that the “actual impact on people” was the minimum needed to conclude that Facebook feeds influenced users’ emotions. Though the researchers expected that seeing happy news would make people feel sad, they found that people with slightly more positive news in their feeds included more happy words in their own posts.
“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone,” he wrote in the post. “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”