According to a Cornell statement, its researchers “analyzed results from previously conducted research by Facebook into emotional contagion among its users” and “did not participate in data collection and did not have access to user data.”
But Meltzer Henry isn't so sure the study should have been exempt from informed consent.
“The problem here is that I don’t think this could possibly be a [pre-existing] dataset,” she said before Cornell issued its statement. “According to the [Proceedings of the National Academy of Sciences] article, all three authors were involved in designing the study and the news feed algorithm, which strongly suggests that the dataset did not exist before the research study, and therefore required informed consent.”
Lead researcher and Facebook data scientist Adam Kramer took to Facebook to defend the study last weekend.
“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook,” Kramer wrote.
He went on to explain that the “actual impact on people” was the minimum needed to conclude that Facebook feeds influenced users’ emotions. Though the researchers had anticipated that happy news would make people feel sad, they found that users with slightly more positive news in their feeds included more happy words in their own posts.
“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone,” he wrote in the post. “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”