Have you been feeling sad lately? Does that mysteriously coincide with a sad newsfeed? It could be because you have unknowingly been involved in one of Facebook's most recent research projects: "Experimental evidence of massive-scale emotional contagion through social networks". Simply put, Facebook ran an experiment on 689,000 of its users and concluded that emotional states can be transferred to others via emotional contagion through the network itself, rather than in-person interaction, leading people to experience the same emotions without their awareness. (Apparently we get sad when we see sad things on the internet. Surprise, surprise, we actually have hearts.)


Conducted by researchers from both Facebook and Cornell University, this paper has given rise to an array of ethical questions and concerns, largely around the way the information was collected. After the research was published in the Proceedings of the National Academy of Sciences (PNAS), the journal's editor-in-chief, Inder M. Verma, published an Editorial Expression of Concern and Correction admitting that "questions have been raised about the principles of informed consent and opportunity to opt out" and that this is a matter of ethical concern, although he does state that the research was consistent with Facebook's policies at the time.

As a basic overview of the method: the research team used filtering software to adjust users' News Feeds, reducing either the positive or the negative emotional content that appeared. Even though this may have been consistent with Facebook's Data Use Policy at the time (those terms and conditions you never bother reading), none of the participants knew they were part of the experiment, let alone had the opportunity to consent or opt out.
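To make the mechanics a little more concrete, here is a rough, purely illustrative sketch of what emotion-based feed filtering might look like. The word lists, the `filter_feed` function, and the omission rate are my own inventions for illustration, not Facebook's code; the published study reportedly classified posts with word-count software and omitted a proportion of positive or negative posts from affected users' feeds.

```python
import random

# Illustrative word lists standing in for a real sentiment lexicon
# (hypothetical; the actual study used word-count software, not these lists).
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "lonely"}

def classify(post_text):
    """Label a post 'positive', 'negative', or 'neutral' by counting words."""
    words = [w.strip(".,!?").lower() for w in post_text.split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress="negative", omission_rate=0.5, seed=None):
    """Return a feed with a share of posts of one emotional type silently dropped."""
    rng = random.Random(seed)
    feed = []
    for post in posts:
        if classify(post) == suppress and rng.random() < omission_rate:
            continue  # the user never sees this post and never knows why
        feed.append(post)
    return feed

posts = [
    "Feeling so happy today, what a wonderful morning!",
    "Everything is terrible and I feel sad.",
    "Grabbed a coffee on the way to work.",
]
print(filter_feed(posts, suppress="negative", omission_rate=1.0))
```

The point of the sketch is simply that the filtering is invisible: from the user's side the feed just looks a bit sunnier or gloomier than usual.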

So the question I have to ask is this: if ethics in research is defined as the widely accepted moral principles that ensure the researcher is doing the right thing, not only for the participants but also for the project and society as a whole, then how is it ethical to manipulate someone's newsfeed, without their knowledge, to alter the way they feel? On a small scale it may seem minute: see a sad puppy on your news feed, feel a pang of sadness. But on a larger scale, and I would definitely call 689,000 people large scale, we have to look at the ethical repercussions. In America (where this experiment took place), roughly 6.7% of the adult population suffers from clinical depression. Applying that rate to this study, approximately 46,000 of the uninformed participants may have been living with a mental illness, with no chance to opt out of an experiment that could potentially be emotionally harmful to them. This, in my opinion, is neither ethical nor moral in the slightest.
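For transparency, the 46,000 figure is just the product of the sample size and the NIMH prevalence rate, as the quick check below shows; it assumes the experiment's sample mirrors the general US adult population, which is itself a simplification.

```python
# Back-of-envelope check of the figure quoted above.
participants = 689_000       # users enrolled in the experiment
depression_rate = 0.067      # NIMH figure: ~6.7% of US adults

estimated_affected = participants * depression_rate
print(f"{estimated_affected:,.0f}")  # ~46,163, i.e. roughly 46,000 people
```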


References:

http://www.pnas.org/content/111/24/8788

http://www.pnas.org/content/111/29/10779.1.full.pdf

http://www.nimh.nih.gov/health/statistics/index.shtml

http://www.huffingtonpost.com/jeremy-harris-lipschultz/social-media-research-eth_b_5933052.html

http://www.ncbi.nlm.nih.gov/pubmed/23679571

https://www.facebook.com/policy.php