Last week, in a controversial study, Facebook reported the results of a psychological experiment it conducted on nearly 700,000 of its users, sparking outrage.
Facebook found that it can influence users’ feelings positively or negatively by manipulating what shows up in their news feeds. The results indicate that “emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
Some think the controversy over the study points to a broader problem in the data and analytics industry: ethics is not yet a central concern for data scientists. This raises the question of whether a line should be drawn in the social media industry between the privacy of participants and the research objectives, both monetary and academic, of the corporations that control their data.
The manipulation struck a nerve amongst users, academics, and even politicians. Last Wednesday, Senator Mark R. Warner (D-Va.) asked the Federal Trade Commission (FTC) to provide more information about recent reports that Facebook manipulated user news feeds during an emotional manipulation experiment. Warner is asking the agency to determine whether Facebook violated its consent agreement with the FTC.
He argues in his letter that it is not clear whether Facebook users were adequately informed and given an opportunity to agree to the research, given the sheer size and length of the user agreement. Most people simply do not read the “data use policy.”
Facebook has not been especially contrite since the study came to light last week. Speaking with Forbes earlier this week, COO Sheryl Sandberg essentially said Facebook is not sorry for conducting the contagion study, which was carried out in the normal course of business by its data team; it is only sorry that everyone is upset now.
Many people believe that Facebook crossed a line with the study, but does it really have a reason to apologize?
The sale or for-profit use of user data is not uncommon, but social media sites generally try to keep those transactions out of the press to avoid negative public reactions. Social media is considered something of a marketing jackpot, thanks to the abundance of information freely given by users, but its personal nature also cultivates a sense of trust between users and sites. People feel far more attached to social media than to most other sites, and for Facebook to alter feeds without their knowledge and consent is seen by many as taking advantage of that trust.
The focus of the experiment may also have played a role in users’ reactions. Facebook controlled content to influence users negatively as well as positively. For people who were already unhappy, depressed, or even suicidal, that negative influence could have been devastating. People may also feel manipulated because social media feeds are generally considered open spaces, free of corporate motives. Facebook manipulated both their feeds and, through them, the users themselves. People simply do not like to feel manipulated, yet that is exactly what happens every day.
Consumers’ emotions are manipulated continuously by advertisers, politicians, marketing professionals, news organizations, and others, and this is generally accepted as unavoidable, like death and taxes. The only real discernible difference here is that Facebook’s data team was attempting to measure the effect of that manipulation.
While most companies hide such results to gain an advantage over competitors, Facebook paraded its findings and its methods for manipulating its targets’ emotions in a scientific journal. This gave the general public a rare glimpse into how companies manipulate consumers’ emotions, and they did not like what they saw. Clearly Facebook’s presentation tactics were misguided, but does that put the company on the same level as advertisers, marketers, and politicians? Should the fact that it published the study help to justify it? Do consumers simply prefer not to be reminded of how easily they are manipulated?
These questions have no simple answers, but perhaps large corporations should strive to conduct and publish more studies of this kind for the mutual benefit of science and business. In any case, an ethical lesson can be drawn from the episode: whenever possible, social scientists should inform their subjects that they are being influenced. As Senator Warner points out, there was no real informed consent, as is standard for typical social psychology experiments, a category Facebook’s study appears to fall under. There is an obligation, whether implied or explicit, for social scientists to earn the public’s trust.