
Emotions and Facebook: A Simple Research Study or Cause for Concern?


Facebook recently released details of a secret emotion-manipulation study it ran, and the study has caused an uproar among both the media industry and the American public. However, the social media giant routinely runs tests on data provided by its users for a number of reasons, especially to update its algorithm and ensure users are served the most intriguing and relevant content. In the case of this study, Facebook attempted to measure emotional contagion and determine how users’ sentiment was affected by “positive” or “negative” content in their newsfeeds.

But in this particular instance Facebook failed to disclose to users that it was omitting from newsfeeds posts containing keywords associated with the respective emotions, and that it was using the resulting data for a large-scale research study. While this makes Facebook’s research department appear willing to take liberties at its discretion, is this social experiment really the monstrosity the media and public are decrying, or is it simply a case of miscommunication and scientific language breeding fear out of banality?
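To make the mechanics concrete, here is a minimal, purely illustrative sketch of the kind of keyword-based filtering the study describes: posts whose text matches a word list for the withheld emotion are simply left out of the feed. The word lists, post structure, and functions below are invented for illustration and are not Facebook’s actual implementation.

```python
# Illustrative sketch only: a toy keyword-based feed filter of the general
# kind the study describes. Word lists and threshold are invented examples.

POSITIVE_WORDS = {"happy", "great", "love", "excited", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "terrible", "awful"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by keyword counts."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE_WORDS for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts: list[str], withhold: str) -> list[str]:
    """Return the feed with posts of the withheld sentiment removed."""
    return [p for p in posts if classify_post(p) != withhold]

if __name__ == "__main__":
    feed = [
        "I love this wonderful weather!",
        "Terrible day, everything went wrong.",
        "Heading to the store later.",
    ]
    # A user in the 'reduced negativity' condition simply sees fewer negative
    # posts; nothing new is inserted in their place.
    print(filter_feed(feed, withhold="negative"))
```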

Let’s objectively review the facts of the study:

  • The study involved removing select pieces of content from individuals’ newsfeeds
  • No additional content was shown in its place (i.e., no dancing kittens)
  • Research was conducted for only one week in 2012
  • Data was not attributed to any one individual profile
  • Research subjects remained anonymous throughout the experiment
  • The research was apparently conducted on a ‘pre-existing dataset’ and approved by an Institutional Review Board (IRB)
  • Any content that was removed from the newsfeed may have reappeared later

While on the surface there doesn’t appear to be much to be concerned about, social perception plays a large role in ethics. Facebook conducted the study with the ultimate goal of improving user experience, a common practice. The problem arose when the research terminology Facebook used to interpret its findings and describe them to the general public failed to accurately communicate how the study had been conducted and that ethical guidelines had been followed.

Additionally, the research yielded few actionable results other than showing that the transfer of emotions on Facebook mirrors the patterns of physical, face-to-face interaction: a boring newsfeed encourages boring Facebook posts. The research community has also pointed out that the study may have been plagued by design flaws and further hindered by reliance on imperfect sentiment tools. In 2012, when the study was conducted, the tools available to measure social sentiment were rather unsophisticated, and as a result the accuracy of the results is questionable.
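A short illustration of why simple word counting is a blunt instrument: it has no notion of negation, idiom, or sarcasm. The snippet below is a stand-in for 2012-era keyword tools, not any specific product, and the word list is an invented example.

```python
# Illustrative only: how a naive keyword counter misreads sentiment.

NEGATIVE_WORDS = {"sad", "bad", "terrible"}

def looks_negative(text: str) -> bool:
    """Flag a post as negative if it contains any negative keyword."""
    return any(w.strip(".,!?") in NEGATIVE_WORDS for w in text.lower().split())

# Negation defeats the keyword count: both posts are flagged as negative,
# even though the second is clearly upbeat.
print(looks_negative("Such a terrible commute today."))        # True
print(looks_negative("Not a terrible day at all, loved it!"))  # True, wrongly
```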

The question then logically follows: how was the study’s presentation handled? Unfortunately, not very well. Analyzing the coverage on TechCrunch, Mashable, or AdAge indicates that the negative sentiment stems from Facebook’s attempt to change the newsfeed to “manipulate emotional response.” Ironically, while Facebook didn’t manipulate users in this particular study, it shouldn’t come as a shock that the newsfeed feature certainly does. Facebook employs a carefully guarded algorithm that analyzes a multitude of variables to update a user’s newsfeed and showcase the ‘most relevant and engaging content.’ As Facebook continues its research, it will need to improve its delivery and research language when conducting honest marketing optimizations in order to avoid negative backlash.

While Facebook may not have crossed the line, the way it communicated its actions certainly brought it to the very edge of what is socially acceptable. But as long as users continue to click “accept” without reading the fine print, this type of outrage will endure.

So, the best lesson gleaned from Facebook’s study seems to be that, as far as ethics and experimentation go, if the public views a study as “morally wrong,” it doesn’t matter who is “legally right.”

Category: Facebook, Social