The experiment ran for 1 week (January 11–18, 2012)
The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed
Focus:"...whether exposure to emotional content led people to post content that was consistent with the exposure—thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion."
Groups: 1) exposure to friends’ positive emotional content in their News Feed was reduced; 2) exposure to friends’ negative emotional content in their News Feed was reduced; 3 & 4) each experiment had its own control condition, in which a similar proportion of News Feed posts was omitted entirely at random
Tool used to analyze the emotions within the posts: Linguistic Inquiry and Word Count software (LIWC2007)
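LIWC works by matching each word in a post against category dictionaries (e.g., positive emotion, negative emotion) and tallying hits. A minimal sketch of that dictionary-matching idea — the word sets below are illustrative stand-ins, not the real (proprietary) LIWC2007 dictionaries:

```python
# Sketch of LIWC-style dictionary matching.
# POSITIVE/NEGATIVE are tiny made-up stand-ins for the real
# (proprietary) LIWC2007 category dictionaries.

POSITIVE = {"happy", "love", "nice", "great"}
NEGATIVE = {"sad", "hate", "awful", "hurt"}

def count_emotion_words(post):
    """Count total, positive, and negative words in one post."""
    words = post.lower().split()
    return {
        "total": len(words),
        "positive": sum(w in POSITIVE for w in words),
        "negative": sum(w in NEGATIVE for w in words),
    }

counts = count_emotion_words("so happy to see you but sad you must leave")
# → {'total': 10, 'positive': 1, 'negative': 1}
```

The real tool also handles word stems and far larger category lists; this only shows the counting principle.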
"...It is important to note that this content was always available by viewing a friend’s content directly by going to that friend’s “wall” or “timeline..."
over 3 million posts were analyzed, containing over 122 million words, 4 million of which were positive (3.6%) and 1.8 million negative (1.6%).
Measure: emotional expression was modeled, on a per-person basis, as the percentage of words produced by that person during the experimental period that were either positive or negative
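That per-person measure reduces to simple arithmetic over all of a person's posts in the period. A hedged sketch (function name and inputs are mine, not the paper's; the word sets stand in for the LIWC dictionaries):

```python
# Per-person emotional expression: percentage of all words a person
# produced during the period that are positive or negative.
# 'positive'/'negative' stand in for the LIWC category dictionaries.

def emotion_percentages(posts, positive, negative):
    """Return (% positive words, % negative words) across all posts."""
    words = [w for post in posts for w in post.lower().split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in positive for w in words)
    neg = sum(w in negative for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

pos_pct, neg_pct = emotion_percentages(
    ["happy happy day", "what an awful day"],
    positive={"happy"}, negative={"awful"},
)
# → roughly 28.6% positive (2 of 7 words), 14.3% negative (1 of 7)
```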
Contagion hypothesis: people in the positivity-reduced condition should express increased negativity, whereas people in the negativity-reduced condition should express increased positivity
"...our experimental groups did not differ in emotional expression during the week before the experiment (all t < 1.5; all P > 0.13)..."
The results were consistent with emotional contagion: when positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred
These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks
"...the effect sizes from the manipulations are small (as small as d = 0.001)..."
"...[the study] was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.", We use the information we receive about you in connection with the services and features we provide to you and other users like your friends, our partners, the advertisers that purchase ads on the site, and the developers that build the games, applications, and websites you use. For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you: as part of our efforts to keep Facebook products, services and integrations safe and secure; to protect Facebook's or others' rights or property; to provide you with location features and services, like telling you and your friends when something is going on nearby; to measure or understand the effectiveness of ads you and others see, including to deliver relevant ads to you; to make suggestions to you and other users on Facebook, such as: suggesting that your friend use our contact importer because you found friends using it, suggesting that another user add you as a friend because the user imported the same email address as you did, or suggesting that your friend tag you in a picture they have uploaded with you in it; and for internal operations, including troubleshooting, data analysis, testing, research and service improvement.
Resources:
Conducting Ethical Research in Psychology
Declaration of Helsinki - Ethical Principles for Medical Research Involving Human Subjects
Informed Consent - History of Informed Consent
Informed Consent - Levels of Consent
Nuremberg Trials (History Channel documentary)
Informed consent: Definition & elements
Facebook Ran A Huge Psychological Experiment On Users And Manipulated The Emotions Of More Than 600,000 People
Facebook made users depressed in secret research: Site deleted positive comments from friends
Facebook reveals news feed experiment to control emotions
Facebook Reveals Secret Experiment To Control Your Emotions
"... emotional manipulation is still emotional manipulation, no matter how small of a sample it affected." - Kate Lefranc
"This was an affective intervention that should have required informed consent.... the Users Agreement does not qualify as informed consent in any way. It may provide legal cover for Facebook, but it is not informed consent." - Jeff Sherman
"you'll never know what impact this actually had on depressed people. You can only measure what they posted to Facebook, which isn't a particularly meaningful or realistic indicator of their emotional state...I trust you're a reasonable person who doesn't set out to cross ethical boundaries. But on this one, I think Facebook needs to admit it did and make some changes. This study was unethical by any reasonable standard. There's nothing wrong with admitting that and figuring out a way to do better...There's a lot wrong with going ahead with anything like this, ever again." - Sean Tucker
"First, thanks for responding thoughtfully to the controversy. Second, citing the small # of participants & the small observed effect doesn’t address the potential for profoundly negative unmeasured effects." - Eliot Pancek
"It only affected a small number of people anyway"
Facebook manipulates your News Feed all the time anyway (usually for info on what posts you'll probably like best, thus what will keep you on the site longer)
The study examined the emotional content of words - they did NOT measure people's actual emotional state
Even if only a small number of people were affected, that's important. What if one of those people were your mother, brother, sister, or good friend? What if, as a result of seeing these negative posts, your good friend became more depressed? What if he/she committed suicide?