Facebook Researchers Toy With Emotions: Wake Up - InformationWeek
01:05 PM
Thomas Claburn

A study about the effect of posts on Facebook users' emotions outraged some people. I call it a public service: Now you understand what ill-informed consent means.

Over the weekend, the Internet was "shocked, shocked" to find that Facebook has been manipulating users' emotions.

Facebook researchers showed that emotions can be contagious, so to speak, thereby validating what artists, marketers, and demagogues have known since the development of language and communication.

The 2012 study, led by Facebook data scientist Adam Kramer, found that "Emotional states can be transferred to others via emotional contagion, leading them to experience the same emotions as those around them." If you've ever attended a sporting event, you might have suspected as much.

The researchers altered the News Feeds of 689,003 Facebook users by removing a percentage of positive posts or negative posts to gauge the effect on people's moods. They found that people's moods were influenced by what they saw.

[Not seeing the posts you expect to see? Take a look at: Facebook News Feed: 5 Changes.]

Other research, notably a study conducted at the University of Michigan, has already suggested that Facebook use makes people unhappy with their lives and undermines social well-being. But Facebook users inflict that envy-induced gloom on themselves by following their Friends' lives; Kramer's study, by contrast, has drawn criticism for failing to seek informed consent.

Facebook's findings should surprise no one who has experienced the spread of "viral" content -- or anyone who read (and understood) Facebook's Data Use policy. Yet all across the online landscape, people have decried Facebook's lack of ethics. It's as if Google's "don't be evil" motto has somehow set a high bar.

(Source: Pixabay)

In a blog post, University of Maryland law professor James Grimmelmann wrote that the study is a scandal because it pollutes the hitherto pristine world of academia with the grubby trickery of online advertisers.

"This study is a scandal because it brought Facebook's troubling practices into a realm -- academia -- where we still have standards of treating people with dignity and serving the common good," Grimmelmann wrote. "The sunlight of academic practices throws into sharper relief Facebook's utter unconcern for its users and for society. The study itself is not the problem; the problem is our astonishingly low standards for Facebook and other digital manipulators."

Academia may have standards for treating people with dignity and serving the common good, but suggesting that those employed by universities operate by a different moral code than those in news media, advertising, politics, or other walks of life is a bit of a stretch.

History is rich with examples of unethical research, like the Tuskegee syphilis experiment. Universities may now have codes of conduct informed by past mistakes, but such commitments hardly place educational institutions above compromise: questionable behavior persists in private businesses and public institutions alike. Consider the ongoing debate about whether universities should ban research funding from tobacco companies. Privacy advocate Lauren Weinstein has wondered aloud whether Facebook's mood manipulation has had lethal consequences; we know with certainty that tobacco has.

Grimmelmann is right to bemoan our low standards for Facebook and other digital manipulators, but we do not demand enough of anyone. We buy iPhones and then wring our hands about the treatment of workers in China; we think it is enough to Like a cause to effect political change.

Facebook's Kramer has posted an explanation of the project and an apology. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," he wrote. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

To the contrary, Facebook should be commended for demonstrating that the manipulation of information -- whether you call it "advertising," "content," "stories," or "propaganda" -- has an effect. Facebook has done a public service. It has helped inform those who have given their ill-informed consent about what "I agree" and the free economy really mean.


Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television, having earned a not particularly useful ... View Full Bio
Thomas Claburn
User Rank: Author
6/30/2014 | 4:51:32 PM
Re: Violation of user trust -- or what's left of it
>But this is still a breach of user trust.

This could have been handled better, but it's hypocritical for anyone in the press to point fingers. Emotional manipulation is the lifeblood of the news business. The news biz maxim "if it bleeds, it leads" exists precisely because headlines can push emotional buttons.
User Rank: Ninja
6/30/2014 | 4:01:38 PM
Re: Violation of user trust -- or what's left of it
As a user of Facebook, I am sad to read this. Sadly, it's legal too, as written in their terms and conditions. Surely a violation of user trust.
Shane M. O'Neill
User Rank: Author
6/30/2014 | 2:42:59 PM
Violation of user trust -- or what's left of it
I would say that Facebook should have been more transparent about what they were doing -- i.e., tell people about the "experiment" and let them accept or decline being part of it. But if people knew what was happening, their emotions wouldn't have been manipulated, now would they?

But this is still a breach of user trust. If you're intentionally not giving people the news feed they should be getting and not telling them, that's a violation. It warrants the bad PR and online outrage. But it will pass. Makes me glad I hid most of the people in my Facebook network and now use the site as an RSS news feed.
Lorna Garey
User Rank: Author
6/30/2014 | 2:19:08 PM
Thank you - spot on analysis. And, may I add, that data scientist's non-apology is great as well. Nice use of the English language to basically say, "Get over yourselves, morons. What did you THINK we were doing?"
Thomas Claburn
User Rank: Author
6/30/2014 | 2:07:17 PM
Re: Inaccurate statement
Thanks, I'll get that corrected.
User Rank: Apprentice
6/30/2014 | 1:58:05 PM
Inaccurate statement
"The researchers altered the News Feeds of 689,003 Facebook users by removing all positive posts or all negative posts to gauge the effect on people's moods."


That is incorrect.  Each emotionally charged post had a 10%-90% chance of being removed, depending on user ID.
IW Pick
User Rank: Moderator
6/30/2014 | 1:50:08 PM
isn't it interesting
The ethics of Facebook's moves aside, isn't it interesting how people will willfully spill their guts out on Facebook -- including putting lots of things that don't make them look that good, whether or not they realize it at the time -- but then get all perturbed when Facebook essentially creates a "mood index" for the content they provide? 

To me, this whole thing could be solved relatively easily if people would be just a bit more judicious in terms of what they share with the world. And, odds are, their friends would thank them as well because they likely don't want to read all of that either. 
User Rank: Author
6/30/2014 | 1:41:18 PM
Facebook's practices aside, Prof. Grimmelmann does parody well. Academia is the last bastion of "treating people with dignity and serving the common good"? Except when there's grant money or tenure on the line, or someone forwards a POV that's not in line with the orthodoxy.   
