Commentary
6/30/2014 01:05 PM
Thomas Claburn

Facebook Researchers Toy With Emotions: Wake Up

A study about the effect of posts on Facebook users' emotions outraged some people. I call it a public service: Now you understand what ill-informed consent means.

Over the weekend, the Internet was "shocked, shocked" to find that Facebook has been manipulating users' emotions.

Facebook researchers showed that emotions can be contagious, so to speak, thereby validating what artists, marketers, and demagogues have known since the development of language and communication.

The 2012 study, led by Facebook data scientist Adam Kramer, found that "Emotional states can be transferred to others via emotional contagion, leading them to experience the same emotions as those around them." If you've ever attended a sporting event, you might have suspected as much.

The researchers altered the News Feeds of 689,003 Facebook users by removing a percentage of positive posts or negative posts to gauge the effect on people's moods. They found that people's moods were influenced by what they saw.
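
To make the mechanics concrete, here is a toy Python sketch of that kind of feed filtering -- a hypothetical illustration, not Facebook's actual code. The word lists, function names, and omission rate are invented for the example; the published study classified posts with the LIWC word-count tool and varied omission rates per user.

    import random

    # Illustrative stand-ins for positive/negative word categories;
    # these lists are invented for the example.
    POSITIVE_WORDS = {"happy", "great", "love", "win", "fun"}
    NEGATIVE_WORDS = {"sad", "awful", "hate", "lose", "angry"}

    def classify(post):
        # Label a post by simple word matching. A post containing both
        # kinds of words counts as positive here; real classifiers differ.
        words = set(post.lower().split())
        if words & POSITIVE_WORDS:
            return "positive"
        if words & NEGATIVE_WORDS:
            return "negative"
        return "neutral"

    def filtered_feed(posts, suppress, omit_rate, rng=random):
        # Drop each post of the suppressed class with probability omit_rate.
        return [p for p in posts
                if classify(p) != suppress or rng.random() >= omit_rate]

    # A "reduced positivity" condition: omit roughly half of positive posts.
    feed = ["What a great day", "I hate Mondays",
            "Lunch was fine", "So happy about the win"]
    print(filtered_feed(feed, suppress="positive", omit_rate=0.5))

Run against a sample feed, the happy posts disappear about half the time while everything else passes through untouched -- which is essentially all the manipulation amounted to.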

[Not seeing the posts you expect to see? Take a look at: Facebook News Feed: 5 Changes.]

Other research, specifically a study conducted by researchers at the University of Michigan, has already suggested that Facebook use makes people unhappy with their lives and undermines social well-being. But Facebook users chose to inflict that unhappiness upon themselves through their envy-inducing following of Friends' lives; Kramer's study, by contrast, has elicited criticism for its failure to seek informed consent.

Facebook's findings should surprise no one who has experienced the spread of "viral" content -- or anyone who read (and understood) Facebook's Data Use policy. Yet all across the online landscape, people have decried Facebook's lack of ethics. It's as if Google's "don't be evil" motto has somehow set a high bar.

(Source: Pixabay)

In a blog post, University of Maryland law professor James Grimmelmann wrote that the study is a scandal because it pollutes the hitherto pristine world of academia with the grubby trickery of online advertisers.

"This study is a scandal because it brought Facebook's troubling practices into a realm -- academia -- where we still have standards of treating people with dignity and serving the common good," Grimmelmann wrote. "The sunlight of academic practices throws into sharper relief Facebook's utter unconcern for its users and for society. The study itself is not the problem; the problem is our astonishingly low standards for Facebook and other digital manipulators."

Academia may have standards for treating people with dignity and serving the common good, but suggesting that those employed by universities operate by a different moral code than those in news media, advertising, politics, or other walks of life is a bit of a stretch.

History is rich with examples of unethical research like the Tuskegee syphilis experiment. Universities may now have codes of conduct informed by past mistakes, but such commitments hardly elevate educational institutions above the compromises and questionable behavior still found in private businesses and public institutions alike. Consider the ongoing debate about whether universities should ban research funding from tobacco companies. The privacy advocate Lauren Weinstein has wondered aloud whether Facebook's mood manipulation has had lethal consequences, but we know with certainty that tobacco has.

Grimmelmann is right to bemoan our low standards for Facebook and other digital manipulators, but we do not demand enough of anyone. We buy iPhones and then wring our hands about the treatment of workers in China; we think it is enough to Like a cause to effect political change.

Facebook's Kramer has posted an explanation of the project and an apology. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," he wrote. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

To the contrary, Facebook should be commended for demonstrating that the manipulation of information -- whether you call it "advertising," "content," "stories," or "propaganda" -- has an effect. Facebook has done a public service. It has helped inform those who have given their ill-informed consent about what "I agree" and the free economy really mean.


Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television.
Comments
Number 6, User Rank: Moderator
7/2/2014 | 9:30:11 AM
IRB
"Academia may have standards for treating people with dignity and serving the common good,..."

Yes, they're called Institutional Review Boards -- IRBs -- and they were put in place in part because of problems like the Tuskegee study. Look them up.

So a good question to ask is whether and why the University of Michigan's IRB approved this.
Thomas Claburn, User Rank: Author
7/1/2014 | 6:53:15 PM
Re: That was an apology?
Really, was there any real impact from this? Yes, it was a breach of trust, but it's not as if tilting the News Feed balance from cat videos to reports of accidents and the like caused a spike in suicides. Until there's proof Facebook's experiment did harm, we would be better off worrying about other proven depressants like drugs and alcohol that actually do play a role in harm.
Lorna Garey, User Rank: Author
7/1/2014 | 5:43:42 PM
Re: That was an apology?
Maybe I'm just hopelessly cynical, but you get what you pay for. Facebook isn't maintaining all these data centers out of the goodness of its corporate personhood heart. It's using us and our data. If people don't like that, they can go back to personal phone calls and letters. Gasp!
Charlie Babcock, User Rank: Author
7/1/2014 | 5:10:28 PM
That was an apology?
Adam Kramer's apology for "all the anxiety the study caused" seems to overlook the fact that most researchers seek the consent of the researched in advance, as opposed to pointing out their anxiety after they find themselves the subjects of research. Each day the faux apologies -- apologetic-sounding words that place blame anywhere but on the apologist -- get a little worse.
GAProgrammer, User Rank: Strategist
7/1/2014 | 4:46:10 PM
Re: Parody
I agree, Rob -- if you hadn't said it, I was going to!
Lorna Garey, User Rank: Author (IW Pick)
7/1/2014 | 3:05:46 PM
Re: EXACTLY
Careful, someone might cite you as a source of info on the NSA's nefarious plot to make Facebook users sad and thus increase sales of Budweiser, which as everyone knows is a front for the FBI.
Laurianne, User Rank: Author
7/1/2014 | 3:00:22 PM
Re: EXACTLY
Just wait til we learn the NSA was also involved in the experiment. Mwahaha.
Drew Conry-Murray, User Rank: Ninja
6/30/2014 | 6:33:19 PM
Re: Violation of user trust -- or what's left of it
I'm surprised this study passed muster with an institutional review board (which is required for peer-reviewed academic publishing). My wife has to jump through all kinds of hoops to get consent from interview subjects for her research. Seems like if you're going to mess around with people's mental states, you'd need a consent form that isn't buried in a long terms-of-service notice that people don't read and may have clicked through years and years ago.
pcharles09, User Rank: Apprentice
6/30/2014 | 6:12:11 PM
Re: Violation of user trust -- or what's left of it
There should be an app that scans headlines & finds the associated Snopes link, if it exists.
Shane M. O'Neill, User Rank: Author
6/30/2014 | 5:13:51 PM
Re: Violation of user trust -- or what's left of it
That's a fair point. Though I'd argue sensationalized headlines are not as devious as what Facebook did. The reader can at least find out pretty quickly if they've been duped just by reading the story. But it's the same principle.  