A study about the effect of posts on Facebook users' emotions outraged some people. I call it a public service: Now you understand what ill-informed consent means.

Thomas Claburn, Editor at Large, Enterprise Mobility

June 30, 2014


Over the weekend, the Internet was "shocked, shocked" to find that Facebook has been manipulating users' emotions.

Facebook researchers showed that emotions can be contagious, so to speak, thereby validating what artists, marketers, and demagogues have known since the development of language and communication.

The 2012 study, led by Facebook data scientist Adam Kramer, found that "Emotional states can be transferred to others via emotional contagion, leading them to experience the same emotions as those around them." If you've ever attended a sporting event, you might have suspected as much.

The researchers altered the News Feeds of 689,003 Facebook users, removing a percentage of positive or negative posts, to gauge the effect on people's moods. They found that what people saw did indeed influence how they felt.


Other research, notably a study conducted by researchers at the University of Michigan, has already suggested that Facebook use makes people unhappy with their lives and undermines social well-being. But whereas Facebook users have chosen to inflict that depression upon themselves through envy-inducing following of their Friends' lives, Kramer's study has drawn criticism for its failure to seek informed consent.

Facebook's findings should surprise no one who has experienced the spread of "viral" content -- or anyone who read (and understood) Facebook's Data Use policy. Yet all across the online landscape, people have decried Facebook's lack of ethics. It's as if Google's "don't be evil" motto has somehow set a high bar.

In a blog post, University of Maryland law professor James Grimmelmann wrote that the study is a scandal because it pollutes the hitherto pristine world of academia with the grubby trickery of online advertisers.

"This study is a scandal because it brought Facebook's troubling practices into a realm -- academia -- where we still have standards of treating people with dignity and serving the common good," Grimmelmann wrote. "The sunlight of academic practices throws into sharper relief Facebook's utter unconcern for its users and for society. The study itself is not the problem; the problem is our astonishingly low standards for Facebook and other digital manipulators."

Academia may have standards for treating people with dignity and serving the common good, but suggesting that those employed by universities operate by a different moral code than those in news media, advertising, politics, or other walks of life is a bit of a stretch.

History is rich with examples of unethical research, such as the Tuskegee syphilis experiment. Universities may now have codes of conduct informed by past mistakes, but such commitments hardly elevate educational institutions above the compromises and questionable behavior seen in private businesses and public institutions. Consider the ongoing debate about whether universities should ban research funding from tobacco companies. Privacy advocate Lauren Weinstein has wondered aloud whether Facebook's mood manipulation has had lethal consequences; we know with certainty that tobacco has.

Grimmelmann is right to bemoan our low standards for Facebook and other digital manipulators, but we do not demand enough of anyone. We buy iPhones and then wring our hands about the treatment of workers in China; we think it is enough to Like a cause to effect political change.

Facebook's Kramer has posted an explanation of the project and an apology. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," he wrote. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

To the contrary, Facebook should be commended for demonstrating that the manipulation of information -- whether you call it "advertising," "content," "stories," or "propaganda" -- has an effect. Facebook has done a public service. It has helped inform those who have given their ill-informed consent about what "I agree" and the free economy really mean.


About the Author

Thomas Claburn

Editor at Large, Enterprise Mobility

Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television, having earned a not particularly useful master's degree in film production. He wrote the original treatment for 3DO's Killing Time, a short story that appeared in On Spec, and the screenplay for an independent film called The Hanged Man, which he would later direct. He's the author of a science fiction novel, Reflecting Fires, and a sadly neglected blog, Lot 49. His iPhone game, Blocfall, is available through the iTunes App Store. His wife is a talented jazz singer; he does not sing, which is for the best.
