Facebook Mood Manipulation: 10 Bigger Problems - InformationWeek
Thomas Claburn

Facebook's failure to communicate about its mood experiment is the least of the things Internet companies do to us.


Facebook COO Sheryl Sandberg apologized on Wednesday for the company's undisclosed psychological experimentation on Facebook users and acknowledged that the research effort was "poorly" communicated, a word which here means "not."

According to the Wall Street Journal, Sandberg, while in New Delhi, remarked, "We never meant to upset you," echoing Facebook researcher Adam Kramer's claim that "our goal was never to upset anyone."

In fact, the study at issue, published recently by researchers from Facebook, the University of California, and Cornell University, looks a lot like it was designed to test the social network's ability to upset (and excite) people. In January 2012, it exposed some 700,000 people to News Feeds weighted with either positive or negative posts and images to test whether users' emotions could be swayed.

[Protect your data. See 4 Facebook Privacy Intrusion Fixes.]

The researchers concluded that emotional states can indeed be influenced by what people see and read. This is more or less what marketers, artists, and politicians have known since forever. But Facebook users were upset, evidently because covert experimentation feels different from Facebook's publicly disclosed filtering of users' News Feeds.

Beyond Cornell's curious repudiation of a previous statement that the Army Research Office contributed funding to the research -- let's test Facebook as a tool for regime change! -- the controversy surrounding the study consists of debates about ethics and informed consent.

The study certainly looks to be ethically dubious, but social media itself is ethically dubious. It's based on an asymmetrical exchange: something of known value -- a communications service -- for something of unknown value -- personal data, privacy, and user-generated content. The asymmetry is magnified because Facebook has some idea of the value each user brings to its network.

Yet those seeking to complain about Facebook's failure to disclose its experiment without doing the obvious -- quitting Facebook -- would do better to protest more substantive issues. Here's a 10-course tasting menu of more worthy concerns.

1. Technical paternalism
Technology companies make choices that limit how you can use their software, hardware, and services. Facebook insists on filtering users' News Feeds when it could put users in control of the filter. Apple insists on judging apps by different standards than books, in terms of what kind of content is allowed. Google won't allow ad blocking software in Google Play. Technology companies treat customers like children.

(Source: Kevin Trotman)

2. Changeable contracts
Technology companies, along with banks, utilities, and a host of companies in other industries, frequently claim the right to change terms-of-service agreements unilaterally at their discretion, sometimes with and sometimes without notice. Imagine that in the context of a landlord and a tenant: after the tenant signs a lease at $1,000 a month, the landlord announces that the contract has changed and the rent is now $10,000 a month. Simply put, unilateral contractual changes should not be allowed.

3. Corporations are more than people
The Supreme Court's decision to treat corporations as people in the context of political funding elicited a fair amount of resentment among those who believe America is a nation governed by people rather than companies. But corporations can do things people cannot, like create shell companies to conceal information and to shift revenue abroad. Firms like Apple, Facebook, Google, and LinkedIn have been criticized for their ostensibly lawful tax mitigation schemes, which can move money away from regions where the companies actually consume considerable resources. Taxes that don't get paid matter more than consent that hasn't been obtained.

4. Farcical privacy policies
You would think that companies with privacy policies would provide privacy. But you would be wrong. Facebook at least has the decency to offer a Data Use Policy. Right up front, you know you will be used. But such documents are really a farce because so few people read them and truly understand them.

5. Cloud insecurity
According to the Identity Theft Resource Center, between 446 and 662 data breaches have been reported every year since 2007. Meanwhile, law enforcement organizations and intelligence services like the NSA have the power to grab just about any data from anywhere. Online security is a pipe dream, yet companies insist, "We take security very seriously." They'll take your money, but they can't take care of your data with any certainty. Trust no one; store your own data encrypted on a local machine.
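That last bit of advice can be sketched in a few lines. This is an illustrative example only, not a recommendation from the article: it assumes the third-party Python `cryptography` package, whose Fernet recipe provides authenticated symmetric encryption, and the data shown is hypothetical.

```python
# Minimal sketch: keeping a local copy of your own data encrypted,
# using the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # keep this key offline and backed up
cipher = Fernet(key)

archive = b"my exported social-media data"   # hypothetical data
token = cipher.encrypt(archive)              # ciphertext is safe to store anywhere
assert cipher.decrypt(token) == archive      # recoverable only with the key
```

Losing the key means losing the data, which is rather the point: no cloud provider, and nobody who breaches one, can read the archive without it.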

6. Cloud impermanence
Google may be the poster child for capricious termination of cloud services, but it's far from the only company to withdraw offerings from the market in a way that inconveniences consumers. Back when software ran on local machines, this was less of a problem; today, with so many server-resident applications, important services can simply vanish. The cloud erodes the power that comes with ownership. Welcome to the cloud, serf.

7. Cloud filth
How many shared links does it take to sink an island country beneath the rising sea? Stay tuned for the viral video about the impact of


Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television.
Comments
7/6/2014 | 1:06:46 AM
You get what you pay for?
This is the stuff of 1984. Why not let college campuses that adhere to ethical research standards voluntarily ban FB on college networks? Why not outlaw FB for use by minors?


But consider NSA invasions of privacy, the Patriot Act, warrantless police seizures of cell phones, Apple's alleged collusion to provide suppressive anti-recording technologies to law agencies, and so forth. Why do you pay for these "features"? Technological innovations subvert human rights. Is it worse that we think we can close the barn door after the horse has escaped, or that we continue to pay these corporations and use their services knowing they work against human well-being? Vote with your wallet.
7/6/2014 | 12:28:47 AM
First rule of Ad Club: Don't talk about feelings!
The uproar over this study is nuts. For over fifty years, corporate America has engaged in a practically no-holds-barred arms race to manipulate people's purchasing behavior, based largely on things like impulse control (or rather the lack thereof), aspiration, and the drives for sex and power. The majority of the public seems to have been fine with this, and the media too; there have even been hugely popular and critically acclaimed shows, like Mad Men, on exactly this topic. But when Facebook uses those very same methods simply to study people's behavior, without trying to get them to buy some particular product, there is this huge outcry. I guess the real rule of the social media business is: Shhhhh -- don't wake up the sheep!
7/5/2014 | 4:40:53 PM
Facebook isn't the problem here; it's the universities
I am not surprised at Facebook doing this, and I think it's clear, or should be quite clear, that its agreement with users allows this kind of thing. In any case, these for-profit social network corporations plainly employ algorithms to filter what we see in "timelines" and similar views; that's one reason I don't see why users keep those filters/algorithms on rather than just using the raw stream.

The problem is that universities conducted ethics-impaired research. It isn't only that the research subjects were unaware of the research, but also that the research manipulated said users and had the potential to do emotional damage. The code of ethics for any such social research is clear in academic institutions of any caliber. This is a real outrage. We should expect private companies to do this, especially when our user agreements with them are fairly clear, but we should not expect our academic institutions to take advantage as occurred here -- especially as public funding and private donations go to such efforts and in any case support the operations of such institutions.
7/5/2014 | 12:51:55 PM
Manipulation of US citizens
The USA government is corrupt, and the whole US economy is a Ponzi scheme.