Facebook's failure to communicate about its mood experiment is the least of the things Internet companies do to us.
Facebook COO Sheryl Sandberg apologized on Wednesday for the company's undisclosed psychological experimentation on Facebook users and acknowledged that the research effort was "poorly" communicated, a word which here means "not."
According to the Wall Street Journal, Sandberg, while in New Delhi, remarked, "We never meant to upset you," echoing Facebook researcher Adam Kramer's claim that "our goal was never to upset anyone."
In fact, the study at issue, published recently by researchers from Facebook, the University of California, and Cornell University, looks a lot like it was designed to test the social network's ability to upset (and excite) people. In January 2012, it exposed some 700,000 people to News Feeds weighted with either positive or negative posts and images to test whether users' emotions could be swayed.
The researchers concluded that emotional states can indeed be influenced by what people see and read. This is more or less what marketers, artists, and politicians have known since forever. But Facebook users were upset, evidently because covert experimentation is different from Facebook's publicly disclosed manipulation of users' News Feeds.
Beyond Cornell's curious repudiation of a previous statement that the Army Research Office contributed funding to the research -- let's test Facebook as a tool for regime change! -- the controversy surrounding the study consists of debates about ethics and informed consent.
The study certainly looks to be ethically dubious, but social media itself is ethically dubious. It's based on an asymmetrical exchange: something of known value -- a communications service -- for something of unknown value -- personal data, privacy, and user-generated content. The asymmetry is magnified because Facebook has some idea of the value each user brings to its network.
Yet those who complain about Facebook's failure to disclose its experiment without doing the obvious thing -- quitting Facebook -- would do better to protest more substantive issues. Here's a 10-course tasting menu of more worthy concerns.
1. Technical paternalism
Technology companies make choices that limit how you can use their software, hardware, and services. Facebook insists on filtering users' News Feeds when it could put users in control of the filter. Apple insists on judging apps by different standards than books, in terms of what kind of content is allowed. Google won't allow ad blocking software in Google Play. Technology companies treat customers like children.
2. Changeable contracts
Technology companies, along with banks, utilities, and a host of companies in other industries, frequently claim the right to unilaterally change terms-of-service agreements at their discretion, sometimes with and sometimes without notice. Imagine that in the context of a landlord renting to a tenant. After signing a lease for $1,000 a month, the landlord could say the contract has changed and the rent is now $10,000 a month. Simply put, unilateral contractual changes should not be allowed.
3. Corporations are more than people
The Supreme Court's decision to treat corporations as people in the context of political funding elicited a fair amount of resentment among those who believe America is a nation governed by people rather than companies. But corporations can do things people cannot, like create shell companies to conceal information and to shift revenue abroad. Firms like Apple, Facebook, Google, and LinkedIn have been criticized for their ostensibly lawful tax mitigation schemes, which can move money away from regions where the companies actually consume considerable resources. Taxes that don't get paid matter more than consent that hasn't been obtained.
4. Farcical privacy policies
You would think that companies with privacy policies would provide privacy. But you would be wrong. Facebook at least has the decency to offer a Data Use Policy. Right up front, you know you will be used. But such documents are really a farce because so few people read them and truly understand them.
5. Cloud insecurity
Annual data breach totals have ranged from 446 to 662 every year since 2007, according to the Identity Theft Resource Center. Meanwhile, law enforcement organizations and intelligence services like the NSA have the power to grab just about any data from anywhere. Online security is a pipe dream, yet companies insist, "We take security very seriously." They'll take your money, but they can't take care of your data with any certainty. Trust no one; store your own data encrypted on a local machine.
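For readers who want to act on that last piece of advice, here is a minimal sketch of local encryption using the stock OpenSSL command-line tool, which ships with most Unix-like systems. The file names and passphrase are placeholders, not anything prescribed by the article.

```shell
# Create a file, then encrypt it locally before it ever touches a cloud service.
echo "my private notes" > notes.txt

# AES-256-CBC with a salted, PBKDF2-derived key (requires OpenSSL 1.1.1 or later).
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in notes.txt -out notes.txt.enc \
    -pass pass:correct-horse-battery-staple

# Only notes.txt.enc would be uploaded; decrypt it locally when needed.
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in notes.txt.enc -out notes-restored.txt \
    -pass pass:correct-horse-battery-staple
```

In practice you would supply the passphrase interactively rather than on the command line, and delete the plaintext after encrypting.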
6. Cloud impermanence
Google may be the poster child for capricious termination of cloud services, but it's far from the only company to withdraw offerings from the market in a way that inconveniences consumers. Back when software ran on local machines, this was less of a problem; today, with so many server-resident applications, important services can simply vanish. The cloud erodes the power that comes with ownership. Welcome to the cloud, serf.
7. Cloud filth
How many shared links does it take to sink an island country beneath the rising sea? Stay tuned for the viral video about the impact of
Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television.