Facebook Mood Experiment Prompts New Guidelines

Facebook apologized for toying with users' emotions and introduced a new framework to improve its research process.

Three months after Facebook caused a firestorm over its mood-manipulation experiment, the social network has apologized and shared a framework it put in place to monitor its research more closely.

Facebook published research in the Proceedings of the National Academy of Sciences in June detailing how it tinkered with the news feed algorithms of nearly 700,000 users for one week in early 2012. Researchers found that when Facebook showed users more positive posts, they were more likely to share positive statuses. Conversely, when Facebook showed them more negative posts, they were more likely to share negative status messages.

Experts and users were outraged, calling the experiment unethical and even illegal. The Electronic Privacy Information Center (EPIC), a privacy watchdog group, filed an FTC complaint against Facebook in July, alleging that the social network deceived users and violated a 2012 consent order.


On Thursday, Facebook CTO Mike Schroepfer opened up about the experiment and admitted that the social network was unprepared for users' reactions.

"It is clear now that there are things we should have done differently," he said in a blog post. "For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it."

The new framework, which covers both internal work and research that might get published, focuses on three areas: guidelines, review, and training.

Facebook's new research guidelines include an "enhanced" review process before research can begin and further review if the work involves external collaborators, such as someone from academia, Schroepfer said.

In addition, the social network created a panel consisting of employees from its research, legal, privacy, and policy departments to review projects, and it has committed to more extensive training for its employees. Facebook will also update a Research at Facebook page with all of its published papers.

Notably absent from Facebook's new framework is any commitment to stop experimenting on users. Instead, Schroepfer said such research is necessary to the company because it "helps build a better Facebook."

"Facebook does research in a variety of fields, from systems infrastructure to user experience to artificial intelligence to social science," he said. "Like most companies today, our products are built on extensive research, experimentation, and testing."

EPIC said in a statement that Facebook's guidelines are a step in the right direction.

"The new guidelines have improved Facebook's research process, but they still raise questions about human-subject testing by advertising companies," the group said. "EPIC still believes the newsfeed algorithm should be made public."

