Facebook: Echo Chamber Is Real, But It's Your Fault
While 30% of news articles shared by users cut across ideological lines, the survey of 10 million Facebook users indicates that bias is still present. However, Facebook says that's the fault of users.
Social media behemoth Facebook acts as more than a hub of international relationships, wedding photos, and news feeds. It can also serve as an echo chamber where users digest opinions they already agree with, a study in the journal Science indicated.
The survey, which was overseen by Facebook's in-house data scientists, anonymously reviewed 10.1 million US accounts and found that users with a liberal bent tend to encounter a narrower field of opinions: they receive less than a quarter (24%) of their hard news from conservatives, while right-leaning users get 35% of their hard news from liberals.
The findings suggest that individual choices, more than algorithms, limit exposure to attitude-challenging content. In other words, people who feel they are getting only one point of view should start making friends who think differently.
By contrast, if individual users acquired information from random others, approximately 45% of the hard content liberals saw would be cross-cutting, compared with 40% for conservatives.
However, the findings also suggest the "filter bubble" -- the criticism that led Facebook to conduct the study in the first place -- is less robust than previously thought.
"Our work shows that social media exposes individuals to at least some ideologically cross-cutting viewpoints," the study, written by researchers at the University of Michigan, noted. "The power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals."
The results also indicate the wealth of information that analyzing big datasets can uncover. Facebook's dataset allowed researchers to measure exposure at several stages: they compared the ideological diversity of the broad set of news and opinion shared on Facebook with that shared within individuals' friend networks, then compared the result with the subset of stories that appear in individuals' algorithmically ranked News Feeds.
It also gave researchers the opportunity to observe which information individuals chose to consume once it surfaced in their News Feeds.
Facebook's view of the results contrasts somewhat with the media narrative, as laid out in a post on the company's research blog written by Eytan Bakshy, Solomon Messing, and Lada Adamic. The three argue that people are exposed to a substantial amount of content from friends with opposing viewpoints.
"We found that people have friends who claim an opposing political ideology, and that the content in peoples' News Feeds reflect[s] those diverse views," the three wrote. "While News Feed surfaces content that is slightly more aligned with an individual's own ideology (based on that person's actions on Facebook), who they friend and what content they click on are more consequential than the News Feed ranking in terms of how much diverse content they encounter."
The news from Facebook caps a week of examples of companies using big data analytics to uncover behavioral patterns and better understand how people act. In Cuba, the tourism industry is using analytics to anticipate a flood of new visitors from America, while IBM is urging developers to make greater use of Watson in healthcare.