
Is Facebook keeping you in a political bubble?

Does Facebook make it harder for liberals and conservatives to get along? For years, political scientists have wondered whether the social network’s news feed selectively serves up ideologically charged news while filtering out content from opposite political camps. Now, a study by Facebook's in-house social scientists finds that this does indeed happen, though the effect is very small.

Researchers call it the filter bubble: the personalized view of the Internet created through tech company algorithms. When you do a search on Google, for example, the results you get back will differ depending on what the company knows about you. For the most part, that filtering is useful—geologists and musicians get very different search results for "types of rock," for example. But some effects of filter bubbles can be less benign, such as when online shoppers are offered different prices based on data collected about them without their knowledge or consent.

Facebook's research team focused on the problem of political polarization, asking how much the company's news feed algorithm contributes to it. Although the posts scrolling along the site's news feed may seem like a live stream from your friends, the company uses an algorithm to filter and rank those posts before they reach you. And that algorithm is constantly evolving. It used to be tied to "likes" and clicks, but after extensive research on how to capture people's deeper interests, the algorithm has been tweaked to rank content by a "relevance score." This score—tied to a "human element" developers say was missing previously—determines what users ultimately see in their news feeds. For example, a viral post about adorable kittens may generate a lot of likes, but because people grow weary of this kind of clickbait, it now has a lower relevance score.

But even if the algorithm results in a more pleasant experience for Facebook users, they could be losing out on more than just cute cat photos. For example, liberals and conservatives may rarely learn about issues that concern the other side simply because those issues never make it into their news feeds. Over time, that lack of exposure could deepen political polarization.

It wasn't hard to determine who the liberals and conservatives were for the study: People can declare their political affiliations on their Facebook profiles. The team mapped those affiliations onto a 5-point ideological scale—from –2 for very liberal to +2 for very conservative—based on survey data. After limiting the population to American adults who log in at least 4 days per week, the researchers had just over 10 million test subjects. For content, they focused on news stories.

Determining the political flavor of that content was trickier. Rather than trying to measure the political slant of news stories through semantic analysis, the team used a far more expedient method: The "political alignment" of each story was defined as the average political alignment of all the users who posted a link to that story. For example, the alignment score of the average story in The New York Times came in at –0.62 (somewhat liberal), whereas the average Fox News story scored +0.78 (somewhat conservative). The stories that mattered for this study were the "cross-cutting" ones: news articles with a liberal slant appearing in a conservative person's news feed, or vice versa. Armed with all these metrics, the researchers crunched the numbers for every news story—both those that ended up on people's news feeds and those that were filtered out by the algorithm—between 2011 and 2012.
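In rough terms, the scoring works like this. The sketch below is a minimal Python illustration with invented users, stories, and share lists; the names, the toy numbers, and the opposite-sign rule for "cross-cutting" are assumptions made for clarity, not the study's actual code or data.

```python
# Toy illustration of the alignment scoring described above.
# Negative scores lean liberal, positive scores lean conservative.
from statistics import mean

# Self-reported ideological scores for a handful of hypothetical users.
user_alignment = {"alice": -2, "bob": -1, "carol": +1, "dave": +2}

# Which hypothetical users shared each hypothetical story.
shares = {
    "story_a": ["alice", "bob", "carol"],
    "story_b": ["carol", "dave"],
}

# A story's political alignment is the average alignment of its sharers.
story_alignment = {
    story: mean(user_alignment[u] for u in sharers)
    for story, sharers in shares.items()
}

def is_cross_cutting(story: str, reader: str) -> bool:
    """A story counts as cross-cutting for a reader when its alignment
    falls on the opposite side of the scale from the reader's own score."""
    return story_alignment[story] * user_alignment[reader] < 0

print(story_alignment)                        # {'story_a': -0.67, 'story_b': 1.5} (approx.)
print(is_cross_cutting("story_b", "alice"))   # True: conservative-leaning story, liberal reader
```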

By comparing the two groups of stories, researchers found that Facebook's news feed algorithm does indeed create an echo chamber effect. But it is not as powerful as critics have feared. The algorithm made it only 1% less likely for users to be exposed to politically cross-cutting stories, the team reports online today in Science. "The power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals," the team concludes.

The results are "certainly good with regard to the filter bubble concerns," says Sinan Aral, a social scientist at the Massachusetts Institute of Technology in Cambridge, "but I'm not sure that 'We shouldn't be worried' is the story." For example, the study does not change a finding from 3 years ago that Facebook creates a "strong herding bias" in how people vote, he says. So on the question of whether Facebook is a force for good or ill for democracy, Aral says, "the jury is still out."
