Study Shows You Make Your Own Political 'Filter Bubble' on Facebook

Facebook's bias in selecting political articles close to your beliefs is nowhere near as effective as your own bias against clicking certain links.

The 2016 U.S. presidential election season is fast upon us, so you can be sure your Facebook News Feed will soon be filled with politically charged content, if it isn't already. And you might think that Facebook, knowing your political affiliation if you've identified one, is more likely to show you articles aligned with your beliefs, whichever way you lean. And you'd be right — but as it turns out, that bias in automatic selection is nowhere near as effective as your built-in bias against clicking links emanating from the other side of the debate. Facebook data scientists have shown this using data from over 10 million users: you're making your own political filter bubble.

The researchers examined what those millions of users ("de-identified," meaning stripped of names and other identifying information) were exposed to when it came to partisan content, what their friend networks looked like in terms of liberals and conservatives, and what content each user shared. They found that Facebook's algorithms reduced a user's likelihood of being exposed to links that challenged their beliefs by just 1 percent, while users' own choices about what to click reduced that exposure by 4 percent.

[Chart: The likelihood of sharing partisan content increases as the user's own ideology tends further to the right or left. (Facebook)]

And yes, there was a difference based on party. Conservatives were more likely to have liberal friends in their networks sharing "cross-cutting" content — but also less likely to click on that content.

The study appeared Thursday in the journal ScienceXpress.


—Devin Coldewey