The 2016 U.S. presidential election season is fast upon us, so you can be sure your Facebook News Feed will soon be filled with politically charged content, if it isn't already. And you might think that Facebook, knowing your political affiliation if you've identified one, is more likely to show you articles aligned with your beliefs, whichever way you lean. And you'd be right — but as it turns out, that bias in automatic selection is nowhere near as effective as your built-in bias against clicking links emanating from the other side of the debate. Facebook data scientists have shown this using data from over 10 million users: you're making your own political filter bubble.
The researchers examined what partisan content those millions of users (de-identified, so the data contained no names or other identifying information) were exposed to, the mix of liberals and conservatives in each user's friend network, and what content each user shared. They found that Facebook's algorithms reduced a user's likelihood of being exposed to links that challenged their beliefs by just 1 percent, while users' own choices about what to click reduced that exposure by 4 percent.
And yes, there was a difference based on party. Conservatives were more likely to have liberal friends in their networks sharing "cross-cutting" content — but also less likely to click on that content.
The study appeared Thursday in Science Express, the journal Science's advance online publication.