You’re thumbing through your Facebook newsfeed when a post from an acquaintance you completely forgot about jolts you mid-scroll. Maybe it’s a shared meme poking fun at your preferred political candidate, or an opposing proclamation on a touchy subject like gun control, or maybe it’s just a picture of them wearing or doing something that elicits a breathy scoff.
You think to yourself, “How’d this person escape my last purge…?” and then go to their page and, without a second thought, click “unfriend.” And just like that, a feeling of contentment sets in as you resume scrolling through your curated feed of like-minded friends and highly targeted advertisements.
Why social media reinforcement bubbles exist
Without even realizing it, you have just made moves to strengthen your reinforcement bubble. But while we are partly to blame for our highly curated feeds, it’s not entirely our fault. The social media reinforcement bubble has two primary contributing factors: self-perpetuated bubbles à la the illustration above, and digitally perpetuated bubbles that are out of our control.
We manually curate our own bubble
Regarding the former, we have a natural tendency to surround ourselves with like-minded people.
“We experience conflicting thoughts as actual psychological discomfort. Brain scanning has, in fact, revealed that cognitive dissonance activates emotional areas like the anterior insulae and dorsal anterior cingulate cortex,” says Don Vaughn, a neuroscientist at the department of Psychology at UCLA. “Given that we prefer to eschew negative experiences, it comes as no surprise that people avoid the immediate psychological discomfort from cognitive dissonance by simply not reading or listening to differing opinions.”
There’s an energy component involved, too, he adds. Essentially, processing new facts, ideas and perspectives requires actual neural effort. In other words, it forces our brain to reconfigure its web of connections to understand, assess and potentially incorporate the new knowledge it’s being exposed to. In that sense, it’s a neural bias to conserve energy, and one that’s hard to override.
What’s in your filter bubble depends on who you are and it depends on what you do. But you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out.
Eli Pariser, internet activist
Social media algorithms filter out reality
The second factor — not to be underestimated — is the social media “filter bubble,” a term coined by internet activist Eli Pariser. In his viral TED Talk, he defined this echo chamber as a “personal, unique universe of information that you live in online. And what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out.”
Tech giants — including Google, Facebook and Twitter — rely on algorithms that are ever-changing and top secret, and it’s these algorithms that ultimately create filter bubbles.
“[The algorithms] are purposefully complicated to ensure the average person doesn’t figure them out,” says Lisa Strohman, a licensed clinical psychologist and founder of Digital Citizen Academy, an organization dedicated to helping people find balance between their lives and modern technology. “We do know that there are several methods in which ads are configured and displayed. The biggest is by gathering data that we, the users, provide willingly or unknowingly. This enables the giants to control or manipulate the price of advertising, and to even go as far as publishing their own ads or narrative if they wish.”
(If you’re curious, this exercise can shed partial light: Head to your Facebook feed, click the downward arrow on the right, and then go to Settings. From there click on Ads, then Your Information and Your Categories. This reveals a list of data points the website has on you for third-party advertising purposes, ranging from your political leanings, to hobbies, to household income, to how likely you are to engage with certain political content.)
“The reality is that all platforms now constantly feed us content that aligns with our own interests, friends and belief systems. They are able to take what we browse or post about and feed us back our own thoughts gathered from other social media followers as though we have hundreds and thousands of friends feeling the same way,” says Strohman.
They are able to take what we browse or post about and feed us back our own thoughts gathered from other social media followers as though we have hundreds and thousands of friends feeling the same way.
Dr. Lisa Strohman, founder of Digital Citizen Academy
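The dynamic Strohman describes can be sketched in a few lines of code. This is purely illustrative — a toy model, not any platform’s actual algorithm — in which posts on topics you’ve engaged with before simply outrank everything else. All the names and numbers are invented for the sketch:

```python
def rank_feed(posts, engagement_history):
    """Toy feed ranker: order posts so that topics the user has
    engaged with most in the past float to the top."""
    def score(post):
        # How often has this user engaged with this post's topic before?
        return engagement_history.get(post["topic"], 0)
    return sorted(posts, key=score, reverse=True)

# Hypothetical engagement counts for one user.
history = {"my_candidate": 9, "memes": 5, "opposing_view": 0}

posts = [
    {"id": 1, "topic": "opposing_view"},
    {"id": 2, "topic": "memes"},
    {"id": 3, "topic": "my_candidate"},
]

ranked = rank_feed(posts, history)
print([p["id"] for p in ranked])  # the dissenting post sinks to the bottom
```

Even this crude rule produces a bubble: every click on agreeable content raises its score, so agreeable content appears higher, gets clicked more, and the loop tightens — with no one ever choosing to hide the opposing view outright.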
The problem with reinforcement bubbles
Before going Marie Kondo on your social media feeds — or shrugging off the notion of reinforcement bubbles — consider their potential dangers:
- We overestimate the prevalence of our perspective: “Our brain constructs a model of the world from interactions with our environment. If all our interactions are one-sided, then our brain’s model will be biased,” says Vaughn. (Kind of like we’re all watching a movie with the same title, but with completely different storylines.) “This is one purported reason why many Democrats were upset at the 2016 election results.” Reinforcement bubbles can lead us to mistakenly believe that more people support our world view than actually do.
- Our empathy for others decreases: “My neuroscience research on empathy underscores the point that simple notions of ‘us’ and ‘them’ [affect] how our brain processes the pain of another. When ‘they’ are in pain, we simulate their experience less, and show less empathy,” says Vaughn. Ultimately, reinforcing our own beliefs hardens us against others.
- It inhibits authentic dialogue and true change: “Reinforcing our current feelings and thoughts makes us feel better,” says Strohman, “but when doing so we also lose the ability to elevate our ideas and collaborate on major issues that our nation is facing.” Openly discussing — and more importantly, hearing — each other on hot button issues is more likely to foster ideas and solutions that improve our world.
How to work around reinforcement bubbles
While there’s little we can do to impact existing algorithms, we can make personal strides toward bursting our reinforcement bubble (or at least allowing others to step inside).
- Adjust the filters you actually do control: “Working on how we manage our filters — specifically regarding news sources — is incredibly important. Finding less biased sources, or focusing on listening to two separate feeds in a balanced way, can be very helpful in gaining important perspective,” says Strohman. Reconsider your books, podcasts, radio stations, magazines and newspapers, too.
- Don’t delete those you disagree with: Even if you’re not actively engaging with such acquaintances, exposing yourself to a variety of thought prevents you from overestimating the prevalence of your own perspective.
- Actively engage with someone who has differing views: “Reach out to someone you respect, and who is informed, who holds the opposing view to your position,” says Strohman. “[This helps] uncover and understand our hidden bias from our backgrounds.”
- Attend local debates, forums and rallies: Whether it’s an open forum for your neighborhood association or a political rally, attending local events where you can interact with people in real time opens the door to authentic dialogue and new perspectives. Go with the intention of listening, not arguing.