
YouTube's conspiracy video problem is just getting worse, researcher says

The shooting in Parkland, Florida, is only the most recent tragic event to feed a growing network of misinformation videos
Image: A man stands in front of YouTube's logo at an office in London
A growing network of conspiracy videos is only getting stronger, according to a researcher. Chris Ratcliffe / Bloomberg via Getty Images file

There is a vast network of conspiracy videos on YouTube that feeds off tragic events — including the recent shooting in Parkland, Florida — according to a prominent misinformation researcher.

Jonathan Albright, research director for the Tow Center for Digital Journalism at Columbia University, has studied misinformation on YouTube going back to the 2016 presidential election. But even he was surprised by what he found after mapping out the videos that YouTube suggested alongside videos alleging conspiracies connected to the shooting at Marjory Stoneman Douglas High School.

“I didn’t expect to be shocked when I looked at the results,” Albright wrote in a Medium post published Sunday.

What surprised Albright wasn’t the existence of conspiracy videos. The internet has served as a platform for the paranoid to share their thoughts since its earliest days, a role that first caught wider notice after theories about the 9/11 terrorist attacks reached a growing audience.

Instead, Albright focused on the finding that the many thousands of conspiracy videos he identified were being pushed by YouTube’s “Up next” recommendations, which encourage users to stay on the site and watch more videos.

He found that the series of videos served up by the Google-owned platform created a self-reinforcing network that grew stronger as more tragic events occurred.

“Every time there’s a mass shooting or terror event, due to the subsequent backlash, this YouTube conspiracy genre grows in size and economic value,” Albright wrote. “The search and recommendation algorithms will naturally ensure these videos are connected and thus have more reach.”

Albright looked at a set of 9,000 clips that he found by searching for “crisis actor” videos on YouTube. A series of conspiracy theories posited that student survivors of the Parkland shooting, who had appeared on TV calling for greater gun control, were not students of the high school at all but rather people who had fabricated their experience for money or attention. The theories have been debunked.

YouTube did not immediately respond to a request for comment.

The scale of the problem is not the focus of Albright’s work, but it is notable. YouTube largely flew under the radar for its role in spreading disinformation around the 2016 election, with Facebook and Twitter drawing most of the attention.

The network thrives because of the algorithm YouTube uses to determine which videos to suggest. YouTube displays a list of suggested videos alongside every video a user watches on its platform. How that list is created is a secret, much like the algorithms behind Google’s search engine and Facebook’s news feed, but it is known to be optimized to keep users on YouTube for as long as possible.

Albright’s research comes as tech companies face growing criticism over the content that they host, surface and distribute. In response, they have become more willing to remove misinformation or extremist content, or at least limit its spread.

Those efforts have not stopped conspiracy theories from spreading, particularly around breaking news. One conspiracy video about a Parkland student hit YouTube’s “Trending” page, helping it amass more than 200,000 views.

YouTube remains just one part of the broader problem with the spread of what is now widely called “fake news” on the internet, but Albright noted that the platform plays a central role.

“From my experience, in the disinformation space, all roads seem to eventually lead to YouTube,” he wrote.