
As algorithms take over, YouTube's recommendations highlight a human problem

A supercomputer playing chess against your mind to get you to keep watching.
Every day YouTube serves around one billion users who watch billions of hours of video. Artur Debat / Getty Images

YouTube is a supercomputer working to achieve a specific goal — to get you to spend as much time on YouTube as possible.

But no one told its system exactly how to do that. After YouTube built the system that recommends videos to its users, former employee Guillaume Chaslot, a software engineer in artificial intelligence who worked on the site's recommendation engine from 2010 to 2011, said he watched as it started pushing users toward conspiracy videos. Chaslot said the platform’s complex “machine learning” system, which uses trial and error combined with statistical analysis to figure out how to get people to watch more videos, learned that the best way to get people to spend more time on YouTube was to show them videos light on facts but rife with wild speculation.
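
One way to picture that trial-and-error loop is as a bandit-style recommender whose only reward is minutes watched. The sketch below is an illustration under that assumption, not YouTube's actual code; the candidate categories, the simulated viewing behavior and the epsilon-greedy strategy are all invented for the example.

```python
import random
from collections import defaultdict

# Hypothetical sketch: a recommender that learns, by trial and error,
# which kind of video keeps people watching longest.
CANDIDATES = ["documentary", "news_clip", "conspiracy_theory", "music_video"]

total_watch_time = defaultdict(float)   # accumulated minutes watched per category
times_recommended = defaultdict(int)    # how often each category was recommended

def simulated_watch_time(video: str) -> float:
    """Stand-in for real user behavior: minutes watched after a recommendation."""
    base = {"documentary": 8, "news_clip": 4, "conspiracy_theory": 14, "music_video": 3}
    return max(0.0, random.gauss(base[video], 2))

def recommend(epsilon: float = 0.1) -> str:
    """Epsilon-greedy: usually pick the category with the best average watch time."""
    if random.random() < epsilon or not times_recommended:
        return random.choice(CANDIDATES)
    return max(CANDIDATES,
               key=lambda v: total_watch_time[v] / max(times_recommended[v], 1))

for _ in range(10_000):
    video = recommend()
    times_recommended[video] += 1
    total_watch_time[video] += simulated_watch_time(video)

# Whatever keeps people watching longest ends up recommended most,
# regardless of whether it is accurate or sensational.
print(max(times_recommended, key=times_recommended.get))
```

Nothing in the loop asks whether a video is true; if speculation holds attention longer than facts, the system learns to serve more speculation.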

Routine searches on YouTube can generate quality, personalized recommendations that lead to good information, exciting storytelling from independent voices, and authoritative news sources.

But they can also return recommendations for videos that assert, for example, that the Earth is flat, aliens are underneath Antarctica, and mass shooting survivors are crisis actors.

Within a few clicks, a search for "Saturn" and other science topics on YouTube leads to recommendations for clusters of conspiracy and propaganda videos. YouTube

This isn’t just a YouTube problem. Chaslot’s research on YouTube, which he released earlier this year, added to growing concerns about the pervasiveness of similar algorithms throughout modern society.

“They’re used to give us a credit score, to decide whether you get a job interview, evaluate your college application. They all use algorithms,” said data scientist Cathy O’Neil, author of “Weapons of Math Destruction.”

“All those bureaucratic decision-making systems” use algorithms, she said, even as part of the criminal sentencing process. Several states are using these computer models to decide prison sentence lengths or set bail amounts, a practice some critics say perpetuates racial bias.

Algorithms trained on human data are now present in the daily lives of billions of people. And 2.2 billion of them are on YouTube.

On the site, the ease with which a person can be transported from any innocuous search to the lunatic fringe of YouTube is startling. This reporter was helping his son research outer space for his school project. When he searched for "Saturn," the first results were mostly documentaries. One of the recommended videos was "10 facts you didn't know about space." That video led to additional recommendations such as "can you believe it" videos, a synthesized voice reading Nostradamus predictions and a clip "they don't want you to see" of pro-Putin propaganda.

What had started out as a simple search for fun science facts for kindergartners had quickly led to a vast conspiracy ecosystem.

Down the rabbit hole

Every day YouTube serves around one billion users who watch billions of hours of video. People use it to search for the latest music videos, learn how to fix their car and write research papers. The site’s videos also make it into Google search results, expanding their reach to potentially billions of searches per day.

YouTube, as one of our primary windows into the world, is shaping our understanding of it. The massive amount of information available on the site has led CEO Susan Wojcicki to call it a “library.”

And YouTube is just part of it. Google’s search algorithm and Facebook’s News Feed algorithm also serve as filters for information for billions of people.

“There is an infrastructure built since the Renaissance to ensure the integrity of information and knowledge in libraries, many layers of gatekeepers thick,” wrote David Carroll, an associate professor of media design at The New School and a known critic of online platforms, in an email. “YouTube dispenses with all of it in the name of frictionless content acquisition and an algorithm optimized for watch-time to sell ads.”

Starting in 2010, Chaslot worked on a project to introduce diversity into YouTube’s video recommendations. It didn’t perform as well on watch time, he said, so it was shut down and never used.

"This is dangerous because this is an algorithm that's gaslighting people to make them believe that everybody lies to them just for the sake of watch time," he said.

He was moved to another group but tried to keep the original project going. According to a YouTube spokesperson, Chaslot was eventually fired in 2013 over performance issues.

Once on the outside, Chaslot said he created a program to analyze how the algorithm recommends conspiracy videos: using a YouTube account with no viewing history, it searched for certain topics and recorded which videos were recommended to users most often.
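
In outline, that kind of audit can be reproduced with a short script: start from a session with no history, search a seed term, follow the top recommendations a few hops deep, and tally which videos keep reappearing. The sketch below is an illustration of the method Chaslot described, not his actual code; the fetch_recommendations() function is a placeholder for whatever scraping or API call a real audit would use.

```python
from collections import Counter

def fetch_recommendations(video_id: str) -> list[str]:
    """Placeholder: in a real audit this would scrape the watch page or query
    YouTube from a logged-out session with no viewing history."""
    raise NotImplementedError

def audit(seed_video_ids: list[str], hops: int = 3, per_video: int = 5) -> Counter:
    """Follow recommendations 'hops' levels deep from each seed result and
    count how often every recommended video shows up."""
    counts = Counter()
    frontier = list(seed_video_ids)
    for _ in range(hops):
        next_frontier = []
        for vid in frontier:
            recs = fetch_recommendations(vid)[:per_video]
            counts.update(recs)
            next_frontier.extend(recs)
        frontier = next_frontier
    return counts

# Videos recommended far more often than the rest are the ones the
# algorithm is effectively giving free advertising.
# top_videos = audit(seed_video_ids=["..."]).most_common(20)
```

The key design choice is starting from an account with no history, so the counts reflect what the algorithm pushes by default rather than what it infers about one person's tastes.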

This means that while “good” or “harmless” videos might be included in the mix of recommendations, YouTube repeatedly invited users to click on certain videos much more than others, essentially giving them free advertising. Chaslot initially shared his research with The Guardian.

His analysis found that when searching for "Is the Earth flat or round?" the top recommendation YouTube kept showing users in the beginning of February was "THE BEST Flat Earth VIDEO | 100% Proof The Earth Is Flat | Please Debunk This I Dare You!!!!” followed by "Top 10 Reasons People Believe The Earth Is FLAT!" and "BEST FLAT EARTH PROOF 2017 - YOU CANT DENY THIS EVIDENCE."

Searching for tragedies can turn up even more disturbing results. If you searched for “Sandy Hook Shooting” in November 2017, one of the top recommended videos was the now-removed "BELIEVE YOUR OWN EYES - 9/11 - 'NO PLANES'," followed by videos asserting that the Connecticut school shooting and its victims were a hoax.

Chaslot's model found YouTube was recommending conspiracy videos at much higher rates than others for certain search terms. algotransparency.org

Researchers at Harvard conducted their own test and found that the algorithm was more often drawing viewers to extreme content and unfounded right-wing conspiracy theories.

Conspiracy videos are perfectly positioned to push our buttons and draw us in to consume more of them, generating exactly the engagement signals that YouTube’s algorithm prioritizes, wrote Robert J. Blaskiewicz Jr., a columnist for the Committee for Skeptical Inquiry, a non-profit educational organization that applies scientific analysis to conspiracy theory claims, in an email.

"Conspiracy stories hit our emotional fight or flight triggers,” Blaskiewicz wrote. “And the stories rest on the unstated premise that knowledge of the conspiracy will protect you from being manipulated. This in itself compels people to watch and absorb as much as they can and to revisit videos."

The emotion they provoke is contagious, he said, and they provide ready-made explanations for complex and difficult news events.

By popular demand

YouTube has said it's simply reflecting what users want to see, and that videos are chosen based on each user's individual profile and viewing history.

Publicly, executives have said that the recommendations algorithm drives over 70 percent of content watched on YouTube, and that they’re getting better and better at it all the time.

“Our job is to give the user a steady stream, almost a synthetic or personalized channel," YouTube's chief product officer, Neal Mohan, said at CES, the annual consumer tech conference, in Las Vegas in January.

"Higher watch time means more ad inventory," said Austin Moldow, an equity researcher at Canaccord Genuity, a financial services firm in New York. "More ads, more revenue."

But just because people are willing to watch something doesn’t mean they’re enjoying it. YouTube has to balance protecting its profits with keeping the trust of its users. If it fails to walk that line, it can begin to undermine the value users get from the platform, said Kara Swisher, Recode executive editor and MSNBC contributor.

"I think it's a problem not just throughout Youtube, but Google, Facebook, all these companies is that they prioritize growth over anything else. They may not be meaning to do it, but if growth is the goal, then user experience is not the goal,” said Swisher. “Real users, the ones you’re trying to attract, go away. And so it's in all their interests from a business point of view to clean this place up and to have more control over it and there's a moral responsibility to create a platform that isn't being abused by anybody”

“Good advertisers don't wanna be next to these kind of videos either,” she added.

Exploiting the YouTube algorithm is a cottage industry. Video creators who follow the rules can earn a share of advertising revenue. Trends favored by the algorithm are quickly incorporated and uploaded by savvy creators, said Becca Lewis, a researcher at Data & Society, a nonprofit based in New York. Ultimately, if the recommendation engine is promoting conspiracy videos, YouTube incentivizes creating more of them.

But more than just promoting — and prioritizing — misinformation, these digital tabloid channels can also distort democracy. Right before the 2016 U.S. presidential election, Chaslot's research found that more than 80 percent of the recommended videos favored Donald Trump. Searching "Trump" led to pro-Trump video recommendations. Searching "Clinton" surfaced mainly anti-Clinton video recommendations.

No quick fix

YouTube has taken steps to reduce incentives for some of the worst offenders. Content about mass shootings, for example, is not allowed to generate advertising revenue for its creators through YouTube.

However, this doesn’t stop creators from including links in their video descriptions for direct donations, online merchandise stores and affiliate apps, or from taking paid mentions within the videos.

In a statement to NBC News, a YouTube spokesperson said “our recommendation system has changed substantially over time and no longer works the way it did five years ago.” While it used to optimize for “watch time,” it now has begun to shift focus to “satisfaction,” balancing watch time with additional data points such as likes, dislikes, shares, and surveys. YouTube has also tweaked its algorithm to better show authoritative news sources, especially for breaking news events, the spokesperson said.
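
As described, the shift amounts to changing what the ranking score rewards. The toy scoring function below is only a sketch: the signals come from YouTube's statement, but the formula and weights are invented for illustration and are not the company's actual model.

```python
from dataclasses import dataclass

@dataclass
class VideoStats:
    expected_watch_minutes: float
    likes: int
    dislikes: int
    shares: int
    survey_satisfaction: float  # e.g. average user-survey score between 0 and 1

def ranking_score(v: VideoStats) -> float:
    """Illustrative only: blend predicted watch time with satisfaction signals
    instead of optimizing for watch time alone. Weights are made up."""
    like_ratio = v.likes / max(v.likes + v.dislikes, 1)
    return (0.5 * v.expected_watch_minutes
            + 3.0 * like_ratio
            + 0.01 * v.shares
            + 5.0 * v.survey_satisfaction)

# Under a blend like this, a long-watch-time conspiracy video with poor
# satisfaction signals can rank below a shorter, better-liked news clip.
```

The point of such a blend is that watch time alone can no longer carry a video to the top if the other signals suggest viewers did not actually value it.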

According to YouTube, none of the recommendation system that Chaslot worked on while he was at Google is being used today.

Critics fear that a few updates won’t resolve the core issue. Among them is Tristan Harris, a former Google design ethicist turned critic of his ex-employer, who now leads the Center for Humane Technology, a San Francisco-based group of former technologists.

Algorithms produce solutions at an exponential scale, one for every customer or citizen, but they produce problems at that scale, too.

“You can't possibly have exponential consequences with exponential responsibility unless you have an exponential amount of human thought to be dedicated to those challenges,” said Harris. “And you can't put that just into an algorithm.”