
YouTube says it wants 'discussion' of election results, even when it's been debunked

Compared to Facebook and Twitter, YouTube has staked out a position that is less aggressive in handling videos that contain election misinformation.
Image: Joe Biden speaks at an election rally in Wilmington, Del., on Nov. 7, 2020, after news media announced that he had won the 2020 U.S. presidential election. (Jonathan Ernst / Reuters)

YouTube is facing growing criticism for allowing election misinformation after it decided not to remove or individually fact-check videos that spread unfounded conspiracy theories alleging voter fraud.

While all internet platforms have struggled to contain the volume of misinformation since voting ended last week, and all have been criticized to some degree by researchers for their handling of the situation, YouTube has staked out a position that is less aggressive than that of its social media competitors, most notably Facebook and Twitter.

YouTube said before the election that it wouldn’t allow videos that encourage “interference in the democratic process,” but now, as state officials are working to certify vote tallies, the company said it wants to give users room for “discussion of election results,” even when that discussion is based on debunked information.

Somewhere between those two policies, the company has decided to leave up videos challenging Joe Biden's victory, some of which have received millions of views.

“Is YouTube unable to contend with this material, meaning they lack resources? Or is it a lack of will?” asked Sarah Roberts, co-director of UCLA’s Center for Critical Internet Inquiry and an associate professor of information studies.

“I think one of those is probably more damning than the other, but they both have the same outcome of allowing propaganda material masquerading as news being distributed on their platform at a critical juncture for the American political cycle,” Roberts said.

The tension came to a head this week over videos by One America News Network, a conservative media outlet that has more than 1 million subscribers on YouTube and that has repeatedly posted videos claiming without evidence that the election was stolen from President Donald Trump.

YouTube, which is owned by Google and wields enormous influence in political debates, defended its decision. “Like other companies, we're allowing these videos because discussion of election results & the process of counting votes is allowed on YT,” the company said Thursday in a series of unsigned tweets.

The videos were not being surfaced in search results or recommended to users “in any prominent way,” YouTube said.

But other companies haven't always been so permissive. Twitter and Facebook, for example, have put some videos trying to discredit the election results behind a barrier and a warning label. YouTube has a label, but it's less obtrusive and less specific.

Unlike the other companies, YouTube puts a label on all election videos regardless of how factual they are — a decision that “cheapens the impact,” Roberts said. YouTube said the label has been seen billions of times.

“I’d like for YouTube to clarify its role, because this isn’t like a public library,” Roberts said. YouTube CEO Susan Wojcicki has compared YouTube to a library, but Roberts said that “at a public library, you’d have the expertise of trained staff to help you make sense of different sources, and that is something YouTube is willfully not doing.”

Before the election, YouTube had set expectations that it would remove at least some videos with election misinformation. In an Oct. 27 blog post, Leslie Miller, YouTube’s vice president for government affairs and public policy, said, “under our voter suppression policy, we remove content falsely claiming that mail-in ballots have been manipulated to change the results of an election.”

Miller also said then that YouTube’s community guidelines “do not allow misleading claims about voting or content that encourages interference in the democratic process.”

On Friday, YouTube said it had removed several videos, including several election-related livestreams and a video that contained technically manipulated audio of a Detroit poll worker.

Some of the videos from One America initially had ads running before them, allowing One America to profit from them, until a Bloomberg News reporter flagged the ads as an apparent violation of Google's rules.

“YouTube saw the inevitable writing on the wall that its platform would be used to spread false claims of election victory, and it shrugged,” Evelyn Douek, a Harvard Law School lecturer who studies content moderation, told Bloomberg News.

An election-related video from a YouTube channel called the Next News Network had 1.9 million views as of Friday, despite falsely claiming that Biden had "lost" his status as president-elect, a claim that outlets including USA Today have debunked.

The same video appears on Facebook but with a fact-check label that is much more noticeable: “False Information: The same information was checked in another post by independent fact-checkers.”

No one at YouTube was available for an interview about the company’s handling of election-related misinformation, YouTube spokesperson Ivy Choi said. She said the company was promoting authoritative content from established news organizations in search results and recommendations. She declined to provide a list of organizations YouTube considers authoritative.

"The most popular videos about the election are from authoritative news organizations," YouTube said in its Twitter thread. It said that, on average, 88 percent of the videos in the top 10 U.S. search results for election-related content come from highly authoritative sources, though it declined to provide a detailed breakdown.

By Friday, a video from One America alleging without evidence that “Trump won” had received more than 371,000 views. YouTube has its usual label below the video, saying: “The AP has called the Presidential race for Joe Biden. See more on Google.”

YouTube's decision to demote certain videos in search results has angered some video creators. Christina Robb, a host on One America, complained last week that YouTube was "hiding my videos showing rampant voter fraud."

But the limited response from YouTube threatens to haunt the company’s reputation, as social media’s handling of false information gets continued scrutiny from lawmakers, academic researchers and journalists.

“There’s a good chance YouTube’s handling of this goes in the first sentence of every story about how social networks handled the 2020 election for the next several years,” Casey Newton, a journalist who writes the technology newsletter Platformer, said in a tweet.

In another example of widespread election misinformation, a video of a woman inspecting her ballot for a watermark, tied to a conspiracy theory involving an alleged military sting operation, has been viewed more than 624,000 times on YouTube.

Jason Kint, CEO of Digital Content Next, a trade group for traditional media companies, including NBCUniversal (which owns NBC News), and a prominent critic of technology companies, posted screenshots of Google searches for the phrase “Biden loses.” The top results were YouTube videos with false conspiracy theories, the screenshots showed.

“Antitrust harms + self-dealing + YouTube toxic garbage wrapped up in a bow,” Kint said in a tweet.

Google's public liaison for search, Danny Sullivan, responded that the company would work to improve but added that the search in question was very uncommon.