
Disinformation poses an unprecedented threat in 2024 — and the U.S. is less ready than ever

The U.S. presidential election comes at a time of ideal circumstances for disinformation and the people who spread it. 

Disinformation poses an unprecedented threat to democracy in the United States in 2024, according to researchers, technologists and political scientists. 

As the presidential election approaches, experts warn that a convergence of events at home and abroad, on traditional and social media, amid an environment of rising authoritarianism, deep distrust, and political and social unrest, makes the dangers from propaganda, falsehoods and conspiracy theories more dire than ever.

The U.S. presidential election comes during a historic year, with billions of people voting in other elections in more than 50 countries, including in Europe, India, Mexico and South Africa. And it comes at a time of ideal circumstances for disinformation and the people who spread it. 

An increasing number of voters have proven susceptible to disinformation from former President Donald Trump and his allies; artificial intelligence technology is ubiquitous; social media companies have slashed efforts to rein in misinformation on their platforms; and attacks on the work and reputation of academics tracking disinformation have chilled research. 

“On one hand, this should feel like January 2020,” said Claire Wardle, co-director of Brown University’s Information Futures Lab, who studies misinformation and elections, referring to the presidential contenders four years ago. “But after a pandemic, an insurrection and a hardening of belief that the election was stolen, as well as congressional investigations into those of us who work in this field, it feels utterly different.”

The threat disinformation poses falls on a spectrum. Research suggests it has little direct effect on voting choices, but when spread by political elites, especially national candidates, it can shape how people make up their minds about issues. It can also supply false evidence for claims whose conclusions threaten democracy or public health, as when people are persuaded to take up arms against Congress or to decline vaccination.

Solutions matching the enormity of that threat are piecemeal and distant: the revival of local news, the creation of information literacy programs and the passage of meaningful legislation around social media, among others.

“Repairing the information environment around the election involves more than just ‘tackling disinformation,’” Wardle said. “And the political violence and aftermath of Jan. 6 showed us what’s at stake.”

Primed for disinformation

The most likely Republican presidential nominee is also the former president — whose time in office was marked by lies told in a failed effort to remain there, falsehoods Trump continues to cling to. Disinformation in service of the lie that the election was “stolen” — disseminated through a network of television, radio and online media — has proven staggeringly effective with Republicans. The toll from that belief extends to distrust in future elections.

On one hand, real consequences have come for spreaders of disinformation. Lies about Covid and the election have reportedly cost some prominent doctors and news anchors their jobs. Millions of dollars have been awarded in civil courts to victims of disinformation. Hundreds of federal criminal convictions have come from the Jan. 6, 2021, riot. And people who allegedly participated in a scheme to overturn President Joe Biden’s victory, including state GOP officials, lawyers and Trump himself, face criminal charges.

Whether networks like Fox News or individuals like Rudy Giuliani would be as eager to promote disinformation in 2024 in the face of such consequences remains to be seen. But some predictable players and newcomers in right-wing media have already signaled a willingness to contribute.

“Right-wing media see a demand for content that is pro-Trump and leaning into conspiracy theories,” said A.J. Bauer, an assistant journalism professor at the University of Alabama who studies conservative media. 

In addition to national websites known for disinformation, new local hyperpartisan news organizations might also factor in, Bauer said, with claims acting as fodder for larger national conspiracy theories. 

“These outlets could be looking for examples of hyperlocal voter fraud or intimidation, even if it’s not real,” Bauer said. 

Real stakes

It’s not only voters who stand to affect the upcoming election, but also smaller loci of influence: state lawmakers, election officials and poll workers moved by disinformation.

“Election denialism and the misinformation that comes from the far right was in clear view on the federal level” with the 2020 election, said Christina Baal-Owens, executive director of Public Wise, a nonpartisan voting rights organization that tracks local election administration officials who have questioned the legitimacy of the 2020 election. “What was less clear was a threat that was hiding in plain sight, a movement working on the local level.” 

Public Wise has counted more than 200 people who attended, funded or organized the Jan. 6 attempted insurrection and won office in 2022. In Arizona alone, more than half of constituents are represented by state legislators who are professed election deniers.

“We’re looking at a well-organized movement that is working to affect elections across the country,” Baal-Owens said. “They have the ability to determine how people vote, how votes are counted, and whether or not they’re certified.” 

The Capitol breach was the most visible example of political extremism bleeding into real-world violence. But 2020 was also marked by violence, or the threat of it, at state capitols and Covid lockdown protests, a trend experts fear will continue. 

“We’re watching out for voter vigilantism,” said Joan Donovan, an assistant professor of journalism and emerging media studies at Boston University who studies political violence. “People organizing in Telegram channels and showing up to ballot boxes with guns” in states that allow it, Donovan said, was an emerging tactic in 2020 and the midterms among activists who said they were deterring voter fraud.

“I think that’s going to be the next wave,” Donovan said. 

Old lies, new tech 

Over the weekend, far-right political activist and Trump ally Laura Loomer seeded an early conspiracy theory about the Iowa caucus count. Loomer’s complicated claim of corruption mirrored previous unfounded rumors floated in 2020. 

The falsehoods may remain the same for now, but the technology used to manufacture propaganda has improved. Advances in artificial intelligence, from chatbots to audio and video generators, have made easy-to-use media manipulation tools available to the public. A World Economic Forum survey named misinformation and disinformation from AI as the top global risk over the next two years — ahead of climate change and war. 

Scammers have found success with so-called deepfakes, mostly in manufacturing AI-generated videos of celebrities hawking products like health supplements or cryptocurrency. Even as campaigns begin to use AI in ads and states rush to legislate around them, the much-publicized threat of the technology to elections has yet to materialize. More often, cheap AI is being used to create propaganda, mostly from Trump loyalists.

Trump is already sharing content that uses synthetic media on his social media platform, Truth Social, made by self-described “meme teams” who, according to the Trump campaign, serve as volunteers. These memes malign other candidates and their spouses, attorneys and judges involved in prosecuting Trump, journalists, and state politicians and election officials deemed enemies of the Trump camp.

“Granted it’s hokey and not believable in any way, shape or form, but it’s only a matter of time until something works,” said Ben Decker, the chief executive of Memetica, a digital investigations company. “The disinformation narratives, the meme wars, they’re back. That content is going to overpopulate certain parts of the public square.” 

The effect on the wider world is clear, Decker said: “Harassment of public officials, members of the media and civil society groups is going to run rampant.”

A potentially greater threat lies in generative AI tools’ ability to personalize misinformation, making it harder for social media platforms to moderate because it appears authentic, said Laura Edelson, an assistant professor at Northeastern University and co-leader of Cybersecurity for Democracy, who studies political misinformation.

“It’s going to be a lot harder this cycle as people are washing misinformation through generative AI tools,” Edelson said. “Misinformation will be more effective inside insular communities and harder to detect. Platforms need to be building new tools.”

Instead, Edelson and others say, platforms are cutting the teams tasked with moderation to the bone. Since 2021, tech critics say, the largest social media companies have deprioritized efforts to guard against viral falsehoods.

Elon Musk’s X has led the way as social media platforms including Meta and YouTube have retreated from enforcement and policy and slashed content moderators and trust and safety teams, said Rose Lang-Maso, campaign manager at Free Press, a digital civil rights organization. 

“Without policies in place that moderate for content and without enough content moderators to actually do the moderating, it makes it more possible for bad actors to increase abuse online and offline,” Lang-Maso said. “Platforms are really abdicating the responsibility to users.”

Meta, YouTube and X have denied reports that they are ill-prepared to prevent the spread of election disinformation.

“Content misleading voters on how to vote or encouraging interference in the democratic process is prohibited on YouTube,” YouTube spokesperson Ivy Choi said in a statement to NBC News. “We continue to heavily invest in the policies and systems that connect people to high-quality content, and our commitment to supporting the 2024 election is steadfast.”

A spokesperson for Meta declined to comment but shared a news release about the company's plans for the 2024 elections.

Who’s watching?

The first challenge of combating disinformation in the 2024 cycle might be in identifying it. 

The social media space has become fragmented with the ascendancy of alternatives including Substack, Telegram, Threads and Rumble as viable spaces for political actors and extreme content. And a pressure campaign by conservative activists may affect how many trained eyes are available to be on the lookout.

Republican politicians and activists responded to the wave of disinformation in 2020 by targeting the researchers, universities, tech companies and journalists who pointed it out. Using social media campaigns, the courts and congressional committees, far-right critics have aired unfounded accusations that efforts to curtail disinformation around the election and the pandemic were part of a plot to censor conservatives. Some researchers said those partisan campaigns, which have included burdensome information requests and threats of reputational and legal harm to institutions, have had a chilling effect on new research going into 2024. 

Watchers are further challenged by the lack of transparency from social media companies. So-called black boxes surround the algorithms that serve up content, and the inability to see what is happening on the platforms in real time has only gotten worse.

“We’re flying blind,” said Mike Caulfield, a research scientist at the University of Washington’s Center for an Informed Public who studies election rumors.

Failing to catch false narratives early essentially gives disinformation a head start, Caulfield said, and delays fact-checking efforts and context from journalists.

Risks to national security, safety and voting rights aside, the larger threat from the coming wave of disinformation might be in widened partisan divides and weakened public trust.  

“The direct effect of disinformation might not be as high as we think it is,” said Joshua Tucker, co-director of New York University’s Center for Social Media and Politics, referring to voting preferences. “But the indirect effect is people losing confidence in journalism, losing confidence that there’s an objective truth out there, and believing that anything could be disinformation.”