
Technology group hopes to help Democrats win with AI-generated ads and emails

Tech for Campaigns said early tests showed generative AI combined with human oversight helped create effective outreach material.

A group of Democrat-supporting technologists is aiming to power this year’s election outreach efforts with the latest advances in artificial intelligence.

Tech for Campaigns, an organization that uses digital marketing and data to help Democratic candidates win elections, is now using AI to create digital ads and fundraising emails, all of which it says are managed and vetted by humans, an approach it calls “AI-aided.” The tools are part of the group’s broader effort to help cash-strapped campaigns save money and win tight races.

Generative AI burst onto the scene in late 2022 with OpenAI’s release of ChatGPT and has evolved into a wide variety of tools that can generate everything from text and images to human voices and video based on simple prompts. It is distinct from earlier AI systems focused on manipulating existing media, often known as deepfake technology.

Jessica Alter, the organization’s co-founder, said its model has already paid dividends. The organization started by using off-the-shelf generative AI tools such as Google’s Bard and ChatGPT. Now, the organization is rolling out its own suite of AI-enabled tools called the “TFC Learning Engine,” which it plans to share with campaigns in the current election cycle.

She said even the early tests showed promise.

“In Virginia in 2023, we did an experiment across 14 campaigns. We tested AI-aided emails, which again means that we’re looking at them and editing them,” Alter said. “And what we were really testing for was, ‘Hey, can this help the campaigns on the productivity side of things?’”

She said the trial run found that AI-aided emails generated between three and four times more fundraising dollars per work hour than ones that were solely written by humans.

“It can help generate ad ideas. It could help generate even regular marketing material like flyers and signs,” Alter said. “We know of candidates that used it to come up with new slogans. And as you get deeper and deeper, if you have data, it can help you analyze that data and pull out insight. So it’s becoming more and more powerful.”

Some tech companies are already working on their own AI-generated ad offerings, including Facebook, which announced a feature in October that can write text and alter images. Google rolled out a similar product in May.

The move comes at a tumultuous time for AI in politics.

OpenAI announced in January that it would no longer allow people to build applications for political campaigning or to create chatbots that would pretend to be real candidates.

The Federal Communications Commission quickly issued a ruling that made AI-generated robocalls illegal, following a call impersonating President Joe Biden that encouraged New Hampshire voters not to vote in that state’s presidential primary.

As many as 20 tech companies involved in AI, including Meta and Google, recently signed a pledge to take measures to prevent their algorithms from being used for election interference.

And earlier this month, House leaders launched a bipartisan task force dealing specifically with AI. Among their objectives: ensuring legal accountability for election interference caused by AI.

Lawrence Norden, senior director of the Elections and Government Program at the Brennan Center for Justice, said that he expects more campaigns to start using similar AI tools to manage and synthesize information more efficiently in the same way that private sector organizations are starting to do.

Still, he warned that any campaign thinking of using AI should be cautious, even when using it for legitimate purposes.

“AI hallucinates. It can provide wrong information and make mistakes. It spreads disinformation. It sometimes compounds biases based on the training material that it receives,” Norden said. “And you have to be ready for that failure.”

Another potential pitfall: The language model could make promises that the candidate is not willing to deliver on. Or it could potentially violate campaign finance laws by making promises in exchange for money.

Alter said her organization is aware of the potential risks and is ready to have a nuanced conversation about potential regulation.

“I absolutely agree that there are malicious uses of AI that can be used in elections, and I am worried about that. But I’m also worried that we’re not having a balanced conversation. And that the [Democratic] Party is going to get left behind on AI because they are leading from a place of fear.”

Brad Parscale, Trump’s former campaign manager, has also founded a digital firm that promises to leverage AI tools. In a 2022 press release, he touted it as a “revolutionary new eco-system” that can benefit both commercial and political campaigns.