Artificial intelligence helped predict turnout in last year's Mississippi election. One group then used the technology to transcribe, summarize and synthesize audio recordings of door knockers' interactions with voters to create a report on what they heard in each county.
Another group recently compared messages translated into six Asian languages by humans and by AI, and found them equally effective. A Democratic firm tested four versions of voice-over ads (two spoken by humans and two by AI) and found that the AI male voice was just as convincing as its human counterpart, though the human female voices outperformed their AI counterparts.
The age of artificial intelligence in electioneering has officially arrived, but the much-anticipated and feared technology remains confined to the periphery of American elections.
With less than six months until the 2024 election, the political use of AI, both as a constructive communication tool and a means to spread dangerous disinformation, has been more theoretical than transformative. The Biden campaign says it is strictly limiting its use of generative AI, which creates text, audio, and images from prompts, to productivity and data-analysis tools, while the Trump campaign says it has not used the technology at all.
“This is a dog that didn't bark,” said Dmitri Melhorn, a political adviser to Reid Hoffman, one of the Democratic Party's most generous donors. “We haven't found anything cool with generative AI that we can invest in to actually win elections this year.”
Hoffman is far from an AI skeptic. He previously served on the board of OpenAI and recently conducted an “interview” with an AI version of himself. But for now, the only political applications of the technology worthy of Hoffman's money and attention are what Melhorn calls “unappealing productivity tools.”
Eric Wilson, a Republican digital strategist who runs a campaign technology investment fund, agreed. “AI is changing the way campaigns are run, and it’s the most boring, mundane way you can imagine,” he said.
Technologists and political activists have little doubt that AI will ultimately have the power to transform the political arena. A new report from Higher Ground Labs, which invests in political tech companies to benefit progressive causes and candidates, says that while AI is still in the “experimental stage,” it also represents a “generational opportunity” for the Democratic Party moving forward.
For now, the Democratic National Committee is conducting more modest experiments, including using AI to spot unusual patterns in voter registration records and find notable voter deletions or additions.
Janine Abrams McLean, president of the nonprofit Fair Count, which led the AI experiment in Mississippi, said the pilot project involved AI transcription of 120 audio notes recorded after meetings with voters. The team then used the AI tool Crowd to map geographic differences in opinion based on the canvassers' impressions of those interactions.
“When we used this AI model to synthesize the voice notes, we found that the emotions emanating from Coahoma County were much more vigorous and indicative of people planning to vote,” she said, “whereas we didn't hear the same emotions in Hattiesburg.”
Sure enough, turnout was down in the Hattiesburg area, she said.
Larry Huynh, who oversaw the AI-narrated ads, said he was surprised by how well the AI voices performed: He and most of his colleagues at Democratic consulting firm Trilogy Interactive found the male AI voice “the most awkward,” but testing showed it to be convincing.
“You don't necessarily need a human voice to create an effective ad,” said Huynh, who, as the current president of the American Association of Political Consultants, thinks deeply about the ethics and economics of AI technology. Still, he added, tweaking models to create new AI voices is just as laborious and expensive as hiring a voice actor.
“I can't say we actually saved money,” he said.
Democrats and Republicans are also racing to defend against the threat of a new kind of political dark art: AI-powered disinformation in the form of deepfakes and other false or misleading content. Before January's New Hampshire primary, AI-generated robocalls imitating President Biden's voice in an attempt to suppress votes led to new federal rules banning such calls.
For regulators, lawmakers and election officials, the incident highlighted the disadvantage they face in dealing with bad actors who can move more quickly and anonymously than they can. The fake Biden robocall was made by a New Orleans magician who holds world records for bending forks and escaping from a straitjacket. He said he used an off-the-shelf AI product that took 20 minutes and cost $1.
“My concern is that it was easy for an average person who doesn't have a lot of experience with AI or technology to make these calls,” New Hampshire Secretary of State David Scanlan said at a Senate committee hearing on the role of AI in this spring's elections.
AI is “like a match with gasoline,” said Rashad Robinson, who helped write an Aspen Institute report on the information chaos that followed the 2020 race.
Robinson, president of racial justice group Color of Change, outlined such a “nightmare” scenario and said it would be nearly impossible to prevent. “You might hear a local pastor call 3,000 people and say, 'Don't come to the polls because there are armed white people here,'” he said. “There's no real accountability and no real consequences for the people who are building the tools and the platforms that make this possible.”
The possibility of a similar last-minute AI disruption is keeping New Mexico Secretary of State Maggie Toulouse Oliver up at night. Ahead of the state's primary, she ran an ad campaign warning voters that “AI won't be very visible this election season” and advising them to “when in doubt, research it.”
“In elections, we are often behind the eight-ball,” she said, adding, “and now we have to deal with this new wave of activity.”
AI is already being used to mislead election campaigns abroad: In India, an AI version of Prime Minister Narendra Modi addressed voters by name on WhatsApp; in Taiwan, an AI version of outgoing President Tsai Ing-wen appeared to encourage investment in cryptocurrencies; and in Pakistan and Indonesia, dead and imprisoned politicians have reappeared as AI avatars to appeal to voters.
So far, most of the fakes have been easily detected, but the Microsoft Threat Analysis Center, which studies disinformation, warned in a recent report that while deepfake tools are becoming more sophisticated, any that could sway a US election “are probably not yet on the market.”
In the 2024 election, many candidates are taking a cautious approach to artificial intelligence, or avoiding the technology altogether.
The Trump campaign “has no involvement in or use of AI,” spokesman Steven Chun said in a statement. But the campaign, “like many other campaigns across the country, uses proprietary algorithmic tools to deliver our emails more efficiently and to prevent false information from entering our registration list,” he said.
But the Trump campaign's wariness of AI has not prevented supporters from using it to create deepfake images of the former president surrounded by Black voters, a group Trump is actively courting.
The Biden campaign says it is severely restricting its use of AI. “Currently, the campaign is only permitted to use generative AI for productivity tools such as data analysis and industry-standard coding assistants,” said campaign spokeswoman Mia Ehrenberg.
A senior Biden official, speaking on condition of anonymity to discuss internal operations, said AI is most often deployed in campaigns to find efficiencies behind the scenes, such as testing which marketing messages lead to clicks and other forms of engagement, a process known as conversation marketing. “This is not science fiction,” the official added.
Artificial intelligence has taken a central place in the zeitgeist, and some campaigns have found that simply deploying this technology can help draw attention to their messages.
Last year, a wave of coverage ensued after the National Republican Congressional Committee released AI-generated images depicting national parks as migrant tent cities. And after the former president's daughter-in-law, Lara Trump, released a song called “Anything is Possible,” the Democratic National Committee used AI to create a diss track mocking Trump and Republican fundraising, attracting the attention of the celebrity gossip site TMZ.
But digital political strategists are still figuring out how well AI tools work in practice. Many involve mundane data-crunching tasks, but some involve more novel ideas, like an AI-powered eye-contact tool that keeps people in the video from looking away, potentially streamlining the recording of scripted videos. Because the White House has blocked the release of the audio of Biden's interview with the special counsel, Republicans could, for dramatic effect, use an AI-generated track of Biden reading the transcript instead.
“I don't know anyone who hasn't tried pre-creating content,” Democratic digital strategist Kenneth Pennington said of using generative AI to create early drafts of fundraising messages, “but I don't know many people who have found that process useful either.”
In Pennsylvania, a congressional candidate used an AI-powered phone banking service to conduct interactive phone conversations with thousands of voters.
“I share the same serious concerns as everyone else about the potential for misuse of AI in politics and beyond,” candidate Shamaine Daniels said on Facebook, “but we also need to understand and embrace the possibilities this technology brings.”
She finished in third place by a large margin.