How AI is changing the 2024 election

This illustration photograph taken in Helsinki on June 12, 2023, shows an AI (Artificial Intelligence) logo blended with four fake Twitter accounts bearing profile pictures apparently generated by Artificial Intelligence software. (Photo by OLIVIER MORIN/AFP via Getty Images)

As the generative artificial intelligence (AI) industry booms, the 2024 election cycle is shaping up to be a watershed moment for the technology’s role in political campaigns.

The proliferation of AI — a technology that can create text, images and video — raises concerns about the spread of misinformation and how voters will react to artificially generated content in a politically polarized environment.

Already, the presidential campaigns for former President Trump and Florida Gov. Ron DeSantis (R) have produced high-profile videos with AI.

Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights, said the proliferation of the AI systems available to the public, awareness of how simple it is to use them and the “erosion of the sense that creating things like deepfakes is something that good, honest people would never do” will make 2024 a “significant turning point” for how AI is used in campaigns.

“I think now, increasingly, there’s an attitude that, ‘Well, it’s just the way it goes, you can’t tell what’s true anymore,’” Barrett said.

The use of AI-generated campaign videos is already becoming more normalized in the Republican primary.

After DeSantis announced his campaign during a Twitter Spaces conversation with company CEO Elon Musk, Trump posted a deepfake video, AI-generated media that fabricates realistic-looking images and sound, parodying the announcement on Truth Social. Donald Trump Jr. posted a deepfake video of DeSantis edited into a scene from the television show “The Office,” and the former president has shared AI-generated images of himself.

Last week, DeSantis’s campaign released an ad that used seemingly AI-produced images of Trump embracing Anthony Fauci, the former director of the National Institute of Allergy and Infectious Diseases.  

“If you proposed that 10 years ago, I think people would have said, ‘That’s crazy, that will just backfire,’” Barrett said. “But today, it just happens as if it’s normal.”

Critics said DeSantis’s use of the AI-generated images of Trump and Fauci was deceptive because the video does not disclose its use of AI.

“Using AI to create an ominous background or strange pictures, that’s not categorically different than what advertising has long been,” said Robert Weissman, president of the progressive consumer rights advocacy group Public Citizen. “It doesn’t involve any deception of voters.”

“[The DeSantis ad] is fundamentally deceptive,” he said. “That’s the big worry: that voters will be tricked into believing things are true that are not.”

A person familiar with DeSantis’s operation noted that the governor’s presidential campaign was not the only campaign using AI in videos.

“This was not an ad, it was a social media post,” the person familiar with the operation said. “If the Trump team is upset about this, I’d ask them why they have been continuously posting fake images and false talking points to smear the governor.”

While proponents of AI acknowledge the risks of the technology, they argue it will eventually play a consequential role in campaigning.

“I believe there’s going to be new tools that streamline content creation and deployment, and likely tools that help with data-intensive tasks like understanding voter sentiment,” said Mike Nellis, founder and CEO of the progressive agency Authentic.

Nellis has teamed up with Higher Ground Labs to launch Quiller.ai, an AI tool that writes and sends campaign fundraising emails.

“At the end of the day, Quiller is going to help us write better content faster,” Nellis told The Hill.  “What happens on a lot of campaigns is they hire young people, teach them to write fundraising emails, and then ask them to write hundreds more, and it’s not sustainable. Tools like Quiller get us to a better place and it improves the efficiency of our campaigns.”

As generative AI text and video become more common, and harder to identify as the generated content grows more convincing, there is also concern that voters will grow skeptical of all content, whether it is AI-generated or not.

Sarah Kreps, director of the Cornell Tech Policy Institute, said people may start to either “assume that nothing is true” or “just believe their partisan cues.”

“Neither one of those is really helpful for democracy. If you don’t believe anything, this whole pillar of trust we rely on for democracy is eroded,” Kreps said.

ChatGPT, OpenAI’s AI-powered chatbot, has seen an exponential rise in use since its November launch, along with rival products like Google’s Bard chatbot and image- and video-based tools. These products have the administration and Congress scrambling to consider how to address the industry while staying competitive on a global scale.

But as Congress mulls regulation, between scheduled Senate briefings and a series of hearings, the industry has been largely left to create the rules of the road. On the campaign front, the rise of AI-generated content is magnifying the already prevalent concerns of election misinformation spreading on social media.

Meta, the parent company of Facebook and Instagram, released a blog post in January 2020 stating it would remove “misleading manipulated media” that meets certain criteria, including content that is the “product of artificial intelligence or machine learning” that “replaces or superimposes content onto a video, making it appear to be authentic.”

Ultimately, though, Barrett said the burden of deciphering what is AI-generated or not will fall on voters.

“This kind of stuff will be disseminated, even if it is restricted in some way; it’ll probably be out in the world for a while before it is restricted or labeled, and people have to be wary,” he said.

Others point out that it is still difficult to predict how AI will be integrated into campaigns and other organizations.

“I think the real story is that new technologies should integrate into business at a deliberate and careful pace, and that the inappropriate/almost immoral uses are the ones that are going to get all the attention in the first inning, but it’s a long game and most of the productive useful integrations will evolve more slowly and hardly even be noticed,” said Nick Everhart, a Republican political consultant in Ohio and president of Content Creative Media.

Weissman noted that Public Citizen has asked the Federal Election Commission to issue a rule, to the full extent of its authority, prohibiting the use of deceptive deepfakes.

“We think that the agency has authority as it regards candidates but not political committees or others,” Weissman said. “That would be good, but it’s not enough.”

However, it remains unclear how quickly campaigns will adopt AI technology this cycle.

 “A lot of people are saying this is going to be the AI election,” Nellis said. “I’m not entirely sure that’s true. The smart and innovative campaigns will embrace AI, but a lot of campaigns are often slow to adopt new and emerging technology. I think 2026 will be the real AI election.”
