
Will AI Deepfakes Threaten Election Integrity?

Fake news article on smartphone | Image by McLittle Stock/Shutterstock

Advancements in certain technologies are prompting concerns over the spread of misinformation ahead of the presidential election in 2024.

With artificial intelligence rapidly improving, powerful tools are increasingly available to anyone wishing to spread misinformation. At the same time, social media companies are reportedly devoting less effort to countering false narratives as they redirect their focus. Some experts fear it will be difficult to stop bad actors from swaying elections, according to the Associated Press.

Oren Etzioni, an artificial intelligence expert and professor emeritus at the University of Washington, said he anticipates a “tsunami of misinformation.”

“I can’t prove that. I hope to be proven wrong. But the ingredients are there, and I am completely terrified,” said Etzioni.

Of particular concern are AI deepfakes: realistic fake images and audio that are often difficult to distinguish from genuine photographs and recordings. These deepfakes could theoretically be used in campaign ads or to drive false narratives on social media.

In fact, Larry Norden, senior director of the elections and government program at the Brennan Center for Justice, said high-tech fakes have already impacted elections globally. In Slovakia, audio generated by artificial intelligence was used to impersonate a candidate discussing plans to raise the price of beer and rig the election. Despite fact-checkers' attempts to debunk the recording, it spread widely across social media.

The Federal Election Commission and Congress are considering regulations to tackle political AI deepfakes. Some states, like Texas, have already adopted legislation to help address the growing issue.
