Experts warn of AI’s impact on 2024 US election as deepfakes go mainstream and social media guardrails erode
Several experts have expressed concern that the upcoming presidential election will bring a surge in the spread of false information.
Nearly three years after rioters stormed the U.S. Capitol, false election conspiracy theories about suitcases of ballots, late-night ballot dumps, and dead people voting still circulate on social media and television news. Experts say it will almost certainly get worse in the presidential race: the safeguards that tried to contain false claims last time are crumbling, while the tools and systems that spread them are growing stronger.
Former President Donald Trump has encouraged many Americans to believe U.S. elections are unreliable; 57% of Republicans believe Joe Biden was not lawfully elected president.
Generative artificial intelligence tools have made it cheaper and easier to spread the kind of misinformation that could affect elections. And social media companies that once invested heavily in correcting the record have shifted their priorities.
Election photos and videos have been manipulated for years, but 2024 will be the first U.S. presidential election in which advanced AI tools capable of producing convincing fakes in seconds are widely available.
Deepfake images, videos, and audio have already appeared in experimental presidential campaign ads. Oren Etzioni, an artificial intelligence expert, warned that more malicious versions could spread undetected and mislead people in the days before an election.
He remarked, “You could see a political candidate like President Biden being rushed to a hospital. You may watch a candidate say things they never stated.”
The Federal Election Commission and lawmakers from both parties in Congress are considering ways to regulate the technology but have not yet settled on any rules. A few states have laws requiring deepfakes to be labeled or forbidding those that misrepresent candidates. YouTube and Meta, which owns Facebook and Instagram, have introduced policies for labeling AI-generated content, but whether they can reliably catch violators is unclear.

About a year ago, Elon Musk bought Twitter, fired its leadership, dismantled some of its essential features, and rebranded it as X.
He has since upended its verification system, exposing public officials to impersonators. He eliminated the teams that fought misinformation on the platform, leaving moderation largely to users. And he reinstated the accounts of previously banned conspiracy theorists and extremists.

YouTube announced in June that it would stop removing videos that falsely claim the 2020 election or prior U.S. elections were marred by “widespread fraud, errors or glitches.” According to the platform, the policy protects the ability to “openly debate political ideas, even those that are controversial or based on disproven assumptions.”
Since 2020, X, Meta, and YouTube have collectively cut thousands of employees and contractors, including content moderators. Meta says on its website that it has 40,000 people working on safety and security, and it regularly takes down networks of fake social media accounts that seek to sow discord. YouTube spokesperson Ivy Choi said the platform’s recommendation systems and information panels deliver accurate election news to viewers.
Meanwhile, TikTok and other less regulated platforms like Telegram, Truth Social, and Gab have created still more online information silos where false claims can spread, and messaging apps such as WhatsApp and WeChat rely on private conversations, making misinformation hard to spot. Misinformation specialists warn that Trump’s lead in the Republican presidential primary could intensify election misinformation and lead to election vigilantism or violence.