At some point in the months leading up to the 2024 election, a tape will leak that will confirm voters’ worst fears about President Joe Biden. The audio, a bit grainy and muffled as if it was recorded from a phone in someone’s pocket, will have the 80-year-old sounding confused, perhaps seeming to forget that he’s president, before turning murderously angry. It may arrive in journalists’ inboxes from an anonymous whistleblower, or just go viral on social media.

Or maybe the uproar will be over audio of former President Donald Trump saying something that his supporters find disqualifying.

Whether such a clip is real or the work of new, startlingly realistic generative AI models, the affected politician will call it a fake and evidence of the other side’s willingness to lie, cheat and steal their way to the White House. And while generative AI experts say they will most likely be able to detect the fakes, it would be impossible to prove a recording is real. And it’s another question, and a doubtful one at that, whether such evidence of an audio clip’s provenance will matter to partisan voters so ready to reject any data point that doesn’t conform to their worldviews.

Deepfake audio, authentic-sounding but false recordings built from short snippets of a subject talking, has become so realistic that it can fool your own mother, presenting painfully obvious potential for underhanded political tactics. AI developers warn that the technology’s rapid development and widespread deployment risk ushering in an epistemological dystopia that would undermine the foundations of representative democracy.

“Campaigns are high stakes,” said Hany Farid, a generative AI expert at the University of California, Berkeley. “We know that we have state-sponsored actors interfering, we know the campaigns are going to play dirty tricks. We know the supporters are going to do it.”

Testifying before a Senate Judiciary subcommittee in May, OpenAI CEO Sam Altman called AI’s ability to generate disinformation personalized to individual targets one of his gravest concerns. A United Nations adviser recently told Fox News that a deepfake October surprise was his deepest worry.

Already deployed

Campaigns have already deployed deepfake technology in less malicious ways in the GOP presidential battle. Never Back Down, a PAC backing Florida Gov. Ron DeSantis’ presidential campaign, used AI to get a fake Trump to read a post made on Truth Social, making it sound like he had called into a radio show. Before he dropped out of the GOP presidential nomination race, a super PAC supporting Miami Mayor Francis Suarez posted videos of “AI Francis Suarez” that touted the accomplishments of “my namesake, conservative Miami Mayor Francis Suarez.”

Editing media to mislead voters is not new and doesn’t require AI. A video of Biden visiting Maui after the devastating fire there was doctored to add chants cursing out the president. And right-wing pundits recently claimed, falsely, that Biden fell asleep during a memorial for the victims, pointing to a low-quality video of Biden looking down for a few seconds.

Campaign attack ads have long used the most unflattering pictures of their opponents, often rendered in more menacing black and white, to make them look like shifty-eyed liars. But generative AI will supercharge the ability of campaigns, and their rogue supporters, to produce believable fakes.

Widely available generative AI tools today can produce images that appear real at a glance but fall into the uncanny valley upon closer inspection, like the pictures of Trump hugging Dr. Anthony Fauci that Never Back Down showed in an attack ad. At first view, nothing may seem amiss during the five seconds the collage appears under the heading “REAL LIFE TRUMP”; only when pausing the ad does it become clear how unnatural the hands look or how, in one image, Trump appears to be kissing Fauci on his eye.

Replicating someone’s voice in a believable manner was nearly impossible a few years ago; even the best impressionists could only get close. Today, companies like Respeecher need to analyze only a few minutes of a person’s voice to generate a convincing sonic replica. And that fake can be directed to say anything. “It’s shocking how good it is,” said Farid.

While computers alone can’t yet produce an eye-fooling video, someone with the requisite editing prowess could polish AI’s output enough to create something that seems real when viewed on a phone or other small screen, Farid said. “There’s no putting the generative-AI genie back in the bottle,” Farid said. And given computing power’s long trend of exponential growth, it’s only a matter of time before an AI text-to-video platform’s movie magic alone will be able to trick us.