AI-generated robocall impersonates Biden in apparent bid to suppress votes
The New Hampshire attorney general's office said it is investigating reports of an apparent robocall that used artificial intelligence (AI) to mimic US President Joe Biden's voice and discourage voters in the state from coming to the polls during Tuesday's primary election.
Attorney General John Formella said the recorded message, which was sent to multiple voters on Sunday, appears to be an illegal bid to disrupt and suppress voting.
He said voters "should disregard the contents of this message entirely".
A recording of the call reviewed by the Associated Press generates a voice similar to Mr Biden's that includes his often-used phrase: "What a bunch of malarkey."
It then tells the listener to "save your vote for the November election".
"Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again," the voice mimicking Mr Biden says.
"Your vote makes a difference in November, not this Tuesday."
It is not true that voting in Tuesday's primary precludes voters from casting a ballot in November's general election.
Mr Biden is not campaigning in New Hampshire and his name will not appear on Tuesday's primary ballot after he elevated South Carolina to the lead-off position for the Democratic primaries, but his allies are running a write-in campaign for him in the state.
It is not known who is behind the calls, though they falsely showed up to recipients as coming from the personal mobile phone number of Kathy Sullivan, a former state Democratic Party chairwoman who helps run Granite for America, a super-PAC supporting the Biden write-in campaign.
Ms Sullivan said she alerted law enforcement and issued a complaint to the attorney general after multiple voters in the state reported receiving the call on Sunday night.
"This call links back to my personal cell phone number without my permission," she said in a statement.
"It is outright election interference, and clearly an attempt to harass me and other New Hampshire voters who are planning to write in Joe Biden on Tuesday."
It was unclear how many people received the call, but a spokesperson for Ms Sullivan said she heard from at least a dozen people who got it on Sunday night.
The attorney general's office said anyone who has received the call should email the state Justice Department's election law unit.
Mr Biden's campaign manager, Julie Chavez Rodriguez, said in a statement that the campaign is "actively discussing additional actions to take immediately".
"Spreading disinformation to suppress voting and deliberately undermine free and fair elections will not stand, and fighting back against any attempt to undermine our democracy will continue to be a top priority for this campaign," she said.
The apparent attempt at voter suppression using rapidly advancing generative AI technology is one example of what experts warn will make 2024 a year of unprecedented election disinformation around the world.
Generative AI deepfakes already have appeared in campaign ads in the 2024 presidential race, and the technology has been misused to spread misinformation in multiple elections across the globe over the past year, from Slovakia to Indonesia and Taiwan.
"We have been concerned that generative AI would be weaponised in the upcoming election and we are seeing what is surely a sign of things to come," said Hany Farid, an expert in digital forensics at the University of California who reviewed the call recording and confirmed it is a relatively low-quality AI fake.
Katie Dolan, a spokeswoman for the campaign of Dean Phillips of Minnesota, who is challenging Mr Biden in the Democratic primary, said Mr Phillips' team was not involved and only found out about the deepfake attempt when a reporter called seeking comment.
"Any effort to discourage voters is disgraceful and an unacceptable affront to democracy," Ms Dolan said in a statement.
"The potential use of AI to manipulate voters is deeply disturbing."
The Trump campaign said it had nothing to do with the recording but declined further comment.