In a startling incident that underscores the potential misuse of artificial intelligence in politics, a robocall featuring an AI-generated voice resembling President Joe Biden circulated in New Hampshire, causing confusion and concern among voters. This incident, which occurred ahead of the state’s primary election, has been described as “outright election interference” and is currently under investigation by the New Hampshire attorney general’s office.
The robocall began with the phrase “What a bunch of malarkey,” a well-known Biden catchphrase, and proceeded to advise Democrats to “save your vote for the November election.” The implication was that voting in the primary would only aid Republicans in their quest to re-elect Donald Trump. This message, delivered in a voice eerily similar to Biden’s, was designed to discourage Democrats from participating in the primary, a clear attempt at voter suppression.
Adding to the confusion, the call concluded with a phone number belonging to Kathy Sullivan, a former New Hampshire Democratic Party chair who now runs a super PAC supporting Biden. Sullivan has vehemently denied any involvement, stating that she did not authorize the call and describing it as "outright election interference."
Unraveling the AI-Generated Joe Biden Robocall: Misinformation, Investigation, and Implications
The incident has raised serious concerns about the potential for AI to be used in misleading or malicious ways during election periods. The use of an AI-generated voice that closely mimics Biden’s is a particularly troubling aspect of this incident. It highlights the sophistication of the technology involved and the ease with which it can be used to spread misinformation and sow confusion among voters.
The origin of the robocall and the extent of its reach remain unclear. Lists of voters' phone numbers can be readily purchased from data brokers, and it is not known how many voters received the call or which voters were targeted. CNN reviewed audio of the call obtained from the anti-robocall application Nomorobo, whose detection of the call suggests it went out in large volumes.
The New Hampshire attorney general's office is investigating the matter as an "unlawful attempt to disrupt the New Hampshire Presidential Primary." So far, no individual or group has been definitively identified as the source of the call.
How Resemble Detect Can Prevent AI-Generated Election Interference
The incident has underscored the urgent need for tools and measures to combat the misuse of AI in politics. One such tool is Resemble Detect, an AI system that analyzes audio files and determines whether they were generated by an AI. It uses machine learning models to compare the characteristics of an audio file with those of known AI-generated voices. In the case of the fake Joe Biden robocall, Resemble Detect could have been used to confirm that the voice in the call was AI-generated, providing evidence of foul play.
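To illustrate the general idea of comparing an audio file's characteristics against known voice profiles, here is a deliberately simplified sketch. It is not Resemble Detect's actual method or API (which is proprietary); the feature set, profiles, and thresholds below are invented for demonstration, and real detectors use far richer spectral and model-based features.

```python
import math

def extract_features(samples):
    """Toy feature extractor: zero-crossing rate and RMS energy.
    Real detection systems use much richer spectral features."""
    zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / (len(samples) - 1)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return (zcr, rms)

def distance(f1, f2):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f1, f2)))

# Hypothetical reference profiles: average features of known
# AI-generated voices vs. natural recordings (made-up numbers).
REFERENCE_PROFILES = {
    "ai-generated": (0.11, 0.71),
    "natural": (0.30, 0.20),
}

def classify(samples):
    """Label audio by its nearest reference profile."""
    feats = extract_features(samples)
    return min(REFERENCE_PROFILES,
               key=lambda label: distance(feats, REFERENCE_PROFILES[label]))

# Example: a synthetic 440 Hz tone sampled at 8 kHz for one second.
tone = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]
print(classify(tone))
```

The nearest-profile comparison is the core idea: an unfamiliar audio clip is scored by how closely its measurable characteristics resemble those of audio already known to be machine-generated.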
Moreover, Resemble Detect could be used proactively to monitor calls and flag potential AI-generated voices. This could help in identifying and stopping such calls before they reach a large number of people. It could also be used to alert authorities and the public about the presence of such calls, helping to counteract their potential impact.
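A proactive monitoring workflow like the one described above could be sketched as follows. This is a hypothetical pipeline, not Resemble Detect's real interface: `score_audio` is a stand-in for whatever detection service is used, and the threshold value is an assumption that a real deployment would tune against false positives.

```python
FLAG_THRESHOLD = 0.8  # assumed cutoff; real systems tune this carefully

# Stand-in for scores a detection service would return per call recording.
PRECOMPUTED_SCORES = {"call-001": 0.95, "call-002": 0.12}

def score_audio(audio_id):
    """Hypothetical stand-in for a detection API: returns the estimated
    probability that the recording is AI-generated."""
    return PRECOMPUTED_SCORES.get(audio_id, 0.0)

def screen_calls(audio_ids, threshold=FLAG_THRESHOLD):
    """Flag recordings whose AI-likelihood score exceeds the threshold,
    so they can be escalated to authorities or blocked."""
    return [a for a in audio_ids if score_audio(a) >= threshold]

print(screen_calls(["call-001", "call-002"]))
```

Screening calls in bulk this way is what would let a carrier or monitoring service flag a suspicious robocall campaign early, before it reaches a large number of voters.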