
‘Deepfake’ of Biden’s Voice Called Early Example of US Election Disinformation

In this file photo, U.S. President Joe Biden speaks in the East Room of the White House on January 19, 2024. (AP Photo/Evan Vucci, File)

A fake voice message claiming to be U.S. President Joe Biden was recently sent to voters in an example of possible election disinformation.

The voice message is an example of what is known as a “deepfake.” A “deepfake” is a piece of audio or video created to make it appear that people in it are saying or doing things that they never did. Newly developed artificial intelligence (AI) tools make it easier for people to make and publish such content.

The fake voice of Biden was included in a political message sent by telephone directly to voters in the northeastern state of New Hampshire. This marketing method, known as “robocalling,” involves a machine that calls large numbers of people to play them a prerecorded message.

The calls are being investigated by New Hampshire election officials. The state recently held America’s first presidential primary contest. Primary elections in the United States usually choose the candidates for the presidential election in November.

It is not known who created the fake audio, which was heard by reporters from several news organizations. In it, Biden seems to try to persuade voters not to take part in the primary.

The voice says, “Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again.” The message continues, “Your vote makes a difference in November, not this Tuesday.”

Voting in state primaries does not affect a person’s vote later in the U.S. presidential election, which is set for November 5, 2024.

The Biden robocalls even included language the U.S. president has often used in the past, specifically the phrase, “What a bunch of malarkey.” The phrase is used to describe something foolish, unreasonable or meaningless. The phone message advises listeners to “save your vote for the November election.”

It is not known how many people received the call with Biden’s voice. Biden did not campaign in New Hampshire, and his name was not on the state’s primary ballot. This is because his campaign decided to begin the primary process in the state of South Carolina instead. However, New Hampshire state law requires the state to hold the nation’s first primary.

White House press secretary Karine Jean-Pierre confirmed the call was “fake and not recorded by the president.” The campaign for former President Donald Trump said it was not involved with the call.

The head of Biden’s re-election campaign, Julie Chavez Rodriguez, said in a statement the organization was investigating the incident and “actively discussing additional actions to take.” Rodriguez said the robocall was a clear attempt “to suppress voting and deliberately undermine free and fair elections.”

Robert Weissman is head of the nonprofit citizen activist group Public Citizen. He said in a statement the Biden robocall provided fresh evidence that “the political deepfake moment is here.” He urged governments to consider passing legislation to limit the use of deepfakes to prevent election “chaos.”

Deepfakes created with AI technology have already appeared in campaign advertisements in the 2024 presidential race, the Associated Press (AP) reports. The technology has also been used to spread false information during election campaigns across the world, from Slovakia to Indonesia to Taiwan.

Hany Farid is a digital investigations expert at the University of California, Berkeley. He told the AP the Biden robocall demonstrates how AI methods can be “weaponized” in elections. Farid added that the recent use of audio and video to mislead voters “is surely a sign of things to come.”

As AI technologies improve, governments around the world are seeking ways to restrict them from causing public harm. The U.S. Congress has yet to pass legislation to limit the use of such technologies in the political process. The Federal Election Commission is currently considering public comments on a petition for it to restrict AI-created deepfakes in campaign advertisements.

David Becker heads the nonprofit Center for Election Innovation and Research. He told the AP the use of deepfakes is just the latest tool in a long history of “dirty tricks” aimed at influencing the political process.

Becker added that the goal of such efforts can be to confuse voters to a point where they no longer recognize false information from the truth. “They don’t need to convince us that what they’re saying, the lies they’re telling us, are true,” he said. “They just need to convince us that there is no truth, that you can’t believe anything you’re told.”

I’m Bryan Lynn.

Bryan Lynn wrote this story for VOA Learning English, based on reports from The Associated Press, Reuters and online sources.



Words in This Story

fake – adj. false, not true

quest – n. an attempt to get or do something difficult

deliberate – adj. done purposefully, or planned

undermine – v. to make something weaker or less effective

chaos – n. a situation in which there is no order at all

petition – n. a formal, written request asking an authority to do something

confuse – v. to make someone unable to think clearly or understand something

convince – v. to make someone believe that something is true