JAKARTA - A Democratic Party political consultant from Louisiana has been charged over a fake robocall imitating US President Joe Biden's voice, aimed at discouraging people from voting for him in New Hampshire's Democratic presidential primary, the New Hampshire Attorney General's Office said on Thursday, May 23.

Steven Kramer, 54, faces 13 felony counts of voter suppression and 13 misdemeanor counts of impersonating a candidate after thousands of New Hampshire residents received a robocall message urging them not to vote until November. Kramer faces a series of preliminary court hearings beginning June 14 at the Merrimack County Superior Court.

An attorney for Kramer could not immediately be identified. Kramer himself did not respond to requests for comment from the media.

Kramer told CBS and NBC in February, after the calls came to light in January, that he had paid USD 500 to have them sent to voters in order to draw attention to the issue. He had worked for Biden's primary challenger, Rep. Dean Phillips, who criticized the calls.

Separately, the Federal Communications Commission (FCC) on Thursday proposed a fine of USD 6 million (IDR 96.2 billion) against Kramer over the robocall, which it said used AI-generated deepfake audio cloned from Biden's voice, arguing that its rules prohibit the transmission of inaccurate caller ID information.

"When a caller sounds like a politician you know, a celebrity you love, or a familiar family member, any of us could be tricked into believing something that is not true by calls using AI technology," said FCC Chair Jessica Rosenworcel.

The FCC also proposed a fine of USD 2 million (IDR 32 billion) against Lingo Telecom for allegedly transmitting the robocall.

There is growing concern in Washington that AI-generated content could mislead voters in presidential and congressional elections in November. Some senators want to pass laws before November that will address AI threats to election integrity.

"New Hampshire remains committed to ensuring that our elections remain free from unlawful interference, and our investigation into this matter is ongoing," said Attorney General John Formella, a Republican.

Formella said he hopes the state and federal actions "send a strong signal of deterrence to anyone who might consider disrupting elections, whether through the use of artificial intelligence or other means."

A spokesperson for Joe Biden's campaign said the campaign "has formed an interdepartmental team to prepare for the potential impact of AI on this election, including the threat of malicious deepfakes." The team has been in place since September "and has various tools available to address the issue."

On Wednesday, May 22, Rosenworcel proposed requiring the disclosure of artificial intelligence (AI)-generated content in political advertisements on radio and TV, covering both candidate and issue ads, though without prohibiting AI-generated content.

The FCC says the use of AI is expected to play a substantial role in political advertising in 2024. It highlighted the potential for misleading "deepfakes," which are "altered images, videos, or audio recordings that depict people doing or saying things they did not actually do or say, or events that did not actually occur."
