Lingo Telecom agrees to $1 million fine over AI-generated Biden robocalls

[August 22, 2024]  By David Shepardson
 
(Reuters) - Lingo Telecom agreed to pay a $1 million fine over fake robocalls it transmitted that imitated President Joe Biden and sought to dissuade people from voting for him in New Hampshire's Democratic primary election, a U.S. government regulator said on Wednesday.

[Photo: A man looking at his phone is seen through a digitally decorated glass. REUTERS/Aly Song/File Photo]

The Federal Communications Commission said Lingo transmitted spoofed robocalls that used generative artificial intelligence voice-cloning technology "to spread disinformation." The calls were directed by political consultant Steve Kramer, who has been charged by the New Hampshire state attorney general's office.

The FCC earlier proposed fining Lingo $2 million for allegedly transmitting the robocalls in January. Under the settlement, the FCC said, Lingo will implement a compliance plan requiring strict adherence to FCC caller ID authentication rules.

Lingo did not immediately respond to a request for comment.

"This settlement sends a strong message that communications service providers are the first line of defense against these threats and will be held accountable to ensure they do their part to protect the American public," FCC Enforcement Bureau Chief Loyaan Egal said.

Kramer faces charges after thousands of New Hampshire residents received a robocall message asking them not to vote until November.

Kramer told media outlets in February, after the calls came to light in January, that he had paid $500 to have them sent to voters to call attention to the issue. He had worked for Biden's challenger for the Democratic presidential nomination, U.S. Representative Dean Phillips, who denounced the calls.

The FCC has separately proposed fining Kramer $6 million over the robocalls.

The commission last month voted to propose requiring broadcast radio and television political advertisements to disclose whether content is generated by AI.

There is growing concern in Washington that AI-generated content could mislead voters in the Nov. 5 presidential and congressional elections. The FCC has said AI will likely play a substantial role in 2024 political ads.

The proposed rule would require on-air and written disclosures and cover cable operators, satellite TV and radio providers. The FCC does not have the authority to regulate internet or social media ads or streaming services.

(Reporting by David Shepardson; editing by Jonathan Oatis)

[© 2024 Thomson Reuters. All rights reserved.]
