With 102 days to go until the 2024 U.S. presidential election, the FCC on Thursday announced it would move forward with new proposed guidelines requiring political ads on TV and radio to include on-air disclosures if AI-generated content was used.
“Today, the FCC takes a major step to guard against AI being used by bad actors to spread chaos and confusion in our elections,” FCC Chairwoman Jessica Rosenworcel said in a statement announcing the new rule. “There’s too much potential for AI to manipulate voices and images in political advertising to do nothing. If a candidate or issue campaign used AI to create an ad, the public has a right to know.”
AI tools have already been used to interfere with the 2024 presidential race, the FCC noted, citing AI-generated robocalls that mimicked President Joe Biden’s voice and discouraged people from voting in New Hampshire's primary.
The new rules would not impact online political advertisements. And with the November presidential election only around three months away, it's unclear whether they would go into effect before Election Day. After the commission's 3-2 vote on Thursday, the proposal now advances to a 30-day public comment period, followed by a 15-day reply period before the rule can be finalized.
Even if the rules do go into effect by Nov. 5, experts fear they don't go far enough in addressing the corrosive effects of AI-generated disinformation.
Sander van der Linden, a professor of social psychology in society at the University of Cambridge, told Courthouse News that while the proposal is significant, it is "likely insufficient on its own."
“We know that at a psychological level, warning people about the presence of deepfakes does not necessarily mitigate its harmful effects,” he said. He noted that while such fake videos violate the policies of social media platforms like Facebook, hundreds nonetheless surfaced during the UK election. "Is there sufficient capacity to enforce the policy?"
Matt Motta, an assistant professor of health law, policy and management at Boston University, noted that the FCC proposal does not require disclosure in advertisements shared online or on streaming services — including videos posted on social media and image-based banner advertisements.
Still, Motta stressed the rule "could have important consequences for the upcoming election if finalized within the next few months, as it would make it more difficult for campaigns to share ‘deep fake’ alterations of audio and/or visual content, in order to attempt to manipulate voters."
Nearly half of all states have enacted laws to regulate the use of AI and deepfake technology in elections.