Furthermore, AI imitations of either the current or former president could fuel deceitful campaign ads and other forms of disinformation. Voters could be misinformed, confused, or otherwise led astray by convincing imitations of either candidate.
Imran Ahmed, the CEO of CCDH, warns there aren’t enough “guardrails” to contain the risk that AI will be used to spread misinformation. He also believes the skill required to imitate the voices of Biden or Trump is so minimal that virtually anyone could do it.
CCDH tested six different AI voice-cloning tools: Descript, ElevenLabs, Veed, Speechify, Invideo AI, and PlayHT. Afterward, the organization concluded these tools have a long way to go before they are adequately safeguarded against political misinformation.
During its reviews, CCDH tested the tools’ capacity to imitate the voices of not only Trump and Biden but also other political leaders around the world.
This could prompt more calls for regulations
Polling already shows that a significant share of Americans favor AI regulation. The CCDH’s findings only further underscore the risks posed by a lack of control over artificial intelligence.