
Regulators Raise Their Voice Over the Risk of AI Voice Cloning

A group of U.S. senators has sent a letter to the Consumer Financial Protection Bureau (CFPB) asking the agency to protect consumers from scams and fraud enabled by voice cloning technology. The senators raise the concern that scammers can use this technology to gain fraudulent access to consumers’ finances and bank accounts.

With a number of startup vendors providing AI tools to clone and generate voices, regulators across sectors are concerned about the potential misuse of this technology. Malicious actors can use voice cloning to trick victims into thinking they’re talking to a relative or, in this case, an authorized employee of a financial institution. Likewise, financial institutions that use voice authentication systems may themselves be vulnerable to breaches enabled by voice cloning.

It is unclear whether a fraudulently induced transaction qualifies as an “unauthorized electronic fund transfer” under the Electronic Fund Transfer Act (EFTA), which would entitle the consumer to the liability protections of the EFTA and Regulation E.

“Voice cloning adds a new, threatening dimension to these scams, allowing fraudsters to generate voice clips to convincingly impersonate friends, family, or potentially even financial advisors and bank employees. Hearing trusted voices amplifies the risks of consumers falling victim to scams.”

Tags

privacy security & data innovations, artificial intelligence