FTC working to fight back against AI voice cloning

INDIANAPOLIS – Cybercriminals are getting better all the time at cloning the human voice to create realistic calls that can trick unsuspecting victims out of money and information.

A savvy crook with the right equipment can use a very small sample, sometimes a single word, to clone a person's voice. That's one more reason to avoid the so-called "can you hear me" phone scam.

If a scammer calls and asks "can you hear me?" and you answer "yes," you've given them a recorded sample of your voice. From there, they can use that sample to call someone else while posing as you.

That's why the Federal Trade Commission is leading an effort to fight back against cutting-edge criminals. The FTC held a contest challenging companies to develop technology to counteract such voice cloning techniques.

“When the FTC announced its Voice Cloning Challenge last year, the main goal was to encourage innovative ways to help protect people from AI-enabled voice cloning harms,” a statement from the FTC said.

Last week, the agency awarded four top prizes to the winning submissions:

  • A solution that would use algorithms to detect whether voice patterns are human or synthetic

  • A technology that would detect voice cloning and deepfakes in real time in incoming phone calls or digital audio, analyzing two-second chunks and assigning a "liveness score"

  • A proposal that would watermark audio with distortions people cannot hear, but that could throw off AI voice cloners so the audio could not be accurately cloned

  • A technology that would authenticate that a voice is human and embed the authentication as a type of watermark

You can learn more about the winning proposals on the Voice Cloning Challenge page.

The Voice Cloning Challenge is part of the FTC's ongoing work to ensure voice cloning technology isn't used by scammers to cause harm. That work includes prevention of harms where possible, a proposed comprehensive ban on impersonation fraud, and applying the Telemarketing Sales Rule to AI-enabled scam calls.

It also includes warning consumers about the use of AI in scams — like when a scammer clones a family member’s voice, calls pretending to be in trouble, and then asks you to send money right away.

For the latest news, weather, sports, and streaming video, head to WTTV CBS4Indy.