How to protect yourself from AI voice scams

Scripps News digs into the growing dilemma surrounding deepfakes, scams, and AI regulation.

Using artificial intelligence, everyone from hobbyists to politicians to crooks is flooding the internet with fake media. But voice cloning may be the technique that currently poses the greatest threat to unsuspecting victims.

"AI is revolutionizing and unraveling the very foundation of our social fabric by creating doubt and fear in what was once never questioned, the sound of a loved one's voice," said Jennifer DeStafano of Scottsdale, Arizona, in a recent Senate hearing.

"Over a quarter of the population has direct impact to these AI-powered voice scams, and that would be either directly impacted or knowing somebody that has been impacted," said Steve Grobman, the Chief Technology Officer with security firm McAfee.

Voice cloning is one type of deepfake, a term that combines "deep learning" and "fake." Deepfakes are realistic but fabricated images, videos, or audio in which real people appear to do or say things they never did. Visual examples range from the merely amusing, like the fake photo of Pope Francis wearing a white puffer jacket, to deepfakes that depict an individual engaged in a sexual act. The Ron DeSantis campaign recently tweeted a video showing still images of Donald Trump hugging and kissing infectious disease expert Dr. Anthony Fauci. Digital forensic expert Hany Farid told CNN that inconsistencies in the image give it away as AI-generated.

Meanwhile, scams that use voice cloning are becoming increasingly common, according to McAfee. One in ten adults surveyed worldwide by the firm this year said they had been targeted by a voice cloning scam. An additional 15% said they knew someone who had been targeted.

Imposter scams of all types, whether they use traditional methods or voice cloning, resulted in losses of $2.6 billion in the U.S. in 2022, according to the Federal Trade Commission. Scammers can use as little as three seconds of recorded speech to create realistic fake audio of someone the targeted victim knows, says Grobman.

"For more technically adept cybercriminals, there are low-level tools with lots of dials and knobs where they can dial in exactly how they want things to come out. But there are also tools that are simple websites, where a non-technical cybercriminal can simply upload a voice sample and then use the website to generate the dialog," he said.

And criminals have myriad sources of individual voices to exploit. "There's lots of voice in the digital world that we live in today," added Grobman. "It can be in YouTube videos, other social media. It can even be something as simple as a voicemail greeting."

Once a scammer has a voice sample, says Grobman, "they'll basically train an artificial intelligence model with that sample, which will then allow them to create any sort of dialog or narrative that sounds almost identical to the person being impersonated, someone the target knows."

DeStefano told a U.S. Senate subcommittee in June about a disturbing phone call she had received from an unknown number. "On the other end was our daughter Briana sobbing and crying, saying, 'Mom, I messed up,'" she recalled. "What happened?" DeStefano asked. Suddenly, a man's voice barked at her daughter to "lay down and put your head back."

"At that moment, I started to panic," DeStefano told the lawmakers. "My concern escalated, and I demanded to know what was going on, but nothing could have prepared me for her response. Mom, these bad men have me; help me, help me!"

"A threatening and vulgar man took over the call," she continued. He told her: "Listen here, I have your daughter; you tell anyone; you call the cops; I am going to pump her stomach so full of drugs; I am going to have my way with her; drop her in Mexico; and you'll never see her again!"

But the call was a hoax built on a cloned voice: DeStefano soon confirmed that her daughter was sleeping safely at home.

Other victims, however, have sent thousands of dollars to scammers, thinking their loved ones, impersonated by AI, had been kidnapped or stranded.

So what can consumers do to protect themselves from one of these scams? "First and foremost, if you get a phone call that sounds like a loved one is in distress, do everything in your power to make contact with the person on the other end of the line," says Grobman. "Ask a question that only the person on the other end of the line should know the answer to: 'Can you remind me what the last movie we saw was?' or 'What did we have for dinner when we ate together last?'"

Grobman says there are also steps that can keep individuals from being targeted in the first place. "Limiting their public digital footprint is key. Cleaning up your name, address, and personal details from data brokers makes it more difficult for scammers to create the narratives that they can use to build a scam."

This story has been edited to reflect the correct spelling of Grobman.