SHOCKING: Arizona mom receives AI-generated ‘kidnapping’ call with daughter's cloned voice

"It was completely her voice. It was her inflection. It was the way she would have cried."

Hannah Nightingale, Washington DC

An Arizona mother was the victim of a terrifying phone scam involving the use of artificial intelligence to clone her 15-year-old daughter’s voice.

Speaking with WKYT, Jennifer DeStefano said that she got a call from an unknown number and almost let it go to voicemail. Her daughter, though, was out of town skiing, so she decided to pick up the phone, fearing there might have been an accident.

"I pick up the phone and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing," DeStefano said. "I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying." 

Over the phone, DeStefano said she heard the voice of a man say "put your head back, lie down."

It was at this moment that DeStefano’s confusion over the phone call turned into fear.

"This man gets on the phone and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her and I’m going to drop her off in Mexico,’" DeStefano said. "And at that moment, I just started shaking. In the background she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling."

The man demanded $1 million from DeStefano, but lowered it to $500,000 when she said she didn’t have the funds to pay such a sum.

DeStefano kept the man talking. She had taken the phone call while at her other daughter’s dance studio, and was surrounded by other worried moms, one of whom called 911, and another called DeStefano’s husband.

Within four minutes, DeStefano and the other mothers were able to confirm that her daughter was safe.

"She was upstairs in her room going, ‘What? What’s going on?’ Then I get angry, obviously, with these guys. This is not something you play around with," DeStefano said. She hung up the phone once she knew her daughter was safe.

DeStefano said that she "never doubted for one second it was her. That’s the freaky part that really got me to my core."

"It was completely her voice. It was her inflection. It was the way she would have cried."

Subbarao Kambhampati, a computer science professor at Arizona State University specializing in AI, told the outlet that voice cloning technology has rapidly improved in recent years, to the point that just three seconds of audio is enough to clone a person’s voice.

"And with the three seconds, it can come close to how exactly you sound," Kambhampati said. "Most of the voice cloning actually captures the inflection as well as the emotion."

He said that deep learning technology currently has very little oversight and has become easier for the public to access and use.

“It’s a new toy, and I think there could be good uses, but certainly there can be pretty worrisome uses too,” he said.

Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office, said that scammers who use such technology often find voice samples of their targets on social media, and he urged the public to keep their social media profiles private.

"You’ve got to keep that stuff locked down. The problem is, if you have it public, you’re allowing yourself to be scammed by people like this, because they’re going to be looking for public profiles that have as much information as possible on you, and when they get a hold of that, they’re going to dig into you," Mayo said.

Mayo said that anyone who finds themselves in a similar situation should ask "a bunch of questions," since the loved one a scammer claims to be holding will have personal details that the target knows but the scammer does not.

"You start asking questions about who it is and different details of their background that are not publicly available, you’re going to find out real quick that it’s a scam artist."

Mayo said that this kind of scam call, a fake kidnapping or family emergency using an AI-cloned voice, "happens on a daily basis," but added that not everyone reports them.
