A mother in Arizona is warning others about a phone scam that nearly cost her $50,000. Scammers used artificial intelligence to clone the voice of her 15-year-old daughter in a phone call that left her terrified.
Jennifer DeStefano was confused and scared when she picked up the phone to hear her daughter sobbing and telling her she had "messed up."
A man’s voice then came on the line, demanding $1 million to release DeStefano's daughter, before lowering the demand to $50,000 when she said she didn't have the money.
“A mother in Arizona is warning others about a terrifying phone scam involving artificial intelligence that can clone a loved one’s voice. https://t.co/x1AAmQ2KpP” — WITN Headlines, April 11, 2023
DeStefano recalled:
“I pick up the phone and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing. I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.” ...
“This man gets on the phone and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her and I’m going to drop her off in Mexico.’”
“And at that moment, I just started shaking. In the background she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling.”
The call, which came from an unfamiliar number, prompted DeStefano to take action. She kept the man talking, while a friend called the police and another called DeStefano’s husband. Within four minutes, they confirmed her daughter was safe. DeStefano then hung up, realizing that the voice on the phone was just a clone created by artificial intelligence.
DeStefano said:
“She was upstairs in her room going, ‘What? What’s going on?’ Then I get angry, obviously, with these guys. This is not something you play around with.”
Subbarao Kambhampati, a computer science professor at Arizona State University who specializes in AI, said voice cloning technology is improving at a fast pace. Cloning a voice once required a large number of samples from the person being cloned; now, he said, it can be done with just three seconds of audio.
Kambhampati added that the technology can capture inflection and emotion, making a cloned voice difficult to distinguish from the real one. Deep learning technology currently has little oversight, and he warns that while it has good uses, there are also worrisome ones.
The story disturbed many people online, who spoke openly about the dangers of such technology.
“This seems like it should be an urban legend but unfortunately it’s actually happening! #AI #ChatGPT #voiceclone #arizona #Scottsdale https://t.co/5pXTUXNbuZ” — Karen Mensing, April 12, 2023
“AI is the hubris that leads to our demise. Arizona Mother Warns About AI Voice Cloning After Kidnapping Scam https://t.co/uGKPEzR39o” — Darth Darksaber, April 14, 2023
“Voice cloning scams have reached the valley 😱, and I spoke to @SusanCampbellTV of @AZfamily's Good Morning Arizona for this scary segment (aired this AM and again at 6:30pm). And yes, that is my office.. The era of trusting our senses is ending.. 😰 https://t.co/kOces5yzLZ” — Subbarao Kambhampati (కంభంపాటి సుబ్బారావు), April 10, 2023
“@elonmusk This is just another reason to dislike AI technologies, not to mention the absolute destruction of customer service. Arizona Mother Warns About AI Voice Cloning After Kidnapping Scam https://t.co/RsMxCe9XdF” — SDGReports, April 14, 2023
“🚨 Can you trust your ears? 🚨 An Arizona mom falls prey to a terrifying AI voice cloning scam, thinking her daughter was kidnapped! 😱 Are you ready for this level of tech trickery? #PhoneScams #KidnappingHoax #Aiville https://t.co/gmW4Ft4HyJ” — Aiville.com, April 12, 2023
“AI is not going to be good at all! Arizona Mother Warns About AI Voice Cloning After Kidnapping Scam https://t.co/0dNluLCWMg” — laura tibbs, April 14, 2023
“Does your family have a code word? Discussed on pencil and paper of course. https://t.co/zFO1ocHzB5” — Crash, April 15, 2023
“How frightening! https://t.co/FFWGXVONli” — Beth V, April 14, 2023
Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office, said scammers search for prey on social media.
To avoid becoming a victim of similar scams, Mayo urges everyone to ramp up the privacy settings on their profiles. Scammers often ask victims to wire money, send cryptocurrency or pay ransom with gift cards. Getting the money back is practically impossible once it's been transferred.
According to Mayo, warning signs include a call from an unfamiliar area code, a phone number that originates from another country, and a caller who prevents you from speaking with other family members for help.
If someone claiming to know a loved one's whereabouts contacts you, Mayo recommends taking your time, asking questions about the situation, and requesting specific details about the person that a fraudster would not have.