Jennifer DeStefano shared a frightening encounter in which phone scammers used AI voice cloning technology to deceive her into thinking her teenage daughter had been abducted.
DeStefano, a mother from Arizona, said she received a call from an unfamiliar number while her 15-year-old daughter was away, and she answered out of worry that something had happened to her.
DeStefano recounted that the caller's voice sounded just like her daughter's, leading her to believe her daughter was on the other end of the line.
Arizona Mother Targeted by AI Voice Scam
According to Fox News, the caller, crying and sounding exactly like her daughter, said, "Mom, I messed up." Suddenly, a man came on the line, threatening to harm her daughter.
The Arizona mother recalled the man saying they would drug her daughter and have their way with her, and threatening to leave her in Mexico.
DeStefano started trembling uncontrollably when she heard what sounded like her daughter's desperate cries in the background: "Help me, Mom. Please help me. Help me."
She was at her other daughter's dance studio at the time, and other mothers there helped her by calling 911 and her husband.
As reported by WKYT, within a few minutes they were able to confirm that her daughter, who was away on a ski trip, was safe.
It turned out that the voice on the phone was generated by an AI voice clone, and her daughter knew nothing about the call.
DeStefano said she was still shaken by the experience: the voice had her daughter's inflection and tone, which unnerved her deeply.
AI Voice Cloning Scams
The incident reflects a broader trend: scammers are using AI-generated voice clones to deceive people into handing over money by claiming to have taken their loved ones hostage.
The Federal Trade Commission (FTC) has issued a warning to people to be cautious of such calls.
The FTC notes that these criminals use AI to make their schemes more convincing and typically demand payment through wire transfers, cryptocurrency, or gift cards.
To avoid falling victim to these scams, the FTC advises hanging up and contacting the person in question directly to verify the story.
According to The Washington Post, the number of impersonation scams has been rising, and in 2022, over 36,000 people reported being scammed by imposters pretending to be friends or family.
FTC officials also reported that more than $11 million was lost across over 5,100 such incidents that took place over the phone.
In summary, scammers are using AI-generated voice clones to trick people into sending money through impersonation schemes. Such scams are on the rise and causing significant financial losses, so it is crucial not to trust these calls and to verify the story directly with the person involved.