A mother has said she was “100 per cent” convinced by an AI voice clone of her daughter that scammers used in a faked kidnapping attempt.
Jennifer DeStefano from Arizona picked up a call from an unknown number and heard what she believed to be her 15-year-old daughter Brie “sobbing”.
The voice on the other end of the line said, “mom, I messed up”, before a male voice took over and made threatening demands.
“This man gets on the phone, and he’s like, ‘listen here, I’ve got your daughter’,” Ms DeStefano told local news show WKYT.
The apparent kidnapper told the mother: “You call the police, you call anybody, I’m going to pop her so full of drugs, I’m going to have my way with her, and I’m going to drop her off in Mexico.”
In the background, Ms DeStefano said she could hear her daughter saying “help me, mom, please help me” and crying.
“It was 100 per cent her voice,” Ms DeStefano said.
“It was never a question of who is this? It was completely her voice, it was her inflection, it was the way she would have cried – I never doubted for one second it was her. That was the freaky part that really got me to my core.”
The apparent kidnapper demanded $1 million for the daughter to be released, before lowering the figure to $50,000. Ms DeStefano only realised her daughter was unharmed after a friend called her husband, who confirmed that Brie was safe.
The scammer, whom police are still investigating, appears to have used AI voice cloning technology, which has become increasingly adept in recent years at mimicking people’s voices.
It is also relatively easy to access and use, with AI tools freely available on the internet.
AI-generated voices have already been used in films to replicate actors such as James Earl Jones, who voiced the original Darth Vader character in the Star Wars franchise.
It has also been touted as a way for authors to produce audiobooks without spending hours in a studio reading their work.
The technology has also raised fears that it could be misused by scammers or people attempting to create deepfake videos, with one voice clone startup warning that it could be “harmful in the wrong hands”.
In a press release last year, the firm Respeecher said: “The widespread attention and adoption of deepfake technologies will accelerate the need to regulate the use of the technology.”