Artificial intelligence (AI) is a powerful tool that can be misused for criminal purposes. The biggest risk experts see in AI is its ability to blur the line between what is real and what is not. Cybercriminals, in particular, can use it to spread disinformation. A new breed of scam has emerged in the US in which fraudsters use AI voice cloning tools to impersonate family members and steal from people, and it is becoming increasingly prevalent.
Jennifer DeStefano, a mother from Arizona, received a call that she believed was from her 15-year-old daughter, who was away on a skiing trip. The voice on the line was crying and pleading for help, and DeStefano never doubted for a second that it was her daughter. It turned out to be an AI clone, and the scammer demanded up to $1 million. The case is now under police investigation.
AI voice cloning tools are readily available online and can produce a convincing copy of a person’s voice from only a small sample of real speech, which is easily obtained from content posted online. With a cloned voice, scammers can extract information and funds from victims far more effectively. The AI-powered ruse can be over within minutes, leaving the victim devastated and vulnerable.
In a survey of 7,000 people across nine countries, one in four said they had experienced an AI voice cloning scam or knew someone who had. Many respondents said they were not confident they could “tell the difference between a cloned voice and the real thing.” Voice cloning scams have become so prevalent that American officials have warned of a surge in the “grandparent scam.”
In the grandparent scam, an imposter poses as a grandchild in urgent need of money. The scammer calls, sounding just like the grandchild, and pleads for help, claiming to be in a dire situation. The person receiving the call, often elderly, believes it to be genuine, and the ruse is convincing enough that many have been duped into sending money to scammers.
AI voice cloning tools can generate highly realistic voice clones, making anyone with an online presence vulnerable to an attack, and these scams are gaining traction. AI startup ElevenLabs admitted that its voice cloning tool could be misused for “malicious purposes” after users posted deepfake audio of actor Emma Watson reading Adolf Hitler’s manifesto, “Mein Kampf.”
The rise in AI voice cloning scams has alarmed experts. Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, believes we are fast approaching the point where we can no longer trust what we see on the internet, and says new technology is needed to verify that the person you are talking to is actually who you think they are.
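One low-tech defense families can adopt today is a pre-agreed code word that a caller must produce before any money moves. The sketch below generalizes that idea into a simple challenge-response check built on a shared secret. It is purely illustrative, assuming only Python’s standard hmac and secrets modules; it is not the verification technology Tal-Hochberg describes, and every name in it is hypothetical.

```python
# Illustrative only: a minimal challenge-response check using a secret
# shared in advance (e.g., agreed between family members in person).
# A voice clone can mimic speech, but it cannot answer a challenge
# derived from a secret it never had.

import hmac
import hashlib
import secrets

SHARED_SECRET = b"agreed-offline-beforehand"  # assumption: exchanged in person

def make_challenge() -> str:
    """Called party generates a fresh random challenge and reads it aloud."""
    return secrets.token_hex(8)

def respond(challenge: str, secret: bytes) -> str:
    """Caller proves knowledge of the secret without speaking the secret."""
    return hmac.new(secret, challenge.encode(), hashlib.sha256).hexdigest()[:8]

def verify(challenge: str, response: str, secret: bytes) -> bool:
    """Called party recomputes the expected response and compares."""
    expected = respond(challenge, secret)
    return hmac.compare_digest(expected, response)

if __name__ == "__main__":
    challenge = make_challenge()
    genuine = respond(challenge, SHARED_SECRET)          # real family member
    print(verify(challenge, genuine, SHARED_SECRET))     # True
    print(verify(challenge, "00000000", SHARED_SECRET))  # False: imposter's guess
```

Because the response is derived from the secret rather than spoken as the secret itself, overhearing one call gives an imposter nothing to replay: the next call uses a fresh challenge.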
AI voice cloning tools pose a significant risk to individuals and society: they can be used to carry out scams, spread disinformation, and create convincing deepfakes. US authorities are warning people to be vigilant, and as AI continues to advance, it is crucial to ensure that it is used for good and not for criminal purposes.