Voice Of Fox: Protecting Yourself from AI Voice Cloning Scams
Scammers are getting smarter when it comes to cybercrime, and they’ve found a new tool to aid their malicious activities: artificial intelligence (AI). They are now using AI technology to clone people’s voices and make convincing calls to their loved ones, tricking them into giving away money or sensitive information. It’s a disturbing trend that has been rapidly increasing in the past year.
According to Mike Scheumack, the chief innovation officer at IdentityIQ, an identity theft protection and credit score monitoring firm, AI technology has been advancing for quite some time. Only recently, however, has it infiltrated the cybercriminal space, leading to a surge in AI voice cloning scams. Scammers can create a realistic voice clone from as little as a 3-second audio sample of the target’s voice.
To demonstrate the sophistication of AI voice cloning programs, IdentityIQ conducted an experiment. They used an audio sample from an interview on the “Fox News Rundown” podcast to create an AI voice clone. The clone then made a panicked phone call to a family member, requesting a cash app transfer following a fictitious car accident. The clone said, “Mom, I need to talk to you. I, I was going to interview someone today for a story I’m working on and I was involved in a car accident. I’m okay, but I need your help right now…”
These voice clone calls are typically short and designed to create a sense of urgency. The scammers want to trigger a fight-or-flight response in their victims, making them more susceptible to demands. If you receive such a call, Scheumack advises hanging up immediately and then calling your loved one directly to verify whether it was really them.
AI voice scammers are not relying on audio samples alone to make their calls believable. They also use AI programs to gather information about individuals from the internet, including social media posts, videos, and audio clips. This data helps them craft more convincing calls that appear genuine to their unsuspecting victims.
These scams are often not the work of lone individuals but of sophisticated organizations. One team researches victims online, another clones the voice, and someone may even go to the victim’s house to collect money if the scam succeeds. It’s a well-coordinated operation that preys on the vulnerability of individuals.
To protect yourself from falling victim to an AI voice cloning scam, Scheumack recommends being cautious about what you post online. Avoid sharing personal information that could be used to make these clone calls more convincing. Additionally, if you receive a call from an unknown number claiming to be someone you know and citing an urgent situation, take a moment to think before acting. It might be a red flag.
Implementing a secondary verification method is also a good precaution. For example, you could establish a code word or phrase known only to your family members. Asking for it will help you confirm the identity of the caller in a supposed emergency.
In this age of advanced technology, it’s crucial to stay vigilant and be aware of the tactics used by scammers. By taking these precautions, you can safeguard yourself and your loved ones from falling victim to AI voice cloning scams.