Scammers are getting more sophisticated in their tactics, as evidenced by the recent trend of AI voice cloning scams. By using artificial intelligence to imitate the voices of loved ones, scammers are able to convince elderly and unsuspecting victims to hand over thousands of dollars in bail money or legal fees.
Ruth Card, a 73-year-old Canadian, was almost taken in when she thought she was talking to her grandson Brandon on the phone. “It was definitely this feeling of… fear,” Card told The Washington Post. “That we’ve got to help him right now.” Fortunately, a vigilant bank manager intervened before she and her husband could withdraw any money.
Benjamin Perkin’s parents weren’t so lucky. They received a phone call from a lawyer claiming their son had killed a US diplomat and needed money for legal fees. The voice on the other end sounded just like their son, so they sent CAD $21,000 in Bitcoin. “The money’s gone,” Perkin told the paper. “There’s no insurance. There’s no getting it back. It’s gone.”
Voice cloning scams have been around for a few years, but the rise of powerful, easy-to-use AI tools has made it far easier for scammers to impersonate voices convincingly. Hany Farid, a professor of forensics at UC Berkeley, told WaPo: “Two years ago, even a year ago, you needed a lot of audio to clone a person’s voice. Now… if you have a Facebook page… or if you’ve recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice.”
The potential for abuse is immense. With services such as ElevenLabs offering voice cloning for as little as $5 per month, the technology is only going to become more accessible, and it’s up to all of us to stay vigilant against these scams.