CHICAGO — Phone scammers are convincing people to send them money by simulating the voices of victims’ loved ones with artificial intelligence (AI), Illinois Attorney General Kwame Raoul said today.
False claims of a loved one being in trouble are a tried-and-true tactic for con artists. Now, Raoul says, criminals may be generating statements in a loved one’s voice by running audio from social media videos through AI.
“Getting a call from what sounds like a family member in distress is upsetting, but you may not be able to trust the identity of the voice on the line,” Raoul said in a press release, calling the development “unnerving.”
Scammers are finding social media videos that feature a potential target’s loved one. Then, they run the audio through AI to create a realistic imitation of the loved one.
With the cloned voice in hand, scammers can call the victim and make it sound as though someone they care about is seeking help, according to Raoul.
“These criminals are very good and convincing at what they do. Their goal is to catch you off guard, scare you into sending payment and disappear before you realize what happened,” Raoul continued. “Take a deep breath, slow down and take steps to confirm the identity of the caller, especially if they are pushing you to send a payment or disclose personal information.”
If you’re unsure about a request for help on the phone, hang up and call the person back on a number you know is theirs, Raoul suggested.
“Families can also choose a codeword or phrase that they can use to identify each other,” said Raoul. He cautioned against sharing the codeword, particularly on social media.
The press release also noted that scam artists frequently ask victims to provide money in unusual ways, such as gift cards or payment apps.
Raoul’s office said Illinois authorities have not yet received reports of AI voice-cloning scams operating in the state, but federal authorities warned about such scams last year.