Phone scammers are using AI to mimic voices of victims’ friends and relatives: attorney general

CHICAGO — Sneaky phone scammers are convincing people to send them money by simulating the voices of victims’ loved ones with artificial intelligence (AI), Illinois Attorney General Kwame Raoul said today.

False claims of a loved one being in trouble are a tried-and-true tactic for con artists. Now, Raoul says, criminals may be generating statements in a loved one’s voice by running audio from social media videos through AI.

“Getting a call from what sounds like a family member in distress is upsetting, but you may not be able to trust the identity of the voice on the line,” Raoul said in a press release, calling the development “unnerving.”

Scammers find social media videos featuring a potential target’s loved one, then run the audio through AI to create a realistic imitation of that person’s voice.

With the cloned voice, they can call the victim and make it sound as though someone they care about is asking for help, according to Raoul.

“These criminals are very good and convincing at what they do. Their goal is to catch you off guard, scare you into sending payment and disappear before you realize what happened,” Raoul continued. “Take a deep breath, slow down and take steps to confirm the identity of the caller, especially if they are pushing you to send a payment or disclose personal information.”

If you’re unsure about a request for help on the phone, hang up and call the person back on a number you know is theirs, Raoul suggested.

“Families can also choose a codeword or phrase that they can use to identify each other,” said Raoul. He cautioned against sharing the codeword, particularly on social media. 

The press release also noted that scam artists frequently ask victims to send money in “unusual” ways, such as gift cards or payment apps.

Illinois authorities have not yet received reports of AI voice-cloning scams operating in the state, Raoul’s office said, but federal authorities warned about such scams last year.
