GECU Voices brings you guidance and insight from experts within the Credit Union. Today’s blog post was penned by Austin Vaive, Information Security Manager.
With the recent surge in popularity of artificial intelligence (AI), cyber criminals have begun leveraging voice impersonation technology to aid in their social engineering scams. Learn more about their methods and how to protect yourself or a loved one from this growing security threat.
How do AI voice scams work?
An imposter can use audio recordings from social media videos or other online sources to train an AI model and generate a convincing voice clone. These impersonations can be used to trick victims into taking action or providing information under the assumption that the caller is a trusted individual.
As an example, a scammer may target an elderly individual by searching social media for the profile of one of the victim's children or grandchildren. If the relative's profile contains videos of them speaking, the scammer can use an AI tool to craft an impersonation of their voice from this content.
This impersonation could then be used to call the victim and convince them their relative needs assistance. The bad actor will ask the victim to send funds quickly to resolve the issue. Common stories of this type of scam involve the “relative” needing money to make bail after getting arrested, or having a large debt coming due that requires immediate payment.
The script used by the scammer could go something like this:
“Hi Grandma/Grandpa, it’s your grandson/granddaughter, I really need your help. I was arrested last night, and the cops are saying I need $10,000 to post bail and get out. Please help me, I don’t want to call my parents because they can’t find out about this. Please don’t tell anyone, I’m so embarrassed this happened to me. The cops have my phone, so don’t try to call me back and verify it’s me. Please send a cashier’s check to the address I’m sending you ASAP. Thank you so much.”
This sort of scam can be effective because it preys on the victim's desire to help a family member in a sensitive situation and encourages them to keep their actions private. The scammer intentionally tells the victim not to tell anyone about the situation and not to attempt to contact the real grandson or granddaughter, which would unravel the scam.
Celebrities are also a common target of AI voice impersonation. Because a large volume of voice recordings is available for most famous individuals, it is easy to craft a convincing AI voice clone, which can then be used in spoofed scam phone calls or phony social media posts encouraging you to click on suspicious links.
How can I identify an AI voice scam?
While AI voice scams can appear very convincing, there are a few ways to identify them.
- Urgency: Is the caller making an urgent request or demanding immediate payment or action?
- Purpose: Does the reason for the call make sense? Is the call expected? Would you expect the person to make such a request?
- Validation: Does the caller ID match the phone number you have on file? Spoofed phone numbers are very common, so it is best to verify the call's legitimacy by reaching out to the individual via a known legitimate contact method. You can also attempt to validate the caller's identity by asking questions only that person could answer correctly.
- Audio: Is the sound consistent? Is there varying background noise, or are there unusual speaking patterns? While AI spoofing can be very convincing, it is still not perfect, so any discrepancies in audio quality or speaking patterns should be treated as a red flag.
Keeping these red flags in mind can help you identify a voice impersonation attempt. If you are ever unsure whether a call is legitimate, it's always best to end the call and reconnect with the individual via a known legitimate contact method. If you ever suspect a scam has compromised your accounts, report it to the Credit Union immediately.