According to the FTC, imposter scams were the second most commonly reported type of fraud in 2022, resulting in over $11 million in reported losses.*
Now scammers are using artificial intelligence (AI) in imposter phone call scams. With advances in AI, a scammer can clone the voice of your loved one from only a short audio clip, which can be retrieved as easily as from audio posted online via social media. Once a scammer clones the voice, they will sound just like your loved one on the other end of the call.
In an imposter scam, it is already hard to identify the perpetrator, and difficult for police to trace calls and funds from scammers. Adding AI voice replication makes this even more challenging.**
Although anyone could be a victim of this scam, the primary targets of phone imposter scams are the elderly. AI voice-generating software analyzes what makes a person's voice unique and recreates it almost exactly. This makes it nearly impossible for victims to believe they are not actually speaking with their family member, causing panic and a rush to send money right away.
Safeguard yourself and any elderly family members against AI voice impersonations:
- Don’t trust the voice. Call the person who supposedly contacted you, at a number you know is theirs, to verify the story.
- Don’t give in to the urgency. If you cannot verify the story by contacting the person or their family, treat it as a scam.
- Don’t pay by wire transfer, cryptocurrency, or gift card. Scammers favor these payment methods because they know the money is hard to reverse.
- Don’t provide gift card numbers or PINs over the phone. This may seem like the easiest way to send payment, but once the scammer has the card numbers, the money is theirs.
If you think you may be the victim of a phone impersonation scam, report it to the FTC at ReportFraud.ftc.gov.
**Ashley Belanger, “Thousands Scammed by AI Voices Mimicking Loved Ones in Emergencies,” Ars Technica.