It’s late at night. Your parents receive a voice message from you filled with panic, stating that you’re in trouble, your wallet has been stolen, and you’re stuck in the middle of nowhere. Money needs to be wired immediately so you can get home.

The voice in the message sounds so real and is unmistakably that of their child—only it isn’t. It’s an AI voice clone, and they have become the target of a scam that’s as cutting-edge as it is chillingly effective.

Screenshot of the ElevenLabs AI speech classifier showing a human result

How Does an AI Voice Cloning Scam Work?

Voice cloning creates incredibly realistic digital replicas of a person’s voice. The result, commonly called an audio deepfake, is made with AI voice cloning tools like ElevenLabs (check out the AI voice clone example below). Combined with generative AI and voice synthesis, the cloned voice can replicate emotions, nuances, intonation, and even fear.

And for reference, here’s how Christian Cawley really sounds, talking on The Really Useful Podcast.

Screenshot of ElevenLabs AI speech classifier showing AI result

To create an AI voice clone, only a small voice sample is required, often a minute of audio or less. Voice samples are frequently taken from public social media posts, making video bloggers and influencers particularly vulnerable, as the audio in their posts is high quality and readily available.

Once scammers clone the voice sample, they use text-to-speech or even real-time speech-to-speech to create fake calls and voice messages.
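To see how little effort this step takes, here is a minimal Python sketch of the idea. It assumes ElevenLabs’ public v1 text-to-speech REST endpoint; the API key, voice ID, and message text are placeholders, not working values.

```python
import requests

# Placeholders: an attacker would substitute the ID of a cloned voice
# and a valid API key for the service.
API_KEY = "your-api-key"
VOICE_ID = "cloned-voice-id"

# ElevenLabs' v1 text-to-speech endpoint (assumed here). It returns raw
# audio of the posted text, spoken in the selected voice.
url = f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}"
response = requests.post(
    url,
    headers={"xi-api-key": API_KEY},
    json={"text": "Mum, I'm in trouble. Please send money so I can get home."},
)
response.raise_for_status()

# Save the generated speech: one request is all it takes to turn a
# cloned voice into a convincing fake voice message.
with open("fake_voice_message.mp3", "wb") as f:
    f.write(response.content)
```

The point is not the specific API but how short the path is: once a voice sample has been cloned, producing a panicked message in that voice is a single request with whatever script the scammer types.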

Having accessed the victim’s social media and pieced together their personal details, the scammer creates a believable scenario. This can be a hospitalization, arrest, theft, kidnapping—anything that creates fear. The cloned voice is then used to manipulate a family member into believing a loved one is in serious trouble.

The realism of the voice and emotions, combined with the shock and urgency of the request, can override skepticism, leading to hasty decisions to transfer money or provide sensitive information.

Why AI Voice Cloning Scams Are So Effective

Like most scams targeting family and friends, the AI voice cloning scam is effective because it exploits personal bonds. The panic of receiving a distress call from a loved one can quickly cloud judgment, resulting in rash decisions. It’s the same tactic used in quid pro quo scams, but the effect is compounded by hearing the voice of someone close to you.

Family members of those with a high social media presence, or of those currently traveling, are particularly at risk because their situations provide a plausible context for the scammer’s fabricated narrative. Scammers will often harvest personal details from their targets’ public-facing social media accounts ahead of time, making the stories even more convincing.

Key Signs of an AI Voice Clone Scam

Despite their sophistication, AI voice clone scams can reveal themselves through specific red flags:

- Calls or messages from unknown, withheld, or unfamiliar numbers.
- Extreme urgency and pressure to act before you have time to think or verify.
- Requests for money through hard-to-trace channels such as wire transfers, gift cards, or cryptocurrency.
- Reluctance or inability to answer simple personal questions.
- Audio oddities such as flat delivery, unnatural pauses, or missing background noise.

Considering these factors when presented with a terrible scenario is difficult, but it could stop you from being scammed.

What to Do if You Suspect an AI Voice Clone Scam

If you receive a call or voice message from a loved one in distress, stay calm. Scammers rely on you panicking and becoming emotional, as this increases your vulnerability to their tactics.

If You Receive a Phone Call

If you suspect, even for a second, that a call may be a voice cloning scam, immediately hang up the call. Do not share any information, and do not engage with scammers. Instead, call your loved one back on a known number to confirm the authenticity of the distress call.

Conversing further with the scammers can risk having your voice recorded and cloned as well.

If You Receive a Voice Message

If the scammers have left a voice message, save the audio to your phone or laptop immediately. The sample can then be run through an AI speech classifier tool to determine whether the voice is human. AI speech classifiers work much like AI writing detectors: input the audio file, and the classifier will mark it as human or AI-generated.

A great tool for performing this check is the ElevenLabs AI Speech Classifier. ElevenLabs’ technology powers many AI voice cloning platforms, so it is well placed to recognize voices cloned with its own tools. Here’s how to use it to check for an AI-cloned voice:

1. Open the ElevenLabs AI Speech Classifier page.
2. Upload the saved voice message.
3. Run the check and review the probability that the audio is AI-generated.
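If you’d rather script the check than use the web page, the workflow looks roughly like the Python sketch below. The endpoint URL and response format are hypothetical stand-ins (the classifier is offered through the ElevenLabs website, and this article doesn’t document a public API for it), so read it as an illustration of the upload-and-score flow rather than a working client.

```python
import requests

# Hypothetical endpoint: a stand-in for however the classifier
# actually accepts uploads.
CLASSIFIER_URL = "https://api.example.com/v1/ai-speech-classifier"

def classify_voice_message(path: str) -> float:
    """Upload a saved voice message and return the probability it is AI-generated."""
    with open(path, "rb") as audio:
        response = requests.post(CLASSIFIER_URL, files={"file": audio})
    response.raise_for_status()
    # Assumed response shape: {"probability_ai": 0.98}
    return response.json()["probability_ai"]

probability = classify_voice_message("suspicious_message.mp3")
if probability > 0.5:
    print(f"Likely AI-generated ({probability:.0%}). Call your loved one on a known number.")
else:
    print(f"Probably human ({probability:.0%}), but verify directly anyway.")
```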

In the example below, a distressing message was recorded by a human and uploaded to the classifier. You can see it has been marked as having a 2% probability of being manipulated.

In the next example, the same distressing message was produced with an AI voice cloning tool. To the human ear, the voices are indistinguishable. The tool, however, has marked it as having a 98% probability of being AI-generated.

While you can’t rely on this classifier 100 percent, it can help confirm your suspicion that a cloned voice is in use. You should also contact your loved one on a known number to confirm.

Prevention Is the Best Defense Against AI Voice Cloning Scams

Sometimes, the best defense against a high-tech scam is a low-tech solution.

Establish an offline password with your family and friends, known only to you and your loved ones. In the event of a frantic call, asking for this password is a surefire way to confirm the caller’s identity.

Finally, keeping your private information off social media is a huge step toward ensuring scammers can’t glean enough information to construct a narrative. Remember: if you don’t want it known to everyone, it’s best not to share it on social media.