How Scammers Use Voice Cloning

Your voice isn’t private anymore. It’s the perfect impersonation weapon, and you’re within range of a microphone nearly 24/7.


Guest post from Jay Jones

AI voice cloning isn’t a fantasy anymore. And it’s not hard to do. Scammers can feed 3–30 seconds of audio of you or a loved one into free tools that render a near-perfect digital twin of how you talk, laugh, pause, and breathe. Then they type any script they want, and your cloned voice says it.

Imagine getting a call and hearing your child say, “Dad, they kidnapped me and need $5,000 to let me go.” Your protective instincts kick in, you believe it, and your bank account is empty in minutes.

Ten minutes later, your child calls to see how you’re doing. That’s when you realize you were scammed.

According to a study at the University of California, Berkeley, “participants perceived the identity of an AI-generated voice to be the same as its real counterpart approximately 80% of the time, and correctly identified a voice as AI generated only about 60% of the time.” It's perfect social engineering.

How Scammers Steal and Weaponize Voices

They grab your audio from everywhere.

They don’t need to be expert hackers. It’s easy to find your voice online: TikTok rants, LinkedIn Lives, Instagram Stories, podcasts. Even robocalls that prompt you to say a specific word or phrase to reach an agent are capturing a clean recording of your voice.

AI learns your vocal biometrics.

There are paid and open-source tools that analyze pitch, cadence, accent, emotion. The tools “train” on a sample of your voice in seconds, then generate new speech from any text. They can do it in real time for live calls or prerecord content for longer messages.
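
To see just how low the barrier is, here’s a minimal sketch of zero-shot cloning using the open-source Coqui TTS library and its XTTS v2 model. The model name, script text, and file paths are illustrative assumptions, not any specific scammer’s toolkit:

```python
# Minimal sketch: zero-shot voice cloning with the open-source Coqui TTS
# library (XTTS v2). Model name, text, and file paths are illustrative;
# install with `pip install TTS`.
from TTS.api import TTS

# Load a multilingual voice-cloning model (downloads on first run).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A few seconds of the target's voice is enough as a reference sample.
tts.tts_to_file(
    text="Hi, it's me. I'm in trouble and I need your help right now.",
    speaker_wav="target_voice_sample.wav",  # hypothetical short clip
    language="en",
    file_path="cloned_message.wav",
)
```

That’s the entire pipeline: one short reference clip in, a convincing fake message out.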

Scammers hit where it hurts.

Scam scripts focus on creating fear and a sense of urgency in targets so they’ll act emotionally instead of taking time to evaluate what’s happening.

  • Family panic: The cloned voice of your child or spouse begs for emergency cash. They’ve been arrested or kidnapped. And they need you to transfer money or pay at a crypto ATM right away.

  • CEO impersonation: You receive a call from your boss instructing you to make a $35 million wire transfer. It may sound over-the-top, but it happened in 2021: in a case often cited as the $35M CEO fraud benchmark and covered by Forbes, scammers cloned a company director’s voice to authorize massive transfers from a Japanese firm’s Hong Kong branch.

  • Extortion: Scammers clone your voice to create fake audio of you confessing to installing malware, sharing compromising information, or committing a crime. They then contact you and play the fabricated “admission,” demanding payment to not release it to law enforcement, your family, or your employer. It’s often called “self-extortion” because the evidence is manufactured from your voice, amplifying panic since it sounds undeniably like you.

  • Job scam: Fake recruiters use a clone built from your voicemail greeting to confirm interviews, then phish deeper.

Protect Yourself from Voice Scams

🔒 Create a family code word: Protect each other by agreeing on code phrases that only insiders know, and use them to confirm identity. Change them often. For instance, if one person says “peanut butter,” the other knows to answer with “jelly.”

📞 Callback rule: If something seems suspicious — especially if it’s an automated system asking you to respond by voice — hang up and call back on a verified number. You’re not being rude; you’re being cautious.

🔉 Privatize social audio: Set access to your friends and known followers only; delete old recordings.

🗣️ Voice biometrics opt-out: Avoid using your voice as a password where possible; use passphrases banks can’t predict.

Common Ways Scammers Capture Voices

People often think voice cloning happens only to “high-profile” targets like celebrities and executives. Wrong. Here’s how scammers can use everyday exposure to build their dataset:

  • Social media: One 15-second clip from a video you post to Instagram, TikTok, or Facebook is enough to build a usable voice sample.

  • Podcasts or webinars: Your guest spot or Q&A audio lives forever online.

  • Answering robocalls: Prompts like “Say yes to continue” or interactive voice response systems capture clear recordings.

  • Video calls: Zoom, Webex, and Teams meetings or clips that are posted, shared, or hacked.

  • Voice assistants: Alexa, Siri, and Google devices store and train on your speech patterns, whether it’s the device in your home or the phone in your hand.


🎙️ See How It Works

Go to a site like ElevenLabs or Voiceslab and listen to the sample AI voices. Or, if you want, record your own voice to see what a cloned version sounds like.


Other Apps that Use Your Voice

Some of these applications and tools have become part of everyday activities and can also serve as sources for voice thieves.

  • Fitness apps: Peloton, Freeletics, and similar programs record guided workouts, which are often cloud-synced.

  • Language apps: Duolingo and Babbel log pronunciation practice to help you improve your skills (and to train their AI).

  • Vehicle infotainment: Syncing phone calls or voice commands to the dashboard routes your speech through in-car systems.

  • Smart appliances and TVs: Samsung Family Hub and LG webOS use ambient listening.

  • Gaming voice chat: Discord, PSN, and Xbox party audio can be clipped, shared, or breached.

  • Telehealth: Virtual doctor visits, and now some in-office systems, record speech for transcription and store it indefinitely.

  • Job interview recorders: Platforms like HireVue use recordings and analyze voice biometrics.


Jay Jones is a copywriter, cybercrime investigator, and fraud prevention expert, also known online as “The Profiler.” He’s removed 51,000+ fake jobs and 7,000+ fraudulent profiles from LinkedIn, protecting thousands worldwide. He’s internationally known and a pioneer in platform accountability, with his work featured on NBC and Yahoo News.
