AI Got Your Tongue?

Introduction

Did you know that cybercriminals can clone real people’s voices…and then impersonate them, so you think you’re listening to somebody else?

Voice cloning is absolutely real…and uses artificial intelligence to recreate a target’s voice by training on audio recordings of the real person.

In other words, you may think you’re talking to your boss or local politician or even your best friend…but it could really be a bunch of special software deepfaking their vocal likeness.

This could make any phishing scam seem a whole lot more plausible!

How Does AI Clone Voices?

Hearing a familiar voice, especially from an authority figure or loved one, lowers your defenses and makes you more likely to believe the caller’s claims! Once a threat actor chooses a specific victim, they can pick somebody close to that victim and start hunting for clips of that person’s voice.

Where could someone find your voice online? Social media posts, voicemails, leaked recordings and even public speeches are all commonly posted online. Using these audio files as a basis, threat actors can then turn to specialized software to create the auditory deepfake. By analyzing unique characteristics like pitch, cadence and pronunciation, the AI learns to replicate these patterns.
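To make that analysis step concrete, here is a minimal sketch in Python using the open-source librosa library. The file name voice_sample.wav is a hypothetical stand-in for any harvested clip, and real cloning systems learn far richer speaker representations than these two features; treat it purely as an illustration of what “learning someone’s pitch and timbre” means.

```python
# A minimal sketch of the "analysis" step: pulling pitch and timbre
# features out of a harvested voice clip with the open-source librosa
# library. Real cloning models learn far richer speaker representations;
# this is purely illustrative. Requires: pip install librosa
import librosa
import numpy as np

# "voice_sample.wav" is a hypothetical clip scraped from, say, social media
y, sr = librosa.load("voice_sample.wav", sr=None)

# Track the fundamental frequency (pitch) over time with the pYIN algorithm
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)
print(f"Median pitch: {np.nanmedian(f0):.1f} Hz")

# MFCCs summarize the "timbre" of the voice; averaged over time they act
# as a crude fingerprint of how this particular speaker sounds
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
print("Timbre fingerprint:", np.round(mfcc.mean(axis=1), 2))
```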

Once trained, the AI model can synthesize new speech based on the learned patterns. That is to say, criminals can make this stolen “voice” say things that the original audio files never contained. The criminal provides text for the clone to say, and the AI generates audio that sounds remarkably similar to the target voice.
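To illustrate just how low the barrier has become, here is a hedged sketch of that synthesis step using Coqui’s open-source TTS library and its publicly documented XTTS v2 model, which can mimic a voice given only a short reference recording. The file names are hypothetical placeholders; this is shown only so you understand how accessible the technique is.

```python
# A sketch of the "synthesis" step with Coqui's open-source TTS library
# (pip install TTS). Its XTTS v2 model can mimic a voice given only a
# short reference recording. File names are hypothetical placeholders.
from TTS.api import TTS

# Download and load a publicly available voice-cloning model
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Generate a sentence the real speaker never said, in the cloned voice
tts.tts_to_file(
    text="This sentence was never spoken by the real person.",
    speaker_wav="voice_sample.wav",  # the harvested reference clip
    language="en",
    file_path="cloned_output.wav",
)
```

A few seconds of clean audio is often enough for a passable clone, which is exactly why the harvesting step above matters.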

As you can imagine, this makes it much easier for cybercriminals to bypass your suspicions and manipulate you.

Why Is Voice Cloning So Effective?

Threat actors can even script the cloned voice to sound distressed, panicked, or authoritative, creating a sense of urgency that pushes victims to act quickly without thinking critically. Talk about something out of a dystopian novel!

Some common voice cloning schemes include…

  • Impersonating CEOs or managers to trick employees into transferring funds or revealing sensitive information.
  • Fake friend or family emergencies that scam victims into sending money to supposedly help a loved one in trouble.
  • Virtual kidnapping scams that use a cloned voice to pretend a loved one is in danger, even mimicking their pleas for help.
  • Romance scams that build rapport and trust with a victim over time using a cloned voice, making it less obvious that the scammer is a catfish.

By cloning a voice that is personally familiar to the victim, the threat actor makes the scam more targeted and therefore more believable. In light of this technology, it’s important to be wary of unsolicited calls or messages, even if they sound familiar.

Verify requests and information directly with the source, never just over the same phone or video call. Remember, even as AI voice cloning becomes more sophisticated, vigilance and healthy skepticism are your best defenses!

Conclusion

How can you stay safe from AI voice cloning scams?

  • Be wary of unsolicited calls or messages, even if they sound familiar.
  • Verify information directly with the source, not through the caller.
  • Never share personal information or send money based on phone calls or texts.
  • Be aware of the potential for AI voice cloning and stay informed about new scam tactics.

Meanwhile, the best way to stop your voice from being harvested for AI cloning is to avoid posting original audio online! When you do post audio, try to change your profile settings to limit who can view, interact with and download your posts. What threat actors can’t get can’t come back to bite you!