Generative AI has been at the forefront of innovation with several easy-to-access tools like ChatGPT, DALL-E, MidJourney AI, Gemini, and many more. Alongside these, a plethora of AI voice cloning tools has emerged, some capable of cloning a person's voice in real time. Scammers and criminals have started using this software to mimic voices and scam your loved ones.
These cloned voices are extremely realistic, often indistinguishable from the real person's voice. That level of authenticity, coupled with the emotional impact of hearing a familiar voice in distress, can push people to act quickly without verifying whether the situation is real. So, how do you identify such a scam, and what measures should you take? Let's find out.
Page Jumps:
- How AI Voice Cloning Scams Work
- How to Identify an AI Voice Clone Scam
- How to Protect Yourself From an AI Voice Clone Scam
How AI Voice Cloning Scams Work
In a voice cloning scam, a person's voice is convincingly cloned using AI models such as ElevenLabs. Scammers use these replicas or audio deepfakes, called voice clones, to imitate a friend or family member of the person they're trying to con.
Consider a hypothetical situation where a scammer tries to fool a victim into thinking they are getting a distress call from their grandchild. The scammer could imitate the grandchild's voice using an AI voice cloning model and use it to convince the grandparents that their grandchild needs financial help.
The ease of access to such free AI voice cloning tools is one reason why these scams have grown in number. The scammer would only need a small sample, usually a minute or less, of the target’s voice to generate a voice clone.
Apple has also built a similar feature, called Personal Voice, into iPhones running iOS 17 or later. If someone gets their hands on your unlocked iPhone, they could misuse the feature for unethical or criminal purposes.
How to Identify an AI Voice Clone Scam
Fortunately, it is usually possible to identify an AI voice clone scam if you stay calm and pay a little attention. If you ever get a distress call from an unknown number, here's how to decide whether it is genuine or a scam:
1. Urgency
Scammers frequently create a sense of urgency, claiming that quick action is necessary to help a loved one in need. They will try to rush you into acting before you have a chance to assess the situation properly.
2. Inconsistencies
Pay close attention to any discrepancies in the caller's account or the information they give you. Scammers often have gaps in their story because they don't have complete information about your relationship with the supposed victim.
3. Unusual Requests
Be cautious of unusual payment demands, such as cryptocurrency or wire transfers. Scammers frequently use these methods to evade identification and tracking.
4. Voice Quality
Even though AI voice cloning technology has come a long way, there are often subtle differences between a cloned voice and the original, such as odd pacing, flat intonation, or unnatural pauses. Trust your gut if something about the voice sounds off.
5. Ask Other Relatives or Call Directly
Get in touch with other relatives, or call the supposed victim directly on a number you already know is theirs, to make sure they're okay.
How to Protect Yourself From an AI Voice Clone Scam
If you have identified that the distress call you received is indeed a voice clone scam, there's nothing to worry about. It is the scammer who needs to be worried now. Here are some measures you can take.
1. Stay Calm
First of all, stay calm and don't panic. Once you know the call is fake, there is nothing to panic about.
2. Hang up
If you receive a voice clone phone call, hang up immediately and report the number to the cyber police or the relevant authorities in your area. If possible, record the call as evidence. If you receive a voicemail, simply save it and report it to the authorities.
3. AI Speech Classifier
If you really want to confirm whether a call or voice message you have received is fake, you can run it through ElevenLabs' AI Speech Classifier. It works much like AI text detectors and can identify AI-generated voices to a certain extent.
While it is not 100% accurate, it can help you judge whether the call or voice message is actually a scam. You can upload voice messages directly; calls, however, have to be recorded first and then uploaded to the classifier.
Again, it is highly recommended to call the supposed victim directly and confirm whether they are actually in distress. To avoid such confusion in the future, you can agree on a secret code word with your loved ones that they share only when they genuinely need help.
Wrapping Up
AI has gotten extremely good at producing realistic voice clones of people. This has opened the door to a new category of scams that aim to extract money and personal information by cloning a loved one's voice.
Beware of such calls from unknown numbers, and if you suspect a scam, cross-question the caller to test whether they are genuine. If you find that you are being scammed, immediately cut all contact with that number and report the incident to the authorities.