What Are Romance Scams And How To Protect Yourself From Them?


Advancements in AI benefit not only regular users but also bad actors and scammers. Romance scams have been around for quite some time: someone gains the victim’s trust before asking for money, often framed as a short loan. Previously, scammers relied on text messages or voice calls to establish a rapport with the victim. However, a new report discusses how such people have gained access to highly sophisticated face-swapping apps.

What Are Romance Scams?

In your time on social media and instant messaging platforms, you might have encountered several fake accounts or profiles. More often than not, the account owners use such profiles to initiate conversations with strangers.

Over time, the bad actors behind these profiles gain the victim’s trust, and once they sense that the victim is romantically interested, they try to extort money, either by pretending to be in a financial emergency or by blackmailing them.

Scammers Are Using Live Deepfakes To Trick Their Victims

In the past, scammers mostly relied on texts or voice calls to lure in their prey. However, according to a report by Wired, romance scammers known as Yahoo Boys are using face-swapping apps to fool their victims.

In the first method, the scammers use a two-smartphone setup: one phone’s rear camera points at the other phone’s screen, which runs a face-swapping app with its front camera trained on the scammer. The report also mentions the use of camera stands and ring lights to make the videos look more realistic.

In the second method, seen in a video obtained by the publication, scammers use a webcam to capture their face, which AI-powered software swaps for someone else’s in real time. The video shows that scammers can see their real face alongside the altered deepfake, while the victim only sees the modified version.

What’s more surprising is that the publication didn’t have to run a sting operation or send in undercover investigators to obtain the video. Instead, scammers openly boast about their deepfake and image-manipulation skills in Telegram groups, complete with video footage of their setups.

Ways To Protect Yourself


The first thing anyone should do is adopt a simple rule of thumb: never pick up audio or video calls from unknown numbers, whether on WhatsApp or any other platform. Unless it’s a pre-scheduled call with someone you know through a professional or personal relationship, accepting random voice or video calls, especially from international numbers, could be dangerous.

Since romance scammers are most active on dating apps, people using such apps should be careful about whom they interact with, especially if someone hints at a financial crisis or suggests they might need money from you.

With easy access to face-swapping and image-modification technology, scammers aim to build your trust before making their move and extracting money from you. If you notice oddly cropped or distorted images on a profile, it could be a sign that the person is impersonating someone else.

Even when scammers use pitch-shifting tools, you can often detect a faint robotic quality in their voices. Yes, it is difficult to avoid falling prey to such scams, where bad actors can pretend to be someone you know on voice or video calls, but with enough vigilance and presence of mind, you can protect yourself.


Shikhar Mehrotra
A tech enthusiast at heart, Shikhar Mehrotra has been writing news since college, where he earned an undergraduate degree in Journalism and Mass Communication. Over the last four years, he has worked with several national and international publications, including Republic World and ScreenRant, writing news, how-to explainers, smartphone comparisons, reviews, and list-type articles. When he is not working, Shikhar likes to take pictures, make videos for his YouTube channel, and watch the American sitcom Friends.
