Virtual kidnapping: What it is and how to avoid falling victim

It's every parent's worst nightmare. You receive a call from an unknown number, and on the other end of the line you hear your child's voice calling for help. The supposed "kidnapper" then demands a ransom, or you will never see your child again.

“Unfortunately, this is no longer a far-fetched scenario from a Hollywood movie,” warns Phil Muncaster from the team at global digital security company ESET. “It's an example of how far fraudsters can now go to extort money from their victims using new technologies. It also shows the quality of AI-powered voice cloning, which is now convincing enough to fool even close family members. Fortunately, the more people know about these scams and what to look out for, the harder we make life for the scammers who use them to make money.”

How virtual kidnapping works

A typical virtual kidnapping scam unfolds in a few basic stages. In general, they are as follows:

  • Fraudsters research candidate victims whom they can call and pressure into paying money. This stage could also be optimized using artificial intelligence tools (more on that later).
  • Next, the scammers identify a potential “kidnapping” victim – most likely the child of the person identified in stage 1. They can do this simply by searching social media or other publicly accessible information.
  • They then craft a fictional scenario, making sure it is as convincing and frightening as possible. The more scared the victim is, the less capable they will be of making rational decisions. Like any good social engineering attempt, the scammers want to rush the victim into a decision.
  • The scammers may then research open sources to figure out the best time to call. They may scan social media or other sources to determine this. The idea is to contact you while your loved one is away, ideally on vacation – as was the case with Jennifer DeStefano's daughter.
  • Finally, the scammers create the audio deepfake and make the call. Using readily available software, they generate an audio file of the victim's "voice" and use it to try to convince you that they have kidnapped a relative. They may also use other information gathered from social media to make the scam sound more convincing, for example by providing details about the "kidnapped" person that a stranger would be unlikely to know.

If you fall for the scam, they will likely ask you to pay the ransom in an untraceable way, such as with cryptocurrency.

Variations of this method

There are variations on this theme. Most worrying is the potential for artificial intelligence tools like ChatGPT to make it easier for fraudsters to find ideal victims. How is this possible? Advertisers and marketers have used "behavior modeling" techniques for years to send the right messages to the right people at the right time.

Generative artificial intelligence (GenAI) could help fraudsters do the same by identifying the profiles of the people most likely to pay up if they fall victim to a virtual kidnapping scam. They could also search for people within a specific geographic area, with public social media profiles and a particular socioeconomic background.

A second option would be to use a SIM swapping attack (in which a cybercriminal takes over a user's phone number) to hijack the phone number of the alleged "kidnapped" person before the fraud. This would make the "kidnapping" phone call seem even more plausible. DeStefano was eventually able to confirm that her daughter was safe and sound and hang up on the extortionists, but that would have been much more difficult to do if she had been unable to reach her relative.

What does the future hold for voice cloning?

Unfortunately, voice cloning technology is already disturbingly convincing, as evidenced by a recent experiment carried out by ESET Cybersecurity Advisor Jake Moore. And it is becoming more and more accessible to fraudsters. A report published in May warned of legitimate text-to-speech tools that could be misused, as well as growing cybercriminal interest in voice cloning as a service (VCaaS). If the latter takes off, it could put such attacks within reach of far more criminals, especially if used in conjunction with GenAI tools.

In fact, beyond disinformation, deepfake technology is also being used for business email compromise (as tested by ESET's Jake Moore) and for sexual extortion (sextortion). We are only at the beginning of a long journey.

How to stay safe

The good news is that being informed can go a long way toward reducing the threat of deepfakes in general, and virtual kidnappings in particular. According to ESET's Muncaster, there are things you can do today to minimize your chances of being singled out as a potential victim, and of falling for a phone scam if one does happen.

Consider these 7 tips:

  1. Do not share personal information on social media. This is absolutely critical. Avoid posting details such as addresses and phone numbers. If possible, don't share photos or video recordings of your family, and certainly not details of your loved ones' holiday plans.
  2. Keep your social media profiles private, in order to minimize the chances of threat actors finding you online.
  3. Be on the lookout for phishing emails that could be designed to trick you into handing over sensitive personal information or passwords to social media accounts.
  4. Encourage your children and close relatives to install geolocation apps.
  5. If you get a call, keep the "kidnappers" on the phone. At the same time, try to call the alleged victim on another line, or have someone nearby do it.
  6. Keep calm, don't share any personal information, and if possible get the scammers to answer a question only the abductee would know; ask to speak with them.
  7. Notify the police as soon as possible.
