Vonny Gamot, Head of EMEA at McAfee, highlights a troubling trend where scammers are now using generative AI to sound like a loved one in distress, and details how we can all stay protected.

Everyone’s voice is unique. It’s the spoken equivalent of a biometric fingerprint, which is why hearing someone speak is such a widely accepted way of establishing trust. Last year, a particularly vicious type of scam called ‘Hi mum/Hi dad’ circulated on WhatsApp. Fraudsters targeted parents with messages purporting to be from a son or daughter in a state of emergency, urging them to send money. And people obliged. According to figures from Action Fraud, between February 3rd and June 21st, 2022, victims lost a total of £1.5 million.

Sadly, AI has taken this trick to a new level

Cases have been reported in various countries over the last couple of months. Perhaps you saw the headlines about a faked kidnapping attempt in the U.S., where AI was used to clone a child’s voice? It’s a particularly egregious example. Most of the cases we’ve been hearing about at McAfee involve a cybercriminal cloning a voice (likely lifted from social media), then calling or leaving a voicemail for a loved one to say that they’ve been in a car accident or are in some kind of trouble and need urgent help… and money.

Concerned by the efficacy of this scam, we started to look into it a little more. We surveyed 7,054 people globally, including 1,009 adults in the UK, and found that almost a quarter of Brits say they or a friend have already experienced an AI voice scam like this. Nearly 80% of victims confirmed they had lost money as a result.

With 65% of adults not confident that they could distinguish a cloned voice from the real thing, it’s no surprise that this technique is gaining momentum.

Voice cloning tools are freely available

During our research, we found more than a dozen freely available AI voice-cloning tools on the internet, many requiring only a basic level of experience and expertise to use. In one instance, just three seconds of audio was enough to produce an 85% match, but with more investment and effort it’s possible to increase the accuracy. By training the data models, we were able to achieve a 95% voice match based on just a small number of audio files.
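To make those “match” percentages more concrete, here is a minimal sketch of one common way such a score can be computed: each recording is turned into a fixed-length speaker embedding, and two voices are compared by cosine similarity. Everything below is illustrative, not McAfee’s methodology: real tools derive embeddings from a trained speaker-encoder network, whereas this sketch simulates them with random vectors so it runs on its own, and the 256-dimension size and 0–100% rescaling are assumptions.

```python
import numpy as np

# Illustrative sketch: scoring a "voice match" as cosine similarity
# between fixed-length speaker embeddings. Real systems produce the
# embeddings with a trained speaker-encoder network; here they are
# simulated with random vectors so the script is self-contained.

rng = np.random.default_rng(42)

def cosine_match(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embeddings, rescaled to 0-100%."""
    cos = float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return (cos + 1.0) / 2.0 * 100.0  # map [-1, 1] onto [0, 100]

dim = 256                                      # assumed embedding size
real = rng.standard_normal(dim)                # embedding of the genuine voice
clone = real + 0.3 * rng.standard_normal(dim)  # close copy plus some noise
stranger = rng.standard_normal(dim)            # embedding of an unrelated voice

print(f"clone vs real:    {cosine_match(real, clone):5.1f}%")    # high match
print(f"stranger vs real: {cosine_match(real, stranger):5.1f}%") # ~50%, i.e. chance
```

The point of the sketch is simply that a cloned voice only needs to land close enough to the genuine embedding to cross whatever threshold a listener, or a verification system, implicitly applies.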

The more accurate the clone, the better the chance a cybercriminal has of duping someone into handing over money or taking another requested action. And because these hoaxes exploit the emotional vulnerabilities inherent in close relationships, a scammer could net thousands of pounds in just a few hours.

Where do they get your phone number, you might ask? When a company is targeted by a cyberattack and data is stolen, that data could include your personally identifiable information (PII). Hackers often sell PII on the dark web for other cybercriminals to use in further attacks. It’s therefore worth using an identity-monitoring service, which can notify you if your PII makes its way to the dark web.

Start by asking a question only they would know

If you do get a call or voicemail from someone asking for help and money, just pause and think. Does that really sound like them? Be cautious even if it’s a number you recognise, as it may have been spoofed (faked). It might be wise to use a previously agreed codeword or ask a question only they would know. Also remember, cybercriminals are betting on emotions running high. They will play on your connection to the loved one and create a sense of urgency to prompt you into action. Try to remain level-headed and pause before you take any next steps.

And to prevent your own voice from being cloned by scammers, think before you click and share online. Who is in your social media network? Do you really know and trust them? Be thoughtful about the friends and connections you have online. The wider your network and the more you share, the more you may be at risk of having your voice cloned for malicious purposes.

Wait before sharing your holiday videos!

Lastly, if you’re on holiday, for example, wait until you’re home before you share content. If your family members receive a call from “you” saying you’ve been in a car accident on holiday, but they know you’re now at home, the scam has a much lower chance of succeeding.

Generative artificial intelligence brings incredible opportunities, but as with any technology, there is always the potential for it to be used maliciously in the wrong hands. Thankfully, by taking a few small precautions, we can continue living our online lives confidently and securely.