Using the Voices of Family: Criminals Deploy AI Voice-Generation in Sophisticated Scams

 

Scammers are increasingly incorporating AI voice-generation technologies to make ‘Vishing’ scams more convincing. A widely reported US case illustrates the lengths scammers will go to in exploiting AI, convincingly mimicking the voices of family members. Below, we offer practical tips to help you stay alert and better protect yourself and your loved ones from this concerning tech threat.

 
 

As Artificial Intelligence (AI) voice-generation technologies continue to develop, so does their adoption by cyber criminals in lucrative Vishing scams (phone calls in which a criminal tries to trick you into sharing information that can be used for their personal gain). Scammers can source voice samples from publicly available recordings, social media, or any other online platform where the targeted individual's voice is accessible. The technology allows them to recreate and manipulate these samples with remarkable accuracy, enabling the convincing impersonation of family members and acquaintances.

A notable case reported by The Hill in November 2023 illustrates the extent of this issue: Philadelphia-based attorney Gary Schildhorn testified before a US Senate Special Committee, describing how a scammer attempted to obtain $9,000 by replicating his son’s voice with AI voice-generation technology.

The scammer first called Gary, using AI voice generation to impersonate his son. The caller, posing as Gary’s son, claimed to have caused a car accident involving a pregnant woman, to have been arrested, and to now be in jail. The caller then told Gary that he was being represented by a public defender, Barry Goldstein.

The scammer then posed as public defender ‘Barry Goldstein’ and told Gary that his son had been drinking and had failed a breathalyser test. To secure his son’s release from jail, Gary was advised by the impersonated public defender to pay 10% of the $90,000 bail amount ($9,000) via a Bitcoin cryptocurrency kiosk.

Luckily, Gary contacted his son via the FaceTime video-calling application and verified that he was not in jail. Before sending any funds, he realised that this was an elaborate scam.

This is one of many publicly reported Vishing scams assisted by AI voice generation. In November 2023, the FBI notified the public that it had received 195 similar complaints, mainly targeting senior citizens, resulting in over $1.9 million in victim losses.


How to Protect Yourself Against These AI Voice-Generation Scams

  • Never answer calls from private or withheld telephone numbers; only answer calls from telephone numbers you recognise.

  • If you believe the caller may not be who they say they are, do not share any personal information with the caller and immediately hang up the call.

  • Hang up the call if the caller urgently demands funds or any other form of payment, especially cryptocurrencies (e.g. Bitcoin, Ethereum, Monero).

  • If the caller claims to be a family member or another individual, hang up and call the family member or individual using the contact number you have saved for the person.

  • Keep in mind some key points from our Vishing scam guidance:

    • A strange phone number - A vishing call will come from a number you haven’t seen before. It is unlikely to match the official number of the person or organisation being impersonated, though in some cases scammers may be able to spoof the number. Check the number against the organisation’s website or against the telephone number you have saved for the individual who is calling.

    • The creation of a sense of urgency - The caller often asks recipients to verify personal information, such as bank details or a password. They can create a sense of urgency by warning that your account has experienced suspicious activity or pretending to be someone you know who is in urgent need of financial help. These are massive warning signs. If you are ever unsure, contact the company or person using the contact details you already have for them or that are on their legitimate website. Never use any contact details provided by the caller.

    • Something doesn’t feel right - Trust your gut. If it doesn’t feel right, it probably isn’t. Hang up and contact the person or organisation directly.


Have you been a victim of an AI Voice-Generated Scam?

If you are in England, Wales or Northern Ireland, you should report all cyber crime to Action Fraud. In Scotland, you should report it to Police Scotland.

Further information regarding Vishing scams can be found in our Vishing Guide.

Our team is here to provide the guidance you need to navigate the digital landscape. If you've experienced any cybersecurity crime or online harm, visit The Cyber Helpline for further support.