A UK bank, Starling Bank, has issued a warning about a new wave of scams that use artificial intelligence to replicate people’s voices. Fraudsters can create convincing voice clones from just a few seconds of audio, often found in online videos, the bank said in a press release.
The online-only lender said that these scams are highly effective, with millions of people potentially at risk. The bank’s survey found that over a quarter of respondents had been targeted by such scams in the past year, and many were unaware of the threat, CNN reported.
“People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters,” Lisa Grahame, chief information security officer at Starling Bank, said in the press release.
According to the survey, 46% of respondents were unaware that such scams existed, and 8% said they would send as much money as requested by a friend or family member, even if the call seemed strange.
To protect themselves, people are advised to establish a “safe phrase” with their loved ones. This unique phrase can be used to verify identity during phone calls. The bank advised against sharing the safe phrase over text, which could make it easier for scammers to discover, but said that if it is shared this way, the message should be deleted once the other person has seen it.
As AI technology continues to advance, concerns about its potential for misuse are growing. OpenAI, the creator of ChatGPT, has even acknowledged the risks associated with voice replication tools.