A UK bank has warned that “millions” of people could be at risk of falling victim to scams that use artificial intelligence to clone their voices.
Starling Bank, a digital-only lender, explained that fraudsters can use AI to replicate someone’s voice from just three seconds of audio, such as a clip from an online video. Scammers can then identify the person’s friends and family members and use the cloned voice to make phone calls requesting money.
“These scams have the potential to deceive millions,” Starling Bank stated in a press release on Wednesday.
The bank noted that hundreds of people have already been affected. A survey of more than 3,000 adults, conducted with Mortar Research, found that over a quarter of respondents had been targeted by an AI voice-cloning scam in the past year.
The survey also found that 46% of respondents were unaware such scams existed, and that 8% said they would send money to a friend or family member even if the call felt suspicious.
“People frequently share content online featuring their voice, not realizing it could make them vulnerable to fraudsters,” said Lisa Grahame, Chief Information Security Officer at Starling Bank.
To protect against these scams, the bank recommends setting up a “safe phrase” with loved ones: a unique, memorable phrase that can be used to verify identity over the phone. The bank advises against sharing the phrase via text, but if it is shared this way, the message should be deleted once it has been seen.
As AI technology continues to advance in mimicking human voices, concerns are growing about its potential misuse, such as helping criminals access bank accounts or spreading misinformation.
Earlier this year, OpenAI, the creator of ChatGPT, introduced its Voice Engine tool for voice replication but did not release it to the public, citing the risk of synthetic voice misuse.