Starling Alerts Public to Increasing Voice Cloning Scam Threats

Voice cloning scams, which involve fraudsters using AI technology to imitate the voices of friends or family members, may soon affect millions, according to new research from Starling Bank.

The study reveals that over a quarter (28%) of UK adults have been targeted by an AI voice cloning scam at least once in the past year. Starling highlights that scammers can replicate an individual’s voice using just three seconds of audio, which can easily be obtained from videos shared online or on social media.

These scammers can then contact the person’s relatives, using the cloned voice in phone calls, voice messages, or voicemails asking for urgent financial assistance. Alarmingly, nearly 10% of survey respondents said they would send money in such situations, even if they doubted the authenticity of the call.

Despite the growing risk of such fraud, only 30% of individuals feel confident they could recognize the signs of a voice cloning scam.

In response to this threat, Starling Bank has launched the Safe Phrases campaign, supporting the government’s Stop! Think Fraud initiative. The campaign encourages individuals to establish a “Safe Phrase” with close friends and family—an exclusive phrase that only they know—to verify their identity during phone conversations.

Lisa Grahame, chief information security officer at Starling Bank, stated, “Many people share recordings of their voice online without realizing it increases their vulnerability to fraud. Implementing a Safe Phrase with trusted individuals—one that is never shared digitally—provides a simple and effective method to confirm who is on the phone.”

To promote the campaign, Starling Bank has enlisted actor James Nesbitt, whose voice was cloned using AI to demonstrate how easily anyone could fall victim to such scams.

Nesbitt remarked, “I consider my voice to be quite distinctive, and it’s a key part of my career. Experiencing an accurate clone of it was startling. There’s much talk about AI, but this experience has truly made me aware of the technology’s advancements and its potential misuse. As a parent, the thought of my children being scammed in this manner is frightening. I will certainly establish a Safe Phrase with my family and friends.”

For those interested in exploring the challenges and opportunities that artificial intelligence presents to banking, more information is available about Finextra’s first NextGenAI conference, taking place on November 26, 2024.