Can you trust your ears? Cybercriminals are using AI voice tools in new scam

  • Fraudsters are using widely available AI voice cloning technology to steal from people by impersonating their family members and making up emergencies
  • ‘Nearly anyone with an online presence is vulnerable to attack,’ Berkeley professor warns
Agence France-Presse

In a new breed of scams that has rattled authorities in the US, fraudsters are using strikingly convincing AI voice cloning tools to steal from people by impersonating family members. Photo: AFP

The voice on the phone seemed frighteningly real – an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools – widely available online – to steal from people by impersonating family members.

“Help me, mom, please help me,” Jennifer DeStefano, a mother in the US state of Arizona, heard a voice saying on the other end of the line.

DeStefano was “100 per cent” convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

“It was never a question of who is this? It was completely her voice ... it was the way she would have cried,” DeStefano told a local television station in April.

“I never doubted for one second it was her.”

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to US$1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

“AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively,” Wasim Khaled, chief executive of Blackbird.AI, told Agence France-Presse.

The biggest peril of artificial intelligence, experts say, is its ability to blur the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to exploit. Photo: AFP

A simple internet search yields a wide array of apps, many available for free, that can create an AI voice from a small sample – sometimes only a few seconds – of a person’s real voice, which can easily be lifted from content posted online.

“With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls,” Khaled said.

“Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deepfakes.”

In a global survey of 7,000 people from nine countries, including the United States, one in four respondents said they had experienced an AI voice cloning scam or knew someone who had.

Seventy per cent of respondents said they were not confident they could “tell the difference between a cloned voice and the real thing,” according to the survey, published last month by US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the “grandparent scam” – where an impostor poses as a grandchild who urgently needs money in a distressing situation.

“You get a call. There’s a panicked voice on the line. It’s your grandson. He says he’s in deep trouble – he wrecked the car and landed in jail. But you can help by sending money,” the US Federal Trade Commission (FTC), a government agency devoted to consumer protection, said in a warning in March.

“It sounds just like him. How could it be a scam? Voice cloning, that’s how.”

In the comments beneath the FTC’s warning were multiple testimonies of elderly people who had been duped that way.

That also mirrors the experience of Eddie, a 19-year-old in Chicago in the US state of Illinois, whose grandfather received a call from someone who sounded just like his grandson and claimed to need money after a car accident.

Cybercriminals are using AI voice clones to pretend to be family members of their victims and ask for money. Photo: Shutterstock

The ruse, reported by McAfee Labs, was so convincing that his grandfather began urgently scrounging together money and even considered remortgaging his house, before the lie was discovered.

“Because it is now easy to generate highly realistic voice clones ... nearly anyone with any online presence is vulnerable to an attack,” Hany Farid, a professor at the UC Berkeley School of Information in California, told Agence France-Presse.

“These scams are gaining traction and spreading.”

Earlier this year, AI start-up ElevenLabs admitted that its voice cloning tool could be misused for “malicious purposes” after users posted deepfake audio purporting to be actor Emma Watson reading Adolf Hitler’s manifesto Mein Kampf.

“We’re fast approaching the point where you can’t trust the things that you see on the internet,” Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told Agence France-Presse.

“We are going to need new technology to know if the person you think you’re talking to is actually the person you’re talking to,” he said.
