Hey Mum, there’s an AI scam that can impersonate my voice

There was nothing unusual about Nina Merrilees getting a WhatsApp message from her daughter about how she had broken her phone. But mums like her are now being targeted with a new wave of technology that allows scammers to use artificial intelligence to impersonate the voices of loved ones.

Merrilees, who lives on the NSW/Victorian border, said her daughter has been overseas for seven years and was “forever breaking her phone or losing her phone”.

“To get a message saying, ‘Hi Mum, my phone is broken, this is my new number’ – we’ve had quite a few of those over the years,” Merrilees said.

Nina Merrilees was duped by a voice mimicking scam.

Merrilees quickly responded to the call for help and paid more than $11,000 for what she thought were her daughter’s bills. She is among tens of thousands of Australians who lose billions of dollars to scammers each year.

In a new twist, scammers are now using artificial intelligence to sound like distressed family members who need quick access to thousands of dollars in cash. Advances in AI allow a voice to be cloned from an audio sample of just a few sentences. The scammer can then “speak” words they type.

Consumer Action Law Centre chief executive officer Stephanie Tonkin said voice mimicking was an example of how sophisticated scamming technology had become – and Australia was a soft target.

“This is a whole new wave of technology and the scammers are incentivised to keep on going and developing,” she said. “We expect more and more of this stealing technology to develop. It has run away from us.”

“We need our businesses, who have some control and oversight, to be equally innovative in protecting their own customers.”

Text messages sent to scam victim Nina Merrilees.

A Canadian grandmother received a call that sounded like her grandson saying he was in jail and needed cash for bail. The woman and her husband transferred C$3000 ($3300) to a scammer before discovering her grandson’s voice had been faked. Their bank said another customer had been duped in the same way, The Washington Post reported.

Merrilees said the use of AI to impersonate the voice of victims’ loved ones was “really frightening”.

“It’s all the more reason why banks need to step up how they respond,” she said.

After Merrilees responded to the message about a broken phone, it had made sense that her daughter would be setting up a new phone and could not talk while she was at work.

Merrilees was then asked to make some payments because they were urgent and couldn’t be made without a bank transfer verification on the new phone. Merrilees made a first payment of $3450, a second for $3800 and a third for $4350 – a total of $11,600.

“She was getting a golden retriever puppy and I thought obviously they have to pay for the dog because they are picking it up next week,” Merrilees said.

After making the third payment, Merrilees asked what it was for. When the reply was “new furniture”, Merrilees felt uneasy.

“I emailed her, and she rang back on her old number and I felt physically ill,” Merrilees said. The bank said there was no guarantee she would get the money back because she had transferred the money herself.

Labor made an election commitment to combat scams with tougher industry codes and the creation of a National Anti-Scams Centre.

The Consumer Action Law Centre wants banks to be forced to reimburse scam victims to give them a greater incentive to invest in fraud protection. “Banks have the resources and should significantly increase their investment in technology to prevent and deter scammers,” Tonkin said.

Text messages sent to scam victim Nina Merrilees.

Financial Services Minister Stephen Jones has said he does not fully support forcing banks to reimburse all scam victims because this could create a “honeypot for scammers”.

“Industry, including banks, should be held to a very high standard in relation to scam protection for consumers and where these standards are not met, they should be held to account,” his spokeswoman said.

A spokeswoman for the Australian Banking Association said it was developing an industry-wide position on customer losses to scams. She said voice impersonation was an emerging example of how scam types and methods were “rapidly evolving”.

“Scammers are getting increasingly sophisticated, and scams are getting more complex and all sectors must play a role in combating this problem including banks, telcos, online shopping platforms and search engines,” she said. “Banks consider the circumstance of each case – and will cover losses depending on how the scam was committed.”

A spokeswoman for the Australian Competition and Consumer Commission, which estimates Australians lost $4 billion from scams last year, said its Scamwatch service has seen growing sophistication in scams and “is alert to the risks AI presents”.


Source: smh.com