AI becoming a handy tool for US fraudsters
2023-07-28 

Technology being employed to clone people's voice for ransom, govt warns

People in the United States are being warned to stay vigilant against a growing number of scams that use artificial intelligence to mimic a person's voice in a phone call to a worried relative or friend, who is then asked to send ransom money.

The Federal Trade Commission, or FTC, issued the consumer warning alert this year after an increase in the number of people reporting they had been asked to send money after receiving a frantic phone call from a person who they believed was their loved one but was in fact a cloned voice using AI.

Jennifer DeStefano from Scottsdale, Arizona, experienced the crime firsthand. She told a US Senate judiciary hearing last month that she got a call from an unlisted number in April, and when she picked up, she could hear her daughter, Briana, crying.

"Mom! I messed up," her daughter said sobbing on the phone call.

DeStefano asked her daughter, "OK, what happened?"

She then heard a man's voice on the phone telling her daughter to "lay down and put your head back".

He then told the worried mother: "Listen here, I have your daughter. You tell anyone, you call the cops, I am going to pump her stomach so full of drugs."

DeStefano was at her other daughter Aubrey's dance rehearsal when she picked up the phone. She put the phone on mute and asked nearby parents to call 911.

The scammer first asked her to send $1 million, but when she said she did not have access to that much money, he asked for $50,000 in cash and arranged a meet-up spot.

The terrified mother said the man on the phone told her that "if I didn't have all the money, then we were both going to be dead".

However, she contacted her husband and daughter and found out Briana was safe, and it was a hoax.

Cybercrimes on rise

Last year, frauds and scams rose 30 percent compared with the previous year, the FTC said. Cybercrimes are also increasing with losses of $10.2 billion last year, the FBI said.

Scammers use AI to mimic a person's voice by obtaining "a short audio clip of your family member's voice from content posted online and a voice-cloning program", the consumer protection watchdog said. When they call, they will sound just like the person's loved one.

In another scam, a Canadian couple was duped out of C$21,000 ($15,940) after listening to an AI voice that they thought was their son, The Washington Post reported in March.

According to a recent poll by McAfee, an antivirus software organization in San Jose, California, at least 77 percent of AI scam victims have sent money to fraudsters.

Of those who reported losing money, 36 percent said they had lost between $500 and $3,000, while 7 percent got taken for anywhere between $5,000 and $15,000, McAfee said.

About 45 percent of the 7,000 people polled from nine countries — Australia, Brazil, France, Germany, India, Japan, Mexico, the United Kingdom and the US — said they would reply and send money to a friend or loved one who had asked for financial help via a voicemail or note.

Forty-eight percent said they would respond quickly if they heard that a friend was in a car accident or had trouble with their vehicle.

Although phone scams are nothing new worldwide, in this AI version, fraudsters are getting the money sent to them in a variety of ways, including wire transfers, gift cards and cryptocurrency.

Consumers are encouraged to contact the person they believe is calling directly to confirm they are safe before sending any money.

FTC Chair Lina Khan warned House lawmakers in April that fraud and scams were being "turbocharged" by AI and were of "serious concern".

Avi Greengart, president and lead analyst at Techsponential, a technology analysis and market research company in the US, told China Daily: "I think that it is hard for us to estimate exactly how pervasive (AI) is likely to be because this is still relatively new technology. Laws should regulate AI."

The software to clone voices is becoming cheaper and more widely available, experts say.

AI speech software ElevenLabs allows users to convert text into voice-overs meant for social media and videos, but many users have already shown how it can be misused to mimic the voices of celebrities, such as actress Emma Watson, podcast host Joe Rogan and columnist and author Ben Shapiro.

Other videos mimicking the voices of US President Joe Biden and former president Donald Trump have also appeared on platforms such as Instagram.
