Clone a relative’s voice: the AI scam

There has been a lot of talk lately about possible applications of artificial intelligence (AI) in the most varied fields. There are those who have used the technology to write songs, those who, like Microsoft, are building it into their search engines, and those who use it to pray with Padre Pio (all true, we told you about it here). However, artificial intelligence has also already been used to carry out cyberattacks and, more recently, for more than one scam attempt.

The case of Ruth Card, Greg Grace and their grandson Brandon

The incident took place in Regina, the capital of the Canadian province of Saskatchewan, not far from the US border. The victim of the scam was a 73-year-old woman, Ruth Card, who received a call from what sounded like her grandson Brandon. He told his grandmother that he was in jail, without a wallet or cell phone, and asked her for money to post bail.

Of course, as you may have guessed, the real Brandon wasn’t on the phone. The scammers had in fact used AI to simulate the young man’s voice in an attempt to defraud the woman. The lady, completely unaware of the scam, immediately rushed to the bank with her husband, Greg Grace (75 years old). The two withdrew 3,000 Canadian dollars (just over €2,000). When they went to a second branch, trying to get around the daily withdrawal limit, the couple was stopped by the branch manager.

The manager, concerned by the couple’s unusual movements, asked about the reasons behind the withdrawal. Sensing that the two might be at the center of a scam, he brought the spouses back to their senses, and they realized that the real Brandon had never actually been in handcuffs.

“We totally fell for it,” Ruth Card told the Washington Post. “We were convinced we were talking to Brandon.”

A well-crafted scam using artificial intelligence

The episode in question is certainly not an isolated case. The Washington Post reports that, in 2022 alone, impostor scams were the second most common type of fraud in America, with more than 36,000 reports of people being swindled by criminals posing as friends and family (Federal Trade Commission data). Over 5,100 of these incidents occurred over the phone, accounting for more than $11 million in losses.

Recent advances in artificial intelligence have only given criminals new tools to carry out increasingly believable scams. Just think of deepfake technologies, capable of faithfully replicating voices and faces.

These scam attempts, whether they involve artificial intelligence or not, are known as impostor scams. The mechanism is more or less always the same: the victim, generally elderly, is contacted by a scammer who pretends to be a relative or trusted person and asks for money because they are supposedly in a difficult situation. Playing on the victim’s emotions, the scammers push for the money to be sent as quickly as possible. The victim ends up in a state of alarm, trying to help the relative, friend or loved one in need without reasoning rationally. Victims are often chosen among the elderly, who tend to be less familiar with the dangers of new technologies.

The Benjamin Perkin case

Another example of a carefully planned artificial intelligence scam is the one whose victims were the parents of 39-year-old Benjamin Perkin. The couple, also elderly, received a phone call from an alleged lawyer who informed them that Benjamin was in prison after killing a US diplomat in a car accident. The lawyer insisted that Benjamin needed money immediately to pay his legal costs.

The parents were even more alarmed when the self-styled lawyer handed the phone to “Benjamin Perkin” himself. They found themselves with their son’s voice on the other end of the line, telling them that he loved them and asking them to help him financially in that dramatic situation. The voice, obviously fake, asked for $21,000 to cover the first court hearing, which was to take place that same day.

Perkin’s parents later revealed that the call had indeed sounded suspicious, but that they couldn’t shake the feeling that they had actually talked to their son in prison. The voice sounded “true enough for my parents to actually believe they were talking to me,” the real Benjamin Perkin later explained. In a panic, the couple rushed to the bank and sent the amount requested by the lawyer: $21,000 in Bitcoin. They realized they had been scammed only a few hours later, when the real Benjamin called his parents for a routine chat.

How did cybercriminals steal Benjamin’s voice?

It’s unclear how the scammers got hold of Benjamin Perkin’s voice, but the source was likely a video on YouTube, a platform on which he regularly posted videos about snowmobiling (his great passion). The family immediately reported the scam to the Canadian federal authorities, but by then it was too late, not least because the payment had been made in Bitcoin.

Is it that simple to realistically simulate a human’s voice?

The answer to this question is yes.

AI-based speech generation software analyzes what makes a person’s voice unique, taking into account parameters such as age, gender and accent. Once the sample has been analyzed, the software searches a vast database of voices for similar ones and uses them to reconstruct the target’s speech patterns. All it takes is a short audio sample, which can easily be lifted from YouTube, podcasts, commercials, or TikTok, Instagram and Facebook videos.
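To give a concrete sense of how little technical effort this involves, below is a minimal sketch, in Python, of how a generic voice-cloning text-to-speech service could be called over HTTP: upload a short sample, receive a voice identifier, then ask for arbitrary text to be spoken with that voice. The endpoint, request fields and response format are hypothetical placeholders, not the actual API of ElevenLabs or of any other service mentioned in this article.

```python
import requests

# Hypothetical voice-cloning service -- the URL, fields and key are placeholders,
# not the real interface of any product named in this article.
API_URL = "https://api.example-voice-service.com/v1"
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}


def clone_voice(sample_path: str, name: str) -> str:
    """Upload a short audio sample and return the ID of the cloned voice."""
    with open(sample_path, "rb") as sample:
        resp = requests.post(
            f"{API_URL}/voices",
            headers=HEADERS,
            files={"sample": sample},
            data={"name": name},
        )
    resp.raise_for_status()
    return resp.json()["voice_id"]  # assumed response field


def synthesize(voice_id: str, text: str, out_path: str) -> None:
    """Ask the service to speak arbitrary text with the cloned voice."""
    resp = requests.post(
        f"{API_URL}/synthesize",
        headers=HEADERS,
        json={"voice_id": voice_id, "text": text},
    )
    resp.raise_for_status()
    with open(out_path, "wb") as out:
        out.write(resp.content)  # raw audio bytes, e.g. an MP3 file


if __name__ == "__main__":
    # A few seconds of audio taken from a public video would be enough.
    voice = clone_voice("short_sample.mp3", "demo-voice")
    synthesize(voice, "This is a demonstration of synthetic speech.", "output.mp3")
```

The point of the sketch is not the specific calls, which are invented, but the shape of the workflow: two HTTP requests and a few seconds of publicly available audio are all that this kind of scam requires.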

“Until a few years ago, you needed a lot of audio to clone a person’s voice,” explains Hany Farid, professor of digital forensics at the University of California, Berkeley. “Now, if you just posted a video on TikTok with your voice, people can literally clone your voice.”

Companies like ElevenLabs, an AI speech synthesis start-up founded in 2022, can take a short voice sample and generate a faithful copy of the voice. ElevenLabs’ software is free to use, although paid plans (from $5 to $330) unlock the most advanced features. An insignificant investment for anyone planning a scam worth thousands of dollars.

Walker Ronnie is a tech writer who keeps you informed on the latest developments in the world of technology. With a keen interest in all things tech-related, Walker shares insights and updates on new gadgets, innovative advancements, and digital trends. Stay connected with Walker to stay ahead in the ever-evolving world of technology.