Countries like the United States and Canada have already seen several such crimes, and the elderly are the criminals' main target.
A 73-year-old Canadian woman was the victim of a scam in which the criminal used an artificial intelligence-generated voice to impersonate her grandson. He called her, claimed to be in jail, and asked for bail money. The elderly woman rushed to a bank to withdraw a large sum, without a clue that she was being cheated.
According to The Washington Post, the woman, named Ruth Card, was badly frightened by the situation. She and her husband, Greg Grace, 75, quickly drove to the bank and withdrew C$3,000. However, when they tried to withdraw more money from a second account at another branch, the bank manager noticed and went to talk to the couple.
The manager explained that they might have been scammed, because something similar had happened to another account holder. After the conversation, they realized that their grandson was not really in jail. The elderly woman told the American newspaper that she had been fully convinced it was her grandson on the phone:
It wasn’t a very compelling story, but it didn’t need to be better than it was to convince us.
Artificial intelligence has become a tool for scammers
Because AI programs are so easy to use, criminals can clone anyone's voice from a sample of just a few seconds. The technology typically used is deepfake audio, which can manipulate recordings and mimic the characteristics of a specific voice, such as its timbre and pitch.
This makes it hard for a victim to tell whether they are really talking to someone they know.

It is also difficult for either the victim or the police to identify the attacker: tracing calls placed from different parts of the globe is complicated, and the scammers use other people's voices during the call.
In the United States, for example, this type of scam has become the second most popular among fraudsters, with more than 36,000 reports in 2022 alone, according to the Federal Trade Commission. As a result, over $11 million was stolen.
Another problem is that the courts have no precedent for holding companies that develop deepfake tools liable. This further hampers legal action after an individual has been tricked by criminals over the phone.
The elderly couple's case is one of many

Ruth Card and her husband, Greg Grace, were not the first to fall for this type of scam, and they will not be the last.
For example, an American named Benjamin Perkin, 39, watched his parents lose thousands of dollars to a scheme that used an AI-generated clone of his voice. A man called the elderly couple claiming to be a lawyer, then said their son had killed a diplomat in a car accident, was in jail, and needed bail money.
The fake lawyer "put" Benjamin Perkin on the phone to talk to his mother, but it was actually deepfake-generated audio. Shaken by the moment, the parents withdrew a large sum and deposited it at a bitcoin terminal.
By the time they discovered the scam, it was too late. "The money's gone. There's no insurance. There's no getting it back," Perkin said.