Audio deepfakes become a tool to scam family and friends | CT Detective

Over time, cybercriminals adapt to emerging technologies to create more personalized and effective scams, making them more successful at stealing data and money from victims.

One such tool being exploited recently is the so-called deepfake. The AI technique has been used to simulate people’s voices in order to ask for money and deceive family and friends. Detective TudoCelular explains the situation and, below, how to protect yourself.

Recent cases



One recent case involved Ruth Card, a 73-year-old Canadian. In March of this year, a criminal impersonated her grandson using an AI-generated voice, called her, and said he was under arrest and needed money for bail.

The elderly woman believed him and rushed with her husband, Greg Grace, 75, to withdraw a large sum from the bank: C$3,000, to be exact. When they tried to withdraw a further amount at another financial institution, the manager warned the couple that it might be a scam, having seen a similar case involving another account holder.

The scam also hit Brazil

But this practice is not reserved for other countries; the scam has also been carried out on Brazilian soil. The Instagram profile of the site SOS Almanac reported a case in which a man received a call with the voice of his son, asking for R$600 supposedly to pay a bill.

However, when he transferred the amount, the money went to an account under a different name, which struck the man as suspicious.

What is a deepfake?



Deepfake is the name given to a technology that uses artificial intelligence with deep learning techniques to copy a person’s voice or appearance from just a few seconds of source material.

The result can be produced as video, image, or audio, all with ultra-realistic quality. This content can put words into the voice of someone who never spoke them, which makes it possible to deceive third parties.

And while the practice has become increasingly simple thanks to AI software, detecting the fraud is often complex: the voice heard is not the cybercriminal’s own, and tracing the calls adds further challenges.




It didn’t start now

Although the deceptive practice gained momentum in 2024, it is not new among cybercriminals. In mid-2020, a company employee, who declined to be identified at the time, reportedly received an audio message from the company’s CEO.

In the voicemail, the recording asked for “immediate assistance to finalize an urgent matter”. The worker became suspicious and contacted the security consultancy Nisos, which determined the audio was synthetic.

But this is not just a voice problem. Frauds can also be carried out using the victim’s image, mainly to defeat biometric authentication at banks.

According to a study carried out by iProov and released last week, this method has become one of the main attacks used to gain unauthorized access to banking applications. The study also estimates that fraud causes a 20% loss in online banking revenue in Latin America, around R$60 billion a year, with Brazil in first place.

How to protect yourself?



To keep someone pretending to be a relative from deceiving you, there are a few tricks for checking the origin of a voice message. One of them is to video call the person, preferably covering your camera so you can’t be seen before confirming who is on the other side.

Another possibility is to agree on a “keyword” with your family or close friends. Then, if in doubt about who is on the other end, just ask the person to say the agreed-upon word to see whether the voice message is genuine.

Have you ever come across a scam like this? How did you identify the fraud, or what did you do about it? Tell us in the comments below.

Julia Fleming

"Prone to fits of apathy. Beer evangelist. Incurable coffeeaholic. Internet expert."

Leave a Reply

Your email address will not be published. Required fields are marked *