
How a deepfake in the hands of criminals becomes a real weapon against humanity


With the advent of neural networks, scammers have learned to use them for fraud and extortion.

Earlier this year, a cybercriminal tried to extort $1 million from an Arizona woman, Jennifer DeStefano, claiming he had kidnapped her daughter. For added credibility, the caller threatened to beat the girl while DeStefano heard her daughter's screams, crying, and desperate pleas for help over the phone.

However, the cries turned out to be fake. DeStefano told police she was certain the man on the phone had really kidnapped her daughter because the supposed victim's voice was identical to her daughter's.

This is one of a rapidly growing number of cases in which cybercriminals have used AI tools to deceive people. The problem has become so acute that in early June the FBI issued a warning about numerous complaints of rising sextortion: attackers publish convincing pornographic content online, including material depicting minors, created with machine learning technologies, and then threaten to share it with the victims' friends and acquaintances unless they pay.

In many cases, all attackers need to create a deepfake is a small sample of the target's content. Even a few seconds of audio that a person posts on social media is enough to clone the victim's voice. According to Trend Micro researchers, many AI tools are now available that make it easy to clone a voice from samples collected across various sources.
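To illustrate how low the barrier has become, here is a minimal sketch of few-shot voice cloning using the open-source Coqui TTS library and its XTTS model, one example of the kind of publicly available tools the researchers describe. The file names and spoken text are hypothetical:

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS library
# (pip install TTS). Assumption: "victim_sample.wav" is a short recording
# (a few seconds) of the target's speech -- hypothetical here.
from TTS.api import TTS

# Load a pretrained multilingual model that supports zero-shot voice cloning.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize arbitrary text in the cloned voice, conditioned on the short sample.
tts.tts_to_file(
    text="This is not really my voice.",
    speaker_wav="victim_sample.wav",  # a few seconds of the target's voice
    language="en",
    file_path="cloned_output.wav",
)
```

A sketch like this runs on commodity hardware, which is precisely why such tools lower the entry barrier for abuse.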

Back in May, cybersecurity firm Recorded Future warned of attackers' growing interest in online voice-cloning services (Voice Cloning-as-a-Service, VCaaS), which make deepfake-based fraud easier. According to Recorded Future, off-the-shelf voice-cloning platforms are increasingly appearing on the darknet, lowering the entry barrier for cybercriminals. Some are free once you sign up for an account, while others cost only about $5 per month.

Also this year it emerged that VALL-E, a new neural network from Microsoft, can imitate a specific person's voice down to its intonation. Reportedly, VALL-E breaks the supplied sample into tiny fragments and compares them against its existing database. Having learned how other people's voices sound in different situations, the network "infers" how the donor's voice would sound in the same situations.
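The "tiny fragments" here are discrete audio tokens: the VALL-E paper describes representing speech as codes produced by Meta's open-source EnCodec neural codec, which a language model then continues in the donor's voice. The sketch below shows only that tokenization step; the file name is hypothetical, and the generation stage is omitted:

```python
# Sketch of the audio-tokenization step behind VALL-E-style voice cloning,
# using Meta's open-source EnCodec codec (pip install encodec).
# Assumption: "sample.wav" is a short voice recording -- hypothetical here.
import torch
import torchaudio
from encodec import EncodecModel
from encodec.utils import convert_audio

# Load the 24 kHz EnCodec model; higher bandwidth means more codebooks per frame.
model = EncodecModel.encodec_model_24khz()
model.set_target_bandwidth(6.0)

# Load the voice sample and resample/remix it to the codec's expected format.
wav, sr = torchaudio.load("sample.wav")
wav = convert_audio(wav, sr, model.sample_rate, model.channels).unsqueeze(0)

# Encode the waveform into discrete tokens: shape [batch, codebooks, time steps].
with torch.no_grad():
    frames = model.encode(wav)
codes = torch.cat([codes for codes, _ in frames], dim=-1)
print(codes.shape)  # e.g. torch.Size([1, 8, 225]) for roughly 3 seconds of audio
```

These token sequences are the "fragments" a model can statistically match against the voices it saw in training.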

Beyond voice forgery itself, voice authentication is another thing to worry about. Recently, computer scientists at the University of Waterloo developed an attack method that can bypass voice-authentication security systems with a success rate of up to 99% within six attempts.
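The Waterloo attack itself is not reproduced here, but the kind of check it defeats can be sketched: many voice-authentication systems compare a speaker embedding of the incoming audio against an enrolled voiceprint and accept anything above a similarity threshold. The sketch below uses the open-source Resemblyzer library as a stand-in for such a system; the file names and the 0.75 threshold are hypothetical:

```python
# Toy speaker-verification check of the kind voice-authentication systems perform,
# using the open-source Resemblyzer library (pip install resemblyzer).
# Assumptions: file names and the acceptance threshold are hypothetical.
from pathlib import Path

import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()

# Enroll: compute a voiceprint (embedding) from the legitimate user's recording.
enrolled = encoder.embed_utterance(preprocess_wav(Path("enrolled_user.wav")))

# Verify: embed the incoming audio -- which could be a cloned voice.
attempt = encoder.embed_utterance(preprocess_wav(Path("login_attempt.wav")))

# Resemblyzer embeddings are L2-normalized, so the dot product is cosine similarity.
similarity = float(np.dot(enrolled, attempt))
THRESHOLD = 0.75  # hypothetical acceptance threshold
print("accepted" if similarity > THRESHOLD else "rejected",
      f"(similarity={similarity:.2f})")
```

A high-quality clone of the enrolled voice can push this similarity above the threshold, which is exactly the weakness such attacks exploit.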



Source: www.securitylab.ru
