Audio deepfakes, i.e. voices imitated by computer systems, are constantly evolving, and with them the number of AI-supported fraud attempts is rising.
The so-called "grandchild trick", in which the perpetrator pretends to be the victim's child or grandchild and fabricates a crisis situation in order to obtain money, is already well known. Now, however, perpetrators are using artificial intelligence that specializes in imitating voices. Some of these AI systems need as little as 30 minutes of audio material to create a complete voice profile, which can then be used to impersonate that person. With generative AI, this works alarmingly well. Obtaining the required audio is also easier than ever: social media provides ample material.
The imitated voice is indistinguishable from the original
There were already many reports of such fraud attempts in the corporate environment even before generative AI became widely available, as evidenced by an incident in the United Arab Emirates in 2021. Many people fall for these scams because most are simply unable to distinguish the imitated voice from the original. And the more material the AI has at its disposal, the harder it becomes to detect anomalies in the imitated voice.
McAfee's "Beware the Artificial Impostor" report found that 25 percent of respondents worldwide said they had either received a scam call using an AI-imitated voice or knew someone who had. With online services such as those from Eleven Labs offering instant voice cloning that creates a synthetic voice from 30 minutes of audio samples, it is only a matter of time before threat actors exploit AI voice-based fraud even more aggressively. According to the report, nearly half (48 percent) of those surveyed would help if they received a call about a car accident, 47 percent would help in the case of a reported theft, 43 percent in the case of a lost wallet, and 41 percent if the caller claimed to need help while on vacation.
Few can tell the difference
70 percent also said they could not tell whether a voice was real or fake. Often the only real clue that a call is a scam is the fact that the call itself is unexpected. The potential uses for this type of fraud in the business world range from CEO gift card fraud to other digital scams – all of which require employees to participate continuously in security awareness training so that they remain vigilant, even when the voice on the other end of the phone sounds familiar.
More at KnowBe4.com
About KnowBe4

KnowBe4, provider of the world's largest platform for security awareness training and simulated phishing, is used by more than 60,000 companies around the world. Founded by IT and data security specialist Stu Sjouwerman, KnowBe4 helps organizations address the human element of security by raising awareness of ransomware, CEO fraud and other social engineering tactics through a new approach to security education. Kevin Mitnick, an internationally recognized cybersecurity specialist and KnowBe4's Chief Hacking Officer, helped develop the KnowBe4 training based on his well-documented social engineering tactics. Tens of thousands of organizations rely on KnowBe4 to mobilize their end users as the last line of defense.