
When Voices Deceive: How Deepfake Audios Exploit Trust and Create Cyber Vulnerabilities


Introduction

“Ask and ye shall receive” is a famous saying in the Christian world.

When those words were written in the New Testament, no one anticipated that Satan might one day use them to fool the gullible masses. Yet in the AI era, Satan has emerged in the form of fraudsters armed with generative AI. It has never been so easy: just create a deepfake audio of anyone and call up their contacts for money, favours or passwords. If you are clever enough to impersonate the right individual, you will get what you ask for most of the time. The Gospel truth has never rung easier, or truer, for cybercriminals.

Methodology: How does the victim experience it?

One morning, you receive a call from an unknown number. It’s your boss. “Good morning, I’m calling from my personal number because the office phone is charging. Save this alternate number.”

 

You reply, “Yes, Boss.”

 

The Boss hurries on: “We need to remit money to a supplier by 9.30 so that he immediately despatches the raw materials. Delivery from their Chennai godown takes four days, so we can’t wait any longer. I will do it myself. Do you remember the transaction password?”

 

You reply, “It’s written in the diary inside my locked drawer, in case you are at the office. But I made some remittances yesterday, so I remember it. It’s Xxxxxxxx. I am on my way and will reach the office in about 15 minutes.”

 

The Boss continues, “Okay, I will make the down payment now. Come to my cabin once you reach the office; we have some more payments to make.”

 

The victim says okay and hangs up. After reaching the office, he knocks and walks into his Boss’s cabin. After the initial greetings, he mentions the conversation, and the Boss looks confused. The victim goes over the details, but his Boss doesn’t have the slightest idea what he is talking about. They try the number from which the alleged call came, and it is switched off. The victim and his Boss are now in panic mode, realizing that the password has been given away. Hurriedly, they log in to the company’s internet banking account and find the cash credit (CC) limit exceeded. Transactions to the tune of crores have already gone through, and the cyber fraud has struck successfully. It was mere “asking” that did it. No force, no threat, no complex hacking was required. An accountant simply obeyed the voice of his Boss.

Background: How does the fraudster execute it?

While the victim was fooled by the single step of being asked, the fraudster needed some background knowledge, which he acquired from the safety of his laptop.

 

After finding the victim’s LinkedIn profile, the fraudster concluded that he was the accountant in charge at XYZ Ltd. It wasn’t difficult to find his Boss, because the Boss had written glowingly about him in a LinkedIn recommendation. The opportunity was out there on an open platform. The only input the fraudster needed was a 10–20 second audio clip of the Boss. LinkedIn, Facebook and Twitter were generous with it: the Boss had given a speech at the office Diwali celebration, and Facebook Live had a full 9-minute video of him.

 

The fraudster neatly downloads the videos into a folder and extracts an audio clip from them. The clip is then uploaded to a free AI voice-cloning tool that will speak to the victim the next morning.
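To appreciate how low the barrier is: trimming a short voice sample out of a longer recording needs nothing beyond Python’s standard library. The sketch below (file names are hypothetical, and it assumes the recording has already been saved as an uncompressed WAV file) copies the first 20 seconds of a clip using the built-in `wave` module.

```python
import wave

def clip_wav(src_path: str, dst_path: str, seconds: float = 20.0) -> float:
    """Copy the first `seconds` of a WAV file to dst_path.

    Returns the duration (in seconds) of the clip actually written.
    """
    with wave.open(src_path, "rb") as src:
        params = src.getparams()
        # Never read past the end of the source recording.
        n_frames = min(src.getnframes(), int(seconds * src.getframerate()))
        frames = src.readframes(n_frames)
    with wave.open(dst_path, "wb") as dst:
        # Reuse channel count, sample width and rate; the frame count
        # in the header is patched automatically on close.
        dst.setparams(params)
        dst.writeframes(frames)
    return n_frames / params.framerate
```

Twenty seconds of clean speech is comfortably within the sample length the article describes the fraudster needing, which is the point: the technical step is trivial, and the hard part of the crime is the social engineering around it.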

 

As the fraudster had expected, the victim hadn’t yet reached the office. The conversation went smoothly, and nobody suspected that the voice on the phone was AI-generated audio.

 

Once the password was given away, it took only seconds for the criminal to transfer the company’s entire bank balance to a benami account. From there, he will convert it into cryptocurrency and hide it on the dark web until he needs it.

Conclusion

Generative AI enables effective cybercrime because deepfake audios and videos ride on the goodwill of real individuals in the real world. They tap into the social engineering that prompts a person to lower his guard when speaking with friends, family or office colleagues. The risk is heightened because even cyber experts, in the course of a busy day, don’t naturally consider the possibility of fraud. By the time they realize their automatic response was wrong, the damage has been done. Deepfakes are dangerous precisely because they sound so authentic that you don’t think twice.
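The practical defence is procedural rather than technical: never act on a voice request alone, however authentic it sounds. One way to make that rule concrete is a simple gate on sensitive requests. The sketch below is illustrative only (the contact directory and phone numbers are hypothetical): a payment or password request is approved only if the caller ID matches a pre-registered number and the request has been re-confirmed by independently calling that number back.

```python
# Hypothetical directory of pre-registered numbers for trusted contacts.
KNOWN_CONTACTS = {"boss": "+91-9000000001"}

def should_act(claimed_identity: str, caller_id: str,
               confirmed_by_callback: bool) -> bool:
    """Approve a sensitive request (payment, password) only if:
    1. the call came from the contact's pre-registered number, and
    2. the request was re-confirmed by calling that number back.
    """
    registered = KNOWN_CONTACTS.get(claimed_identity)
    if registered is None or caller_id != registered:
        # Unknown contact, or an unexpected "alternate" number: refuse.
        return False
    return confirmed_by_callback
```

Under such a rule, the call in the story above fails at the first check: the “personal number” is not in the directory, so no password is shared no matter how convincing the voice is.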


WEBSITE: www.pi-labs.ai

E-MAIL: hello@pi-labs.ai
Together, let’s build a digital future where we can differentiate fake from real.


Schedule a Demo