Soundalike scams: Experts voice concern over audio deepfakes

Sachin Gaad | Updated: Monday, August 03, 2020, 12:16 AM IST

File Photo

Cyber crimes have been on the rise amid the lockdown, and scamsters seem to change their game every day. Now, experts are warning about a new trend, audio deepfakes, which can be used to impersonate people's voices and dupe victims.

Just as video deepfakes manipulate footage to tarnish a person's image, audio deepfakes manipulate a person's voice to create entirely new speech that can then be used to cheat victims.

Like video deepfakes, audio deepfakes use Artificial Intelligence (AI) to clone a person's voice. These cloned voice samples can then be used to convince people to transfer money where they shouldn't, according to cyber expert Ritesh Bhatia.

Such AI-powered software is widely available for a song and can create a near-identical copy of any person's voice. While the samples it generates are still not perfect and sound slightly robotic, a junior employee could easily be taken in by a direct call from his 'boss', Bhatia added.

The software is so powerful that it needs only a 20-30 second voice sample of the target; it does the rest of the hard work. All the fraudster has to do is type in text, and the software generates that text in the target's voice. The more data the fraudster feeds into the software, the better the quality of the deepfake audio, said a cyber expert.
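To illustrate the mechanics described above, here is a minimal sketch of how such a voice-cloning pipeline is typically driven. The article does not name any specific tool; the open-source Coqui TTS library is used here purely as a stand-in, and the file names and spoken text are hypothetical:

# Illustrative sketch of text-to-speech voice cloning in Python,
# using the open-source Coqui TTS library (pip install TTS) as a
# stand-in; the article does not identify the software fraudsters use.
from TTS.api import TTS

# Load a multilingual voice-cloning model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short reference clip (roughly 20-30 seconds, per the expert quoted
# above) is enough for the model to mimic the speaker. The attacker then
# supplies whatever text they want spoken in the cloned voice.
tts.tts_to_file(
    text="Please process the transfer today; I will explain later.",
    speaker_wav="target_voice_sample.wav",  # hypothetical reference clip
    language="en",
    file_path="cloned_audio.wav",
)

Longer and cleaner reference audio generally yields a more convincing clone, consistent with the expert's point that the more data the fraudster supplies, the better the fake.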

In the case of executives of large firms, voice samples can be harvested easily from earnings calls, speeches and interviews, the expert said.

"When you receive a call from a person asking for money, even if the voice sounds familiar and there is an emergency, I would advise you to call the actual person for assurance and only then transfer the money," Bhatia recommended.

Last year, a senior official of a UK energy firm transferred $240,000 to a Hungarian company after receiving a call, purportedly from the company's CEO, ordering him to make the transfer.

Amid the pandemic, with the entire country under lockdown to prevent the coronavirus from spreading, people have begun working and shopping online. Fraudsters have turned this into an opportunity to make money, duping people with various new tricks and using audio deepfakes to extort funds.

A cyber expert explained that deepfakes first came into existence in 2017, when an anonymous user going by "Deepfakes" posted several explicit videos of celebrities on the internet. This controversial and increasingly dangerous technique of audio and video manipulation superimposes existing images or videos of a person onto another source image or video. What began as a tool to make fake celebrity pornographic videos is now being deployed to spread hoaxes and generate fake news.
