McAfee Report Warns of Rising AI Voice Scams in India

A new report by McAfee highlights the growing threat of fraudulent activities carried out through the use of AI-generated voice technology in India.

By
Kashish Haswani

McAfee Corporation, a global leader in online protection, published a report on May 2, 2023, highlighting the rise of artificial intelligence (AI) technology fueling online voice scams. 

According to the report, titled "The Artificial Imposter," just three seconds of audio is enough for cybercriminals to clone a person's voice, and with the rising popularity and adoption of AI tools, it is easier than ever to manipulate voices. McAfee surveyed 7,054 people across seven countries, including India, and found that more than two-thirds (69%) of Indian adults think they could not tell the difference between an AI-cloned voice and a real one.

The report further revealed that 47% of Indian adults have experienced or know someone who has experienced some kind of AI voice scam, which is almost double the global average of 25%.

The study found that 83% of Indian victims suffered a loss of money, with 48% losing over Rs 50,000.

Scammers use AI technology to clone voices and then send fake voicemails or voice notes, or call the victim's contacts, pretending to be in distress. Because many Indians are not confident they can distinguish a cloned voice from the real one, the technique is gaining momentum.

The survey found that two-thirds (66%) of Indian respondents would reply to a voicemail or voice note from a friend or loved one in need of money. Messages claiming that the sender had been robbed, been in a car accident, lost their phone or wallet, or needed help while travelling abroad were the most likely to elicit a response.

The report warns that AI voice-cloning requires limited expertise and only seconds of audio, with more than a dozen freely available cloning tools accessible on the internet. As the accuracy of clones increases, cybercriminals can exploit the emotional vulnerabilities inherent in close relationships to dupe victims into handing over their money.

McAfee CTO Steve Grobman said, "Advanced artificial intelligence tools are changing the game for cybercriminals. It's important to remain vigilant and to take proactive steps to keep you and your loved ones safe."

He recommends setting a verbal 'codeword' with kids, family members, or trusted close friends that only they would know, and always asking for it whenever they call, text, or email to ask for help. Identity and privacy protection services can also limit the digital footprint of personal information that criminals could use to build a convincing narrative around a voice clone.
