Indore Cyber Crime: 43-Year-Old School Teacher Falls Victim To AI Voice Cloning, Loses 97k

In Indore’s Lasudia area, a 43-year-old teacher was tricked by fraudsters using AI to mimic her brother’s voice. Claiming a friend’s urgent heart surgery, they convinced her to make four Google Pay transactions totaling ₹97,500. A fake “credit” SMS added to the ruse. Police warned people to verify urgent payment requests, even from familiar voices.

Staff Reporter | Updated: Thursday, January 08, 2026, 11:38 PM IST

Indore (Madhya Pradesh): A 43-year-old school teacher in the Lasudia area was duped of Rs 97,500 after fraudsters used AI voice cloning to mimic her brother's voice.

The incident occurred on Tuesday night when the woman received a call from an unknown number. When she answered, the voice on the other end was an exact match for that of her brother, whom she affectionately calls ‘Munnu Bhaiya.’ The caller claimed he was at a hospital for a friend’s urgent heart surgery. He told her he was facing technical issues with his bank account and asked her to pay the doctor via a QR code, promising that he had already transferred the amount to her account.

To make the deception more convincing, the fraudsters sent a fake ‘credit’ SMS to the teacher’s phone. Caught in a state of panic and urgency over the supposed medical emergency, she did not verify her bank balance and proceeded to make four separate Google Pay transactions totalling Rs 97,500.

The scam came to light when she checked her account balance. When she attempted to call the number back and found it switched off, she contacted her real brother, only to discover that he had never called her. Realising she had been duped, she approached the police.

How AI voice cloning fraud worked

Fraudsters first obtain a short audio sample of a person's voice, typically from social media, and clone it using AI tools. They then use the cloned voice to call the person's relatives, impersonating them and faking emergencies to pressure victims into sending immediate payments. Police urged the public to be cautious about any urgent financial request, even one made in a familiar voice, and to always cross-verify before transferring money.