December 21, 2025
‘Help! I need money. It’s an emergency’: the voicemail from your child that could be a scam
Steps to help combat fraud in which criminals use an AI-generated replica of a person’s voice to deceive victims

TL;DR
- Criminals use AI to clone voices from as little as three seconds of audio.
- Scammers create fake voicemails or calls, often simulating a loved one in distress after an accident or emergency.
- The goal is to exploit emotions and urgency to prompt immediate money transfers.
- Audio snippets for cloning can be obtained from social media videos or brief phone interactions.
- Anyone receiving such a call is urged to remain calm, think critically, and verify its authenticity.
- Even caller ID can be faked; it's best to call the supposed caller back directly on a number you already have.
- Establishing a secret codeword with family can help verify genuine emergencies.