
What Is AI Voice Cloning?
AI voice cloning uses short audio clips — sometimes just a few seconds pulled from social media — to recreate a person’s voice.
Scammers use this technology to:
Pretend to be a grandchild in trouble
Impersonate a bank representative
Pose as a government agency
Claim to be from Medicare or Social Security
Demand urgent money transfers
The Federal Trade Commission has issued warnings about voice cloning being used in “family emergency” scams, where seniors are pressured to send money immediately.

What Is a Deepfake?
A deepfake is a video or audio recording that uses AI to realistically imitate someone’s face or voice.
It can make it appear that:
A family member is asking for help
A financial advisor is recommending an investment
A public official is endorsing a program
A company executive is requesting payment
The Federal Bureau of Investigation has reported increasing cases of AI-generated impersonation scams targeting older adults.
Why Seniors Are Being Targeted
Scammers target older adults because:
Many have savings or retirement accounts
They are more likely to answer phone calls
They respond quickly when family seems to be in danger
They are polite and trusting
But this isn’t about being naïve. It’s about criminals exploiting emotion and technology.
The #1 Red Flag: Urgency and Secrecy
Almost every AI scam includes two elements:
“Don’t tell anyone.”
“You must act immediately.”
If someone pressures you to:
Wire money
Send gift cards
Transfer cryptocurrency
Share account information
Provide verification codes
Pause immediately.
The AARP Fraud Watch Network emphasizes that real institutions do not demand secrecy or immediate payment through gift cards or wire transfers.

How to Spot AI Voice Scams Quickly
Here are fast checks you can use:
1. Ask a Personal Question Only They Would Know
If someone claims to be your grandchild, ask:
“What was the name of our last vacation spot?”
“What’s the nickname I call you?”
AI can mimic a voice — but it cannot answer spontaneous, personal questions accurately.
2. Hang Up and Call Back
Do not rely on caller ID.
Hang up. Call the family member directly using a number you already have saved.
If it’s real, they’ll answer. If it’s a scam, the panic disappears.
3. Listen for Slight Audio Glitches
AI voices sometimes:
Sound slightly robotic
Pause unnaturally
Lack background noise
Overemphasize urgency
Trust your instincts.
Deepfake Video Warning Signs
If you see a video asking for money:
Check for odd blinking patterns
Look for unnatural facial movements
Notice mismatched lighting or shadows
Be cautious of urgent “investment opportunities”
The Federal Trade Commission warns that scammers often combine deepfake video with fake websites to appear legitimate.

Protect Yourself Before a Scam Happens
Here are smart preventative steps:
Create a family “safe word”
Set social media accounts to private
Avoid posting videos publicly
Limit public sharing of birthdates and addresses
Register your phone on the National Do Not Call Registry
Use call-blocking services
Freeze your credit if you’re not applying for loans
The Federal Communications Commission provides tools and guidance on blocking robocalls and spoofed numbers.
If You Think You’ve Been Targeted
Act quickly:
Stop all communication
Contact your bank immediately
Report the scam to the Federal Trade Commission at ReportFraud.ftc.gov (use IdentityTheft.gov if your personal information was stolen)
Notify family members
The faster you act, the more likely funds can be stopped or recovered.
One Important Reminder
AI is powerful, but every one of these scams still depends on one thing:
Your emotional reaction.
Slow down. Verify. Call back. Ask questions.
No legitimate organization will ever be offended that you double-check.
Technology will keep evolving, but so will your awareness.
You don’t need to understand artificial intelligence in detail.
You just need to recognize the patterns:
Urgency. Secrecy. Pressure.
Once you see those clearly, the scam loses its power.
And that’s the kind of protection that works at any age.
With care,
Mike Bridges
Founder, The O55 Report