Part 1 of 3: AI Scams and What You Can Do to Protect Yourself
- Cyndi Rose

I want to tell you about a woman in Florida named Sharon Brightwell.
In July 2025, Sharon received a call from her daughter. Her daughter was crying, distraught and panicked, saying she had been in a car accident, lost her unborn child, and was facing criminal charges. She needed money immediately for a lawyer. She begged Sharon not to tell anyone.
Sharon did what any loving mother would do. She gathered $15,000 in cash and handed it to a courier who came to her home.
Only after the courier left did Sharon reach her real daughter, who had no idea any of this had happened. She was perfectly safe. She had never made that call.
The voice Sharon heard was generated by artificial intelligence. It was a clone of her daughter's voice, built from audio pulled off social media, capable of crying, pleading, and sounding completely real. Sharon had no way to know the difference.
AI can now clone a person's voice from as little as 3 seconds of audio. Scammers use TikTok videos, voicemail greetings, Facebook reels — any public recording of your family member's voice is potential source material.
This Is Not the Old Grandparent Scam
You may have heard of the grandparent scam — a caller pretends to be your grandchild in trouble, uses a generic script, and hopes you fill in the details yourself. That version still exists. But what is happening now is something fundamentally different, and you must understand why.
The new version uses AI to actually clone the voice. Not imitate it. Not approximate it. Clone it, meaning the voice on the phone sounds exactly like the specific person you love, with their accent, their speech patterns, their way of saying certain words. A McAfee survey of 7,000 people found that 70 percent were not confident they could tell a cloned voice from the real thing.
Alice Boren of Alabama answered the phone and immediately recognized her great-grandson Cameron's voice. He said he was in a car wreck, had a broken nose, was bleeding, and was being taken to jail. He said he was in pain. He begged her not to tell anyone. He promised to pay every dollar back. Then the call dropped.
A man claiming to be Cameron's attorney called next, asking for $11,000 in bail. When Alice's husband Frank said he didn't have that much, the scammer asked, 'Well, how much do you have?'
It was all a scam. Every voice. Every detail. Engineered by criminals using AI tools that anyone can access online for free.
Where They Get Your Family's Voice
This is the part that surprises most people. Scammers do not need a long recording. Research shows that 30 seconds of audio is enough to produce a convincing clone. Here is where they find it:
- Facebook videos and reels: birthday tributes, graduation posts, holiday greetings
- TikTok and Instagram: any video where someone speaks, even briefly
- YouTube: family vlogs, milestone announcements, memorial videos
- Voicemail greetings: scammers sometimes call just to capture a 'Hi, you've reached...' message
- Podcast appearances or interviews posted publicly online
If your grandchildren post videos of themselves talking online, their voice is already out there. And increasingly, so is yours.
Why It Works So Well
When you hear a voice you love in obvious distress, something happens in your brain before logic has a chance to catch up. Psychologists call it an override response — the familiar voice triggers an emotional reaction so strong that critical thinking is bypassed. Scammers know this. They engineer their scripts to flood you with emotion and urgency before you can slow down and think.
Notice the pattern in every real story: there is always a crisis, always secrecy ('don't tell anyone'), and always a payment method that cannot be reversed — cash, wire transfer, gift cards. Those three elements together are the fingerprint of this scam. We will come back to those fingerprints in Part Two.
It Is Not Just Phone Calls
The same AI technology that clones voices is also being used to:
- Generate fake video calls where scammers appear on screen as a family member or authority figure: fully animated, real-time, interactive
- Create deepfake videos of celebrities like Elon Musk promoting fake investment opportunities: real face, real voice, entirely fabricated content
- Produce phishing emails with no grammar errors and no suspicious phrasing, personalized to you because AI researched your name, your bank, and your recent activity first
A Harvard study found that AI-generated phishing emails fool over 50 percent of targets. Reporters at Reuters, working with a Harvard researcher, tested AI chatbots by asking them to write targeted phishing emails aimed at seniors. Google's Gemini helpfully added that the best time to target older adults is Monday through Friday, between 9 AM and 3 PM.
In one now-famous case, a finance employee at a global engineering firm called Arup joined a video call with what appeared to be the company's CFO and several senior executives. They instructed him to transfer $25 million to a supplier. Every face and every voice on that call was an AI deepfake. He did not find out until after the money was gone.
According to the FBI's Internet Crime Complaint Center, Americans aged 60 and older lost nearly $4.9 billion to online fraud in 2024 — a 43 percent increase from the year before. AI is a primary driver of that surge.
What This Means for You
Understanding how this technology works is the essential first step. But knowing what to actually do in the moment, when your heart is racing and the voice sounds exactly like your grandchild, is what Part Two covers.
I will give you specific, practical techniques you can put in place right now, before a crisis, that will protect you when the call comes. And the call may well come. One in four Americans has already experienced an AI voice cloning scam or knows someone who has.
Being fooled by this does not mean you were careless. It means you are human. But you can prepare — and preparation makes all the difference.