
Beware the New Deepfake Scams: How to Protect Your Family in 2026

Scammers have taken their tricks to a new level in 2026. Instead of just calling, they now use live deepfake video calls on platforms like Zoom, FaceTime, and WhatsApp. These calls show a “family member” on screen who looks real, moves naturally, and even reacts to your questions. The goal is to trick you into believing a fake emergency, like an accident or arrest, to steal money or information.


This new scam is especially dangerous for seniors. The video feels so real that it’s hard to doubt what you see. But there are ways to protect yourself and your loved ones. This post explains how these scams work and what you can do to stay safe.



How Live Deepfake Video Calls Work


Deepfake technology uses artificial intelligence to create realistic videos of people who aren’t really there. The new twist is that scammers use AI that works in real time. This means the fake person on the screen can:


  • Move their head and eyes naturally

  • React to questions or comments

  • Show emotions like fear or sadness


For example, a scammer might pretend to be your son or daughter. They might say they are in trouble and need money urgently. The AI can even adapt if you ask questions, making it harder to spot the scam.


One viral case in April 2026 showed a scammer being exposed only when the victim asked them to hold up three fingers. The AI glitched and failed to render the gesture correctly. But most people don’t think to test the video this way.


Why Seniors Are the Most Vulnerable


Seniors often rely on video calls to stay connected with family. When they see a familiar face on screen, they trust it immediately, and the video looks so realistic that doubting it feels impossible.


Scammers target seniors because:


  • They may not be familiar with deepfake technology

  • They want to help family in emergencies without hesitation

  • They might not know how to verify the call


This makes it critical to educate older family members about these scams and how to respond.


Simple Ways to Protect Your Family


You can take practical steps to avoid falling for deepfake video scams. Share these tips with your loved ones:


  • Be cautious with unexpected video calls from family members. If you weren’t expecting a call, don’t answer it right away; let it go unanswered and follow up yourself.

  • Use the three-finger test. Ask the caller to hold up three fingers. Deepfake AI often struggles with specific, unusual gestures.

  • Ask questions only your real family would know. This could be a shared memory or a unique detail.

  • Hang up and call back on a saved number. If the call seems suspicious, end it and call your family member using a number you already have saved.

  • Tell your parents it’s okay to verify first. Let them know you understand if they want to check before trusting a call.


These simple actions can stop scammers before they get a chance.



What to Do If You Suspect a Deepfake Call


If you or someone you know receives a suspicious video call, take these steps:


  1. Stay calm and don’t rush to act. Scammers want you to panic.

  2. Politely ask the caller to do something unusual, like the three-finger test.

  3. End the call if anything feels off.

  4. Contact the family member directly using a trusted phone number.

  5. Report the scam to local authorities or consumer protection agencies.


Sharing your experience can help others avoid the same trap.


Staying One Step Ahead


Deepfake technology will keep improving, but your calm pause and careful checks are stronger than any AI trick. Educate your family, especially seniors, about these scams. Encourage open conversations about unexpected calls and video chats.


Have you or someone you know received a suspicious video call recently? Share your story without personal details. Together, we can learn and protect each other from these new scams.


Remember, the best defense is awareness and verification. Don’t let scammers use technology to break your trust.



 
 
 




Connect with Us Today

CyberGuide

Home of Cyber in 60. Visit us on YouTube for insights and tips!

  • X
  • YouTube
  • Facebook

⚠️ Disclaimer: AI-generated reports are for educational purposes only and do not constitute legal, financial, or professional advice. Always verify suspicious messages directly with the organization involved.

Sentinel Risk Advisory, LLC

seniorcyberguide@gmail.com

(321) 233-3488

© 2026 by CyberGuide

 
