AI Scams: When Seeing and Hearing Isn’t Believing
Artificial intelligence (AI) now helps scammers sound and look exactly like people you know, celebrities you follow, and companies you trust. As a result, even a very realistic voice call or video is no longer automatic proof that the person on the other end is real.
How scammers are using AI today
AI tools can copy someone’s voice from just a few seconds of audio—like a TikTok clip, a Facebook video, or even a voicemail—and then use that cloned voice to call you. Other tools can create “deepfake” videos that make it appear as if a real person is speaking live on camera, even if they never said those words or joined that call.
AI also helps criminals write polished emails and chat messages with no obvious grammar mistakes, making fake bank, telco, or delivery messages look professional and convincing. Scammers even use AI to create attractive, “too good to be true” dating profiles and chatbot “investment coaches” that can message victims 24/7 without tiring.
Real‑world stories
One common scam is the “family emergency” call, where someone phones you using a cloned voice that sounds exactly like your child or grandchild, saying they are in an accident or in jail and need money right now. In one case, a parent in the US lost about 25,000 US dollars (roughly PHP 1.5 million) after hearing what he was sure was his son’s voice begging for help.
Romance and “online friend” scams have also evolved, with scammers using AI‑edited photos and deepfake videos to pretend to be loving partners or successful investors. After months of daily chats, they slowly push their victims into fake “low‑risk” crypto or trading platforms, leading to life savings being wiped out.
At work, there are now cases where staff receive a video call that appears to show their CEO or CFO giving urgent instructions to send money or share confidential information. In one widely reported incident, a finance worker was tricked into transferring roughly 25 million US dollars because every face and voice on the video call looked and sounded like real colleagues—but they were deepfakes.
Why these new scams work so well
These scams prey on emotions like fear, love, worry, and excitement, and they always add a strong sense of urgency so you feel you must act immediately. When you hear what sounds exactly like your child crying on the phone, or see what looks like your boss on a video call, your brain’s “trust” instinct often kicks in before your “think carefully” instinct.
AI makes it cheap and fast for scammers to run thousands of fake conversations at the same time, each one personalized to your social media posts, interests, and background. The result is that even smart, careful people—including professionals and retirees—are falling victim, because the scams no longer look like the obvious, badly written messages of the past.
Simple habits to protect yourself and your family
You do not need to be a tech expert to fight AI‑powered scams; a few simple habits can make a big difference. Share these tips at home, at work, and with parents, grandparents, and teens.
Pause when there is pressure. Any message or call that demands money, one‑time passwords (OTPs), or gift cards “right now” is a red flag—even if the caller sounds like family or a boss.
Verify using a second channel. If someone you know asks for money or sensitive information, hang up and call them back on a number you already trust, or confirm via another app or in person.
Agree on a family “safe word.” Choose a private word or phrase that must be used in any real emergency call; a scammer armed only with a cloned voice will not know it.
Never send money to someone you have never met in person. Romance or “investment coach” contacts who push you to move chats to private apps and then ask for money or crypto are almost always scams.
Do not trust links in unexpected messages. Instead of clicking links in texts, emails, or chats, type the website address yourself or open your bank’s or telco’s official app.
Limit what you share publicly. Think twice before posting clear videos of your voice, your children, your home, or travel plans, as these can feed voice‑cloning and social‑engineering attempts.
Turn on extra security. Enable multi‑factor authentication (MFA) on important accounts like email, banking, and social media, so even if a password is stolen, it is harder to break in.
Remember: trust, but always verify
The new rule online is simple: seeing and hearing is no longer enough; you must verify. If a message, call, or video makes you feel strong emotion plus urgency—whether fear, love, or excitement about money—slow down, double‑check through a trusted channel, and talk to someone you trust before you send money or share sensitive information.
