
Rahul Telang,
Carnegie Mellon University
Scams are nothing new – fraud has existed as long as human greed. What changes are the tools.
Scammers thrive on exploiting vulnerable, uninformed users, and they adapt to whatever technologies or trends dominate the moment. In 2025, that means AI, cryptocurrencies and stolen personal data are their weapons of choice. Duty, fear and hope still provide the entry points: following instructions from bosses or co-workers, worrying about a loved one, or chasing a seemingly great investment or job.
Artificial intelligence is cheap, accessible and disturbingly effective. Businesses use AI for advertising and customer service, but scammers exploit it to mimic reality with alarming precision. Criminals use AI-generated audio and video to impersonate CEOs, managers or family members in distress. Employees have been tricked into transferring money or leaking sensitive data. More than 105,000 such deepfake attacks were recorded in the U.S. in 2024, and losses exceeded US$200 million in the first quarter of 2025 alone. Victims often cannot distinguish synthetic voices or faces from real ones.
Scammers also manipulate emotions, phoning or texting victims while posing as relatives in urgent trouble. Elderly people are especially vulnerable when they believe a grandchild is in danger. The Federal Trade Commission has warned that scammers regularly create fake emergencies to exploit family bonds.
Crypto remains the Wild West of finance, fast and largely unregulated. Pump-and-dump schemes lure investors with hype before collapsing. “Pig butchering” blends romance scams with crypto fraud, as scammers build trust for weeks before persuading victims to invest in fake platforms, then disappear with the money. Fraudsters also use crypto as payment in impersonation scams, often directing victims to bitcoin ATMs to deposit cash that becomes untraceable cryptocurrency.
Old scams haven’t vanished – they’ve evolved. Phishing and smishing trick victims into clicking links in emails or texts, leading to malware, credential theft or ransomware. AI has made these lures even more convincing, mimicking corporate tone and design. Tech support scams begin with alarming pop-ups or cold calls urging victims to call a number. Once on the phone, victims are persuaded to allow remote access to their computers, giving scammers the chance to install malware, steal data or demand payment.
Fake websites and listings also proliferate. Fraudulent sites impersonate universities or ticket sellers, tricking people into paying for fake admissions, concerts or goods. One case involved a fake “Southeastern Michigan University” copying the website of Eastern Michigan University to defraud victims.
The rise of remote and gig work has created new avenues. Scammers post fake jobs promising high pay and flexible hours but instead extract “placement fees” or harvest personal data such as Social Security numbers and bank details for identity theft.
Technology may have changed, but the principles of protection remain familiar. Never click suspicious links or download attachments from unknown senders. Enter personal information only on legitimate websites. Enable two-factor authentication to protect against stolen passwords, and keep software updated to patch security holes. A legitimate business will never ask for personal data or money transfers – such requests are red flags.
Technology has supercharged age-old fraud. AI makes deception nearly indistinguishable from reality, crypto enables anonymous theft, and remote work widens opportunities to trick people. The constant is that scammers prey on trust, urgency and ignorance. Awareness and skepticism remain the best defense.
[Abridged]