
AI-Powered Voice Scams Surge 19% as Cloning Tools Require Just Seconds of Audio

Imposter scams powered by artificial intelligence are growing more convincing and more common: complaints rose roughly 19% to about 1 million in 2025, and total losses climbed past $3.5 billion, according to the Federal Trade Commission.


The mechanics behind the calls have changed significantly. Voice-cloning tools can now generate a synthetic replica of someone's voice from audio samples as short as three seconds, said Michael Bruemmer, vice president of global data breach and consumer protection at Experian. Those synthetic voices are then paired with spoofed caller ID and personal details — names, workplaces, family relationships — to create calls that feel immediate and specific.


Kris Sampson, a resident of Missoula, Montana, experienced the tactic firsthand. She was working from home when her phone displayed what appeared to be a call from her adult daughter, complete with the daughter's name, photo, and familiar ringtone.


"It was her voice, I know her scared cry," Sampson told CNBC Make It. "I thought maybe she'd been in a car wreck."


A man then came on the line, speaking calmly at first, using Sampson's first name and asking whether she was her daughter's mother. His tone shifted to threats and demands for money through PayPal, warning her not to contact police. Sampson's sister called 911 during the ordeal while Sampson used pauses between calls to contact family members and her daughter's workplace in Helena, Montana, roughly two hours away.


About 15 to 20 minutes after the first call, her daughter was found safe at her workplace; she had briefly stepped away from her desk. The calls stopped. The caller was never identified.


"It was the most afraid I've ever experienced in my life," Sampson said.


Officer Whitney Bennett, a spokesperson for the Missoula Police Department, said the department has received reports of similar scams. "What has evolved in recent years is the level of sophistication," Bennett said, noting that detectives told Sampson there was little police could do because the calls were difficult to trace.


The scale of the fraud ecosystem has also changed. Ian Bednowitz, general manager of identity and privacy at LifeLock, described fraud as becoming "industrialized," with organized networks running coordinated operations across borders — many based in Asia and Africa — and operating like businesses, with workers handling calls, scripts, and outreach at scale. More than 75% of cybercrime now stems from scams and social engineering tactics, according to Bednowitz's testimony before a House Financial Services subcommittee in September 2025.


Losses tied specifically to social media scams have increased eightfold since 2020, reaching approximately $2.1 billion in 2025, the FTC reported.


Research published in 2025 by Rutgers University researcher Sanket Badhe demonstrated an AI system capable of conducting scam phone calls end to end, autonomously, with no human in the interaction loop. Badhe noted that cost, performance, and latency still constrain wide deployment of large language model technology in scams, but added that "as the performance of smaller, faster models continues to improve, this will become an imminent threat."


Scammers typically require only limited information to make a call feel convincing. Short clips pulled from social media, voicemails, or other recordings are sufficient to generate a synthetic voice, Bednowitz said.


Security experts recommend several countermeasures. Bruemmer advises what he calls "JDA — just don't answer the phone" for unknown or unexpected calls. If a caller claims to be a distressed family member, he suggests hanging up and reaching that person through a separate number, a workplace, or a trusted contact. Families can also establish a code word to quickly verify whether an emergency is real.


Bruemmer added that limiting social media exposure — particularly long recordings where a voice can be sampled — reduces the raw material available to scammers.


For Sampson, the episode produced lasting behavioral changes. She became more cautious at home, updated her phone settings, and remains wary of calls even from familiar numbers. "I don't ever want to hear that ringtone again," she said.

