Mary Schat, a mother from Grand Rapids, Michigan, narrowly avoided losing $50,000 after being targeted by a new AI-based scam. On a typical Sunday morning, Schat received an unexpected call from an unknown number in Holland, Michigan. Concerned for her daughter, a student at Hope College in Holland, she answered.
To her shock, Schat heard what she believed to be her daughter’s terrified, muffled voice on the other end. “I heard, ‘They’re taking me. They’re taking me,’” she recalled, deeply shaken. A man then took over the call, claiming to be with a Mexican cartel. He told her they had taken her daughter following a car crash and demanded a $50,000 ransom, setting a meeting at a nearby hardware store for the exchange.
“When it’s your own daughter, you’re ready to do whatever it takes,” Schat said, describing her panic as she prepared to find cash and keys. Meanwhile, her husband contacted local authorities, who quickly suspected a scam and advised them to call their daughter directly. They soon confirmed she was safe at her apartment.
Reflecting on the experience, Schat was alarmed by how convincing the call had been. “It was definitely her voice. A mother knows her daughter’s voice and her daughter’s crying,” she said, astonished by how her daughter’s voice could be cloned so easily.
According to the Better Business Bureau (BBB), this scam tactic—known as voice cloning—uses AI to replicate a person’s voice from small audio samples, often sourced from social media posts. The BBB advises people to resist the urge to act immediately in such situations and, if in doubt, to contact the supposed victim directly rather than send money.