Cloning a human voice has never been easier, thanks to advancements in AI technology. New tools can capture a voice sample, process it, and replicate it with uncanny accuracy. Although voice cloning has been around since 2018, today’s AI tools can perform the task faster, more precisely, and with minimal effort.
If you’re skeptical, consider this: OpenAI, the company behind ChatGPT, previewed a tool this year, called Voice Engine, that can clone a voice from just a 15-second recording.
While OpenAI’s tool isn’t publicly available and reportedly includes safeguards against misuse, other services, such as ElevenLabs, offer similar capabilities to the general public: for as little as $6, anyone can clone a voice from a one-minute audio sample.
The potential for abuse is alarming. Scammers can harvest voice samples from phone calls or from videos posted to social media, then use the cloned voices in fraudulent schemes against unsuspecting individuals.
One particularly troubling example is the “grandparent scam,” where a fraudster clones the voice of a grandchild and calls an elderly person, urgently requesting money under the guise of an emergency.
Common scenarios include the “grandchild” claiming to have been in an accident or arrested, and pleading with the grandparent to keep the call a secret, especially from the parents, so the fraud isn’t exposed. The emotional manipulation is powerful, and many fall victim to it every day.
Even if you’re aware of voice cloning technology, you might still be deceived, especially in a moment of panic. The scam’s effectiveness lies in its ability to prey on emotions, making it difficult to question the legitimacy of the call, even if something feels off.
As AI-driven voice cloning technology becomes more prevalent, this type of scam is likely to become even more widespread.
To protect yourself, have a family discussion and establish a unique code word to be used in emergencies. If you receive a distress call from a family member asking for money, ask them to confirm the code word. If they can’t, it’s likely a scammer trying to deceive you.