Generative AI is fueling sophisticated impersonation scams

As developers rush to release AI products, not all of them are implementing checks to prevent misuse of their services. Scammers have made news with their use of voice cloning software in particular.

With generative AI, criminals can easily obtain scripts for a variety of scenarios and generate a convincing human voice. The software can help fraudsters choose words common in a target’s region, mimic an accent specific to a location, or clone a loved one’s voice. Together, these tools make impersonation schemes, such as those imitating government officials or family members, far more believable.

Thanks to AI, impersonation scams take a variety of forms and are becoming even harder to detect. A fraudster might pretend to be a loved one in distress, requesting money for bail or a hospital bill. They sometimes mimic government officials, urging targets to wire money out of a bank account and into a “safer” location, such as the criminal’s own wallet. Scammers can even impersonate your boss, asking you to share personal information or purchase gift cards on the employer’s behalf.

The Federal Trade Commission consistently finds impersonation scams to be the most reported type of fraud to the agency. In 2024, the FTC received nearly 850,000 complaints about imposter scams. In 2023, nearly half of the fraud reports to the agency were about impersonation scams. The agency has also selected four winners for its “Voice Cloning Challenge,” which invited participants to submit tools for detecting inauthentic audio.

To help protect yourself from scammers using generative AI, keep the following tips in mind:

  • Double-check before acting. If someone calls or messages you with a request, verify that it is really them through a separate channel. For example, if you receive a phone call, send the person a text message to confirm.
  • Get a second opinion. Ask a trusted individual if the situation seems authentic.
  • Minimize the amount of publicly available content featuring you, including recordings of your voice. While not always possible, making it harder for bad actors to obtain a voice sample for cloning can limit their options. For example, you might avoid using a custom voicemail greeting.

Reporting Scams

If you suspect you’ve encountered a scam:

  • File a complaint at Fraud.org: We will share your complaint with our network of consumer protection agencies and law enforcement partners.
  • Report it to the FBI: Use the IC3.gov website to report directly to the agency.
  • Inform the FTC: File a complaint at ReportFraud.ftc.gov.

Special Invitation: Join Fraud.org for a Groundbreaking Conference on AI and Consumer Fraud

“Artificial Intelligence and Consumer Fraud: Risks, Responses, and Policy Solutions”
September 17, 2025
Washington, DC
FREE

Fraud.org, in partnership with the George Washington University Competition & Innovation Lab, invites you to a live event exploring the rising threat of AI-generated fraud. Special guest Kara Swisher, renowned tech journalist and commentator, will join us for a conversation on how we can stay ahead of the curve in a fast-evolving threat landscape.

This free conference will bring together policymakers, technologists, consumer advocates, and fraud prevention experts to examine:

  • How organized criminals are leveraging AI to supercharge scams
  • Tools and technologies that can detect and prevent AI-powered fraud
  • Public policy solutions to protect consumers in the age of artificial intelligence

Click here to learn more and register