Artificial intelligence (AI) brings convenience to our lives, but imagine your voice being stolen and used as a weapon by criminals. AI voice cloning scams use your voice, or that of someone you trust, to deceive you into transferring money or revealing sensitive information. Cases in Italy and the United States demonstrate that no one is immune. This article explores how these scams occur, legal responses, and practical ways to protect yourself.

Examples of AI voice cloning scams

The Italian case: impersonating the Defence Minister

In a high-profile incident in Italy, scammers cloned the voice of Defence Minister Guido Crosetto to defraud influential business leaders. Victims included fashion designer Giorgio Armani, Patrizio Bertelli of Prada, and Massimo Moratti, former Inter Milan football club owner. Using Crosetto’s cloned voice, scammers falsely claimed Italian journalists were kidnapped and urgently needed ransom payments.

Massimo Moratti transferred almost one million euros to a fraudulent Hong Kong account, believing the Bank of Italy would reimburse him. Other targets, including Giorgio Armani and members of the Aleotti pharmaceutical family, narrowly avoided similar losses thanks to vigilant staff.

Milan prosecutors have opened a fraud investigation into the scam and successfully traced and froze Moratti's stolen money in the Netherlands.

The Californian case: exploiting family trust

In California, an older man named Anthony received an alarming call. Using AI-generated audio, scammers perfectly replicated his son’s voice, claiming involvement in a serious car accident. Fraudsters, posing as lawyers, urgently demanded thousands of dollars for bail.

Initially sceptical, Anthony attempted to contact his son directly but was unsuccessful. Under pressure, he made multiple cash withdrawals, eventually losing significant savings. It was only after intervention from his daughter that Anthony realised he’d fallen victim to a sophisticated AI scam.

How AI voice cloning scams work

AI voice cloning uses sophisticated algorithms that analyse minimal audio samples, sometimes as short as three seconds, to create realistic voice copies. Fraudsters quickly obtain these samples from social media posts, voicemail greetings, or brief phone conversations.

This technology has become extremely convincing, making it almost impossible for victims to distinguish between authentic and cloned voices. Regulatory bodies have recognised the difficulty in identifying these fraudulent calls, leading to stricter rules governing AI-generated robocalls.

Legal and regulatory responses to AI voice cloning scams

Italy’s legal response

Milan prosecutors are investigating the scams under Article 640 of the Italian Penal Code, which criminalises fraud. Victims, including Massimo Moratti and Minister Crosetto himself, have filed formal complaints, and authorities have traced and frozen the stolen funds in a Dutch bank account.

Responses in the United States

In the US, regulators have tightened restrictions. In a February 2024 declaratory ruling, the Federal Communications Commission (FCC) confirmed that AI-generated voices in robocalls count as "artificial" voices under the Telephone Consumer Protection Act (TCPA), making such calls illegal without the recipient's prior express consent.

The Federal Trade Commission (FTC) has recorded a sharp rise in impersonation scams, with thousands of incidents reported. In 2024, the FTC also finalised its rule banning the impersonation of government agencies and businesses, and its consumer guidance stresses verifying a caller's identity independently and reporting suspected fraud immediately.

Additionally, broader frameworks such as the EU's General Data Protection Regulation (GDPR) treat voice recordings as personal data, and voiceprints used to identify someone as special-category biometric data, restricting their collection and processing without a lawful basis.

How to protect yourself and your organisation from AI voice cloning scams

Verify caller identities carefully

Always confirm unexpected requests by contacting the person directly via a trusted number. Never rely solely on caller ID, as scammers can fake this information.

Create a family or company safe word

Establish a secret safe word with family or colleagues, and use it to verify identities during unexpected or urgent calls. Share the word in person only; never post it online or store it digitally.

Limit voice exposure online

Avoid posting voice recordings publicly on social media or websites, and use default, automated voicemail greetings to reduce the availability of voice data.

Report suspicious activity immediately

If you suspect a scam:

  • Immediately contact your bank to halt transactions.
  • Report the incident to local police, state authorities, and regulatory bodies.

Actions you can take next

AI voice cloning scams represent a growing, sophisticated threat. Various cases demonstrate their potential to deceive even vigilant individuals. Regulatory responses from authorities in Italy, the US, and the EU highlight the seriousness of this issue. However, individuals and businesses must remain proactive, adopting preventive measures to protect themselves. You can:

  • Reduce your vulnerability by switching to an automated voicemail greeting immediately.
  • Increase your security by establishing a safe word within your family or business today.
  • Stay ahead of fraudsters by regularly reviewing fraud prevention guidance from relevant authorities, such as the ICO's advice on identity theft, and by subscribing to our newsletter.