Audio Deepfake Scams Cost Companies Millions

In 2023, imposter scams cost the U.S. $2.7 billion, with scammers posing as government officials, bank fraud agents, tech support, or even distraught friends or family members. AI is making these scams even more convincing, especially with the rise of audio deepfake scams, which rely on voice cloning.

It may seem harmless, but cybercriminals need only a few seconds of audio from a social media post or even a voice message to create a believable clone of someone's voice, one they can make say whatever suits their needs. And research from McAfee found that 53% of adults share their voice online at least once a week.

Scammers use audio deepfakes to target businesses as well, often inflicting significant financial losses. These scams take the more commonplace phishing scam to a new level. How do they work? Typically, an employee receives a phone call from someone purporting to be their boss, urging them to immediately transfer money to a fraudulent vendor, a client, or even the boss himself.

This type of scam has already cost businesses millions. In one of the most publicized cases, a finance worker in Hong Kong was tricked into paying out $25 million after a video call with a deepfaked version of his company's chief financial officer. And in March 2019, criminals used artificial intelligence-based software to impersonate the voice of a CEO at an energy firm and demand a fraudulent transfer of $243,000.

AI is also being used to prevent scams, with a growing number of companies employing AI to detect deepfakes and protect sensitive information. In fact, deepfake start-ups are on the rise, including companies that focus on AI detection technologies.
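These vendors rarely publish their internals, but a common family of approaches treats detection as a binary classification problem over acoustic features: summarize each clip's spectrum compactly, then train a model to separate real from synthetic speech. The Python sketch below illustrates only that general shape and is not any particular company's method. The randomly generated waveforms are placeholders for labeled audio you would collect yourself, and the hand-rolled band-energy features stand in for the richer representations (proper MFCCs, learned embeddings) a production detector would use.

```python
# Illustrative sketch of feature-based audio-deepfake detection.
# The "clips" here are synthetic placeholders, not real recordings.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

SR = 16_000  # sample rate in Hz

def spectral_features(waveform: np.ndarray, n_bands: int = 20) -> np.ndarray:
    """Crude spectral summary: log of the mean power in n_bands
    frequency bands. (A real detector would use proper MFCCs or
    learned embeddings instead.)"""
    power = np.abs(np.fft.rfft(waveform)) ** 2
    bands = np.array_split(power, n_bands)
    return np.log1p(np.array([band.mean() for band in bands]))

rng = np.random.default_rng(0)

def placeholder_clip(synthetic: bool) -> np.ndarray:
    """One-second dummy clip; 'synthetic' clips get extra high-band
    energy so this demo has a pattern to learn."""
    t = np.arange(SR) / SR
    clip = np.sin(2 * np.pi * 220 * t) + 0.5 * rng.standard_normal(SR)
    if synthetic:
        clip += 0.3 * np.sin(2 * np.pi * 3_000 * t)  # artificial artifact
    return clip

# Build a labeled dataset: 0 = real, 1 = synthetic.
X = np.array([spectral_features(placeholder_clip(s)) for s in [False, True] * 200])
y = np.array([0, 1] * 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

On real data the pipeline keeps the same shape; only the features and the model get stronger, while the train-then-score interface stays identical.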

As AI becomes increasingly adept at mimicking human voices, concerns are mounting about its potential for harm, for example by helping criminals access victims' bank and other private financial accounts or spread misinformation.

In March 2024, OpenAI, the maker of the generative AI chatbot ChatGPT, unveiled its voice-replication tool, Voice Engine, but didn’t make it available to the public at that stage, citing the “potential for synthetic voice misuse.”

As AI scams become more prevalent, it’s important to proceed with caution if you receive an “urgent” request to send money or share private financial information, even if it seems to come from someone familiar. Always verify by contacting the person or organization directly.

Tips to Avoid Scams:

Be cautious with requests for personal info.

Beware of urgency in communications.

Don’t trust offers that seem too good to be true.

Take a moment before responding to any unexpected demands.

Contact companies or loved ones directly using known phone numbers.

Starling Bank, an online-only lender based in the UK, is encouraging people to agree on a “safe phrase” with their loved ones: a simple, random phrase that is easy to remember and different from their other passwords, which can be used to verify identity over the phone. But be cautious about sharing the safe phrase digitally, such as via text message.

The FTC also recommends reporting any suspicious activity.
