AI tools can be used to clone the voices of people scammers find on social media. A three-second audio snippet is all that's needed to create a realistic clone of the victim's voice. The cloned voice is then run through an AI program, letting the scammer convincingly replicate emotions like fear or happiness over the phone.
Pretending to be their target, the scammer will call the target's family and demand money or gift cards. The scammer will typically try to convince the person on the other end that their loved one is in real danger, and that there will be dire consequences for hanging up or failing to pay.
The scammer's goal is to send the recipient of the call into fight-or-flight mode, giving them minimal time to think and respond logically. While no one wants to see their loved ones hurt or in danger, the best thing you can do is hang up immediately and call that loved one directly. In future, think twice about answering a call from an unknown number, especially if the caller claims to be someone you know.
AI voice scams are becoming common
A McAfee study of 7,000 people found that 77% of those who fell victim to an AI voice scam lost money as a result.
Deepfake images and videos
Deepfakes are manipulated images, audio, and video created using artificial intelligence techniques. They allow a scammer to make a person, whether that's a celebrity or your next-door neighbour, appear to say or do anything they please before posting it online.
Jeff Bezos, Elon Musk, and even Taylor Swift have all had their likenesses used in deepfake scams in recent years. Many of these scams were propagated under the 'Quantum AI' brand, a fraudulent trading platform known for using doctored footage and artificially generated audio to lure users into a cryptocurrency scheme.
Deepfake scams try to convince viewers that a product or platform is legitimate and worth investing in because it's been endorsed by someone the public knows or trusts.
In online dating, deepfake images can also be used for sexual extortion. If a scammer gets ahold of any of your photos, they can blend your public photos with pornographic images or videos, and blackmail you in exchange for money or sensitive information.
AI-generated fake websites
Malicious links embedded in email and text scams usually lead to fake websites. Not too long ago, scamming people with a fake website or web store required a certain level of skill and expertise. Now, generative AI makes it possible to run large-scale fraud campaigns by combining AI-generated code, text, and images. Sophisticated scammers will also use this method in conjunction with other fake websites and social media advertisements, making it even more difficult for users to tell they're visiting a fake, AI-generated website.
Scam emails and phishing scams
There’s nothing new about email scams and phishing – scammers have long been pretending to be government agencies and banks to get ahold of your sensitive information.
However, AI has revolutionised the way scammers go phishing. Generative AI tools like ChatGPT can help scammers match the tone of the government body they're trying to impersonate, and correct the misspellings and grammar mistakes we typically associate with scam emails. AI tools can also churn out polished, personalised copy at scale, making it much harder for the average consumer to tell an email is a scam.
Even though bots like ChatGPT have built-in safeguards to prevent people from using them for nefarious purposes, these can easily be circumvented. Not unlike that phone you had in high school, generative AI bots can be jailbroken, meaning hackers can remove the guardrails of intended use and trick the AI into bad behaviour.
ChatGPT's evil twins
Malicious alternatives like WormGPT and FraudGPT are popular amongst fraudsters and hackers facilitating cyber-attacks. These tools work similarly to ChatGPT but without its guardrails, and are used to create phishing scams, malware, and malicious code.
AI romance scams
Online dating apps have always been rife with scams – there's a reason the 'Nigerian prince' scam is one of the oldest tricks in the book. In recent times, however, scammers have been turning to generative AI to make scamming lonely Australians easier.
Instead of stealing photos of an unsuspecting person online, scammers might use generative AI to produce a picture of someone who doesn't exist. Tools like Midjourney and DALL-E are frequently used to create images of the person a scammer is pretending to be, whether that's an Australian redhead or a young blonde man looking for an older companion.
Even if the scammer chooses to do it the old-fashioned way (stealing a real person's images), they may still use AI chatbots to hold realistic text conversations. These bots can be trained to be likeable or to adopt a certain personality.
To cast as wide a net as possible, scammers will join a multitude of dating apps, using AI to create fake profiles and chat with hundreds of people at once – all without lifting a finger. Since many dating sites have built-in AI detection and will delete fake or bot accounts as soon as they're caught, these bots will try to get you off the app and onto another platform as quickly as possible.
Romance scams always have an end goal – whether that's swindling you out of your life savings or getting you to invest in fake cryptocurrency schemes. Some scammers will build up your trust for weeks or even months, then strike and leave without a trace. Vague or unnaturally fast responses are useful ways to tell if you're talking to a bot, but the test that almost always works is proposing a video call, which a scammer will typically decline.
AI investment and financial scams
Online investment platforms sometimes offer a website or app with some sort of AI integration. If you're trading on a platform that claims to use AI, make sure it's registered! Check ASIC's Investor Alert List for unregistered investment 'professionals' known for scamming investors, and ensure the platform you're trading with holds a current Australian Financial Services (AFS) licence or an Australian credit licence from ASIC.
Scammers often claim that AI can generate a sizeable profit by trading cryptocurrency on your behalf. Any platform that promises high returns with little to no risk, or claims its 'AI can pick guaranteed stock winners', should raise red flags.
Some platforms run investment schemes that leverage the hype around AI to prompt people into investing. While it can seem exciting to invest in a company that claims to use AI, proceed with caution: these schemes often rely on AI-related buzzwords and promises of guaranteed returns to lure investors in.
If you think an investment platform might be using deepfaked celebrities to promote itself, reconsider investing. If you're unsure whether a celebrity endorsement is legitimate, ask yourself why this person would be endorsing this particular investment.
Don't invest in strangers
Beware of online contacts you barely know – or have never met – asking you to invest. Even if the investment or cryptocurrency platform seems real, this approach, combined with promises of high-yield returns, is common in pig butchering scams.