With the most recent AI boom, scams that rely on this increasingly popular technology are becoming more commonplace. Visual effects experts have now uncovered the latest AI tricks helping scammers make off with their unsuspecting victims’ money, as well as how to spot them.
Indeed, advances in AI have helped all sorts of internet swindlers dupe people out of their hard-earned cash, and VFX artists Wren Weichman, Sam Gorski, and Niko Pueringer of Corridor Crew have exposed these tricks in a YouTube video posted on their channel on February 16.
Clothing AI scams
The first scam is the classic “what I ordered vs. what I got” situation with an AI twist. As the VFX experts pointed out, there are storefronts out there using AI images (and even AI videos!) to advertise things like clothing.
But unlike scams advertising items that don’t exist at all (i.e., you pay for something but never receive it), here you will receive something. However, it will be a far cry from what the images and videos are showing you.
As an example, they showed a sweater with an elaborate knitted wool design listed at $35.99. What the customer actually received was a cheap T-shirt with those intricate woolen details simply printed on it!
So how can you tell it’s AI? Pueringer explained that you need to “pick a skinny line anywhere in the image and try to trace it, especially along the patterns.”
“The more you look at these lines they start to not really make sense, they break up, and they become sort of embedded in themselves, it’s really ugly.”
Another pattern on the sweater actually started to match the texture of the ‘model’s’ hair, because AI generators essentially recycle small patches of image patterns across the picture.
Deepfake celebrity ads
Nowadays, there are also plenty of fake AI advertisements using deepfake and voice-cloning tech to make it look like celebrities are endorsing a supposedly legitimate product or service (say, a fake diabetes medicine or a magic 60-second prayer that costs $59).
The experts showed a video of popular TV presenter and physician Mehmet Cengiz Öz, better known as Dr. Oz. In the video, Dr. Oz appears to be talking up an amazing diabetes medicine, but his mouth has been replaced with an AI-animated mouth driven by text-to-speech audio. According to Pueringer:
“It’s been trained on his voice, so the audio is driving the voice animation, so you have this weird disconnect where he’s moving, he’s talking, but it doesn’t line up at all with what he’s saying.”
They also noticed that the ad’s progress bar slows down the longer the video runs, yet it still hasn’t told you what the product actually is. The point is to exploit sunk-cost thinking: having already invested so much time in watching, you feel obliged to buy the product.
One of the most obvious cues that the ad is completely fake is the person’s teeth: in deepfakes they always come out blurry, unlike the eyes, which stay crisp. Rendering convincing teeth is something not even Hollywood CGI has fully managed to solve.
‘So you’re Brad Pitt?’
That won’t impress some women much, but others may be a bit more gullible. In one widely publicized incident, a woman was scammed out of €830,000 (nearly $870,000) by fraudsters posing as Hollywood actor Brad Pitt, who wooed her by saying he wanted to be her boyfriend.
The story is strongly reminiscent of the Nigerian prince who needs money to get out of trouble but, don’t worry, will send it all back with interest (except that he won’t). This fake Brad Pitt claimed he wanted to send his victim some very expensive gifts, but there was a problem: they were stuck in customs, and he needed money to release them.
The demands then escalated to “I’m in a hospital, I have medical expenses you have to pay for,” because he supposedly had “kidney cancer” and his ex-wife Angelina Jolie had “frozen his bank accounts” amid divorce proceedings. The scam was backed up by doctored images and AI-generated videos, including one of a fake news segment allegedly covering his relationship with the victim.
California wildfires
Finally, the recent devastating wildfires in California triggered a wave of AI-generated apocalyptic scenes exaggerating the scope of the fires to unprecedented levels, including the Hollywood sign ablaze and what looked essentially like volcanic lava splattering over houses.
The VFX artists noted that some of these videos looked high-quality and convincing, but the scale was way off, as “AI can never get the scale of fire correct.” While they conceded such videos are fine as demonstrations of what AI can do, using them to pretend to show what the real fires looked like is disingenuous.
Stay alert for AI scams
All things considered, AI has advanced so much that it’s becoming increasingly difficult to tell reality from fiction, and easier for scammers to dupe impressionable people, which is why AI scams are running rampant. That makes it all the more important to stay vigilant, double-check everything, and trust your gut when something seems off if you want to keep your money. Good luck out there.