Reality used to be a friend of mine

Deep fake videos have gotten really good, really fast. But help is on the way.

Since the first cave human got fooled into handing over a woolly mammoth hide for a handful of magic beans, flim-flammers and con artists have been separating their fellow bipeds from their money. The only thing that's changed is the technology they’ve used to do it. 

The Internet was probably the greatest gift grifters ever received. Scammers no longer had to get up close and personal with their victims. Better yet, they could attack millions of potential suckers every minute, trolling for people who will apparently believe (and click on) anything. 

The scam technology du jour: Live, deep-fake videos. 

Last February, a well-known UK engineering firm got scammed to the tune of $25 million by a deep-fake video of someone pretending to be its CFO on a Zoom call. A few weeks ago, the chairman of the Senate Foreign Relations Committee got duped by someone pretending to be the former foreign minister of Ukraine. He got suspicious after the fake minister got a little tetchy about whether the Senator supported lobbing long-range missiles over the border into Russia. 

It used to be difficult and expensive to impersonate someone over a live Internet connection. Not anymore. 

Over the last year, the cost of creating real-time deep fakes has plummeted, while the availability of tools that make this kind of chicanery relatively easy has exploded, says Ben Colman, CEO of Reality Defender, which makes deep-fake detection tools for large organizations. 

Most of the information you would need to convincingly fake being someone’s boss or a foreign dignitary can be found via the Interwebs, usually on that person's own social media profiles. All you need to do is take an image of that person off the Internet, use a face-swapping app to map it onto your face, use a voice-cloning app to make you sound like the person whose face you stole, and voilà – you, too, could be $25 million richer or have the opportunity to bamboozle a sitting US Senator. 

Caught in a bad romance

And while most targets of these scams are deep-pocketed organizations or highly connected politicos, deep-fake scams have recently gone retail. A group of Nigerian scammers calling themselves the Yahoo Boys are using deep-fake videos to conduct romance scams, while also brazenly posting videos of themselves in the act of catfishing their victims. 

Tireless fraud fighter Frank McKenna (aka Frank on Fraud) attended a “Yahoo Boys School” to find out more about how this scam operates. His answer: Like a well-oiled fraud machine: 

Yahoo Boys are creating and selling their own applications directly to scammers or using popular face-swap apps like XpressionCamera to perpetrate the schemes….[and] selling complete packages of stolen images, and deep-fake audio and video that can be used to fool victims. 

The FBI estimates that romance scammers bilked nearly 18,000 victims out of more than $650 million in 2023 alone. While deep fake videos account for a tiny fraction of successful romance scams today, that’s only going to increase.

Who's Zoomin' Who?

The good news is that Reality Defender has just released a tool that can scan video images in real time and detect fakes as they happen. Colman likens it to “real-time antivirus for media and communications.” I like to think of it as Good AI vs. Evil AI. 

The bad news is that you and I can’t use it – directly, anyway. The tool is only available to commercial customers: media companies, social media platforms, AI video and audio generation platforms, major telecoms, and government agencies [1]. These organizations have a vested interest in detecting deep-fake videos at the source and preventing them from spreading, which makes them the customers most likely to pay serious coin for it. [2]

Using tools to stop deep fakes at the source will probably work most of the time, just as good spam filters and modern anti-malware software stop 99 percent of the bad stuff. Then the bad guys will find new ways to evade the filters, and the endless cat-and-mouse game will continue. 

Inevitably, some will get through and find their way to your QAnon-obsessed aunt's favorite "I've done my own research" social media platform. Technology alone can’t solve this problem. It takes more intelligence on the receiving side of the video call. 

I know what you’re thinking: “I’m waaay too worldly and sophisticated to fall for a deep fake video.” Well, maybe not. You can no longer look for telltale clues like people with six fingers on each hand or faces like gargoyles. The technology has advanced that rapidly. 

Colman says that if the PhDs on his staff can no longer identify deep fakes with the naked eye, consumers don't stand a chance. Worse, these scammers know how to prey on your biggest vulnerabilities. 

“I'm a co-founder of one of the leading platforms in the space,” he told me recently. “But if I hear my 7-year-old on a phone call saying, 'I'm in trouble, and can you wire money?' And someone else saying, 'He's okay now, but he might not be okay in 30 minutes,' I will think twice. I mean, is it worth the risk that maybe the threat is real?”

Don't Stop Start Believin' 

We have reached an era where we can no longer believe everything we hear or see, no matter how convincing it might appear. (Thanks, AI.) The smart approach is to never trust, always verify, says Colman. Try to determine where those videos are coming from before sharing them with your 200 million followers. [3]

How can you tell whether that person pretending to be your boss or new BFF is really Vladimir from Vladivostok? Try asking questions that can't be answered by scanning their social media profiles. Like, when was the last time you both spoke in person? What did you talk about? Do they really have a tattoo on their butt that looks like Steve Buscemi? 

And if some smooth-talking charmer named Brad Clooney suddenly wants to get jiggy with your mom, it's probably time to take away her Zoom privileges. 

 Have you ever been duped by a deep fake? Fess up in the comments or email me: [email protected].

[1] Hopefully, one of those agencies is the US Senate. 

[2] Kind of like when your phone carrier tells you it’s a spam call before you pick up. (Or, at least, that’s what you keep claiming, Cherisse. Why won’t you pick up when I call? Why?)

[3] I'm talking to you, Elon. Not that you care. 
