
Did that product really earn its five-star rating? Probably not.

Fake online reviews are costing consumers more than $300 billion a year

Snake oil salesman (complete with snake). Source: Midjourney.

Let’s say you’re shopping for a plumber or a doctor online. You’re a smart consumer. You scan their reviews, see what hundreds or thousands of other customers have to say about them, weigh your options, and make an informed decision.

In other words, you rely on the wisdom of crowds [1].

The problem? Many of the people writing those reviews aren’t people. Or at least, they aren’t the people they claim to be. Hundreds of millions of online reviews have been generated by troll farms half a world away, hired on the cheap by companies trying to get you to buy their products or services. And now AI is starting to generate them, too.

Product and service reviews fall into the ever-expanding category of ‘This is why we can’t have nice things on the Internet.’ Last week, The Transparency Company published a detailed report on just how prevalent fake reviews are, and the harm they’re causing consumers.

Source: The Transparency Company.

The report looked only at reviews of home services such as moving companies [2] and locksmiths, along with medical and legal services, and only at reviews published on Google Maps, not Amazon or Yelp. Even with that narrow scope, it found that fake positive reviews cost consumers up to $300 billion annually by duping people into paying higher prices for substandard or even fraudulent services.

Multiply those numbers by all the fake reviews across the Internet, and you start to see the scale of the problem. 

Transparent, but not invisible

I recently chatted with Curtis Boyd, who founded The Transparency Company roughly a year ago and has been taking down fake online reviews since 2019. In that time, he has managed to get more than 450 million fake reviews deleted from the Google Maps directory.

Busy man, that Curtis. 

Over time he and his small team developed a methodology to identify fakes that looks at more than 170 signals, such as authorship style, keyword stuffing, copying and pasting from other reviews, excessive use of exclamation points, frequency and pattern of submissions, and much more. 
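The report doesn’t spell out all 170-plus signals, but to make the idea concrete, here’s a minimal sketch of how a few of the ones Boyd mentions (keyword stuffing, copy-and-paste similarity, exclamation points, submission bursts) could be turned into a crude suspicion score. The function names, thresholds, and weights are invented for illustration; this is not The Transparency Company’s actual methodology.

```python
from collections import Counter
from datetime import datetime

# Toy versions of a few of the signals described above. Thresholds and
# weights are invented for illustration only.

def exclamation_score(text: str) -> float:
    """Excessive exclamation points are a classic tell."""
    return min(text.count("!") / 3.0, 1.0)

def keyword_stuffing_score(text: str, keywords: list[str]) -> float:
    """Repeating the business's target keywords over and over."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(words.count(k.lower()) for k in keywords)
    return min(hits / (0.1 * len(words) + 1e-9), 1.0)

def copy_paste_score(text: str, other_reviews: list[str]) -> float:
    """Crude near-duplicate check: word overlap with other reviews."""
    words = set(text.lower().split())
    best = 0.0
    for other in other_reviews:
        other_words = set(other.lower().split())
        if words and other_words:
            best = max(best, len(words & other_words) / len(words | other_words))
    return best

def burst_score(timestamps: list[datetime]) -> float:
    """Many reviews landing on the same day suggests a coordinated batch."""
    per_day = Counter(ts.date() for ts in timestamps)
    return min(max(per_day.values(), default=0) / 10.0, 1.0)

def suspicion_score(text, keywords, other_reviews, timestamps) -> float:
    """Average the toy signals into a 0 (clean) to 1 (suspicious) score."""
    signals = [
        exclamation_score(text),
        keyword_stuffing_score(text, keywords),
        copy_paste_score(text, other_reviews),
        burst_score(timestamps),
    ]
    return sum(signals) / len(signals)
```

A real system weighs dozens more signals (authorship style, reviewer history, network patterns) and tunes them against known fakes; the point here is simply that each signal is cheap to compute and none of them is decisive on its own.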

Using the strictest possible standards, the report found that 14 percent of the 73 million reviews it examined were obvious fakes. Curtis admitted that the actual number of fakes was probably double that, but he wanted to be as conservative as possible. And that’s counting only fake positive reviews, not the negative ones sometimes planted by a business’s competitors.

“Fake reviews are a multi-billion dollar business,” he says. “It is really bad out there.”

Boyd is also seeing a spike in AI-generated reviews. Though only 2 million of the reviews in the report were written by bots, that number is increasing by 80 percent per month.
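Eighty percent a month compounds brutally fast. Purely as a back-of-the-envelope illustration, and assuming (unrealistically) that the growth rate held steady, two million bot-written reviews would pass two billion within a year:

```python
# Back-of-the-envelope only: assumes the 80%-per-month growth rate holds,
# which it almost certainly won't.
reviews = 2_000_000
for month in range(1, 13):
    reviews *= 1.8
    print(f"Month {month:2d}: {reviews:,.0f} AI-generated reviews")
# Month 12 works out to roughly 2.3 billion.
```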

Later this spring, The Transparency Company plans to introduce a badging program, similar to the Good Housekeeping Seal of Approval, that assures consumers the positive reviews they see on service providers’ sites are genuine. And they hope to release an app for shoppers before the end of the year.

In the meantime, it’s buyer beware out there. Even if a review is legit, how do you know you can trust it?

Pull over to the side of the Internet and show me some ID

I have long held that the root of nearly all Internet evil lies in the ease with which people can create fake identities and wreak all kinds of havoc. Great power with zero responsibility – what would Spider-Man’s Uncle Ben have to say about that?

If Al Gore had come to me 30 years ago and said, ‘Please, Dan, help me build a better Internet,’ this is what I would have told him [3]:

We need trustworthy federated identity services. We need the ability to create and maintain a consistent digital entity across every site we visit and every service we use. [4] You could call yourself ThroatWarbler Mangrove1717 or Pusillanimous Pussycat123 – the name itself doesn’t matter; what matters is that the identity remains consistent from one site to the next. Because that’s the only way to establish a reputation system that weeds out bad actors. 

Right now, if someone acts like an asshole online and there’s some kind of moderation system in place (usually there isn’t), they might eventually find themselves locked out of that online community. But they can simply spin up a new pseudonymous identity and keep right on being an asshole. If that community had a federated identity system in place, they couldn’t.
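To make ‘consistent but pseudonymous’ concrete, here’s a minimal sketch of one way it could work, using the widely available Python cryptography package: an identity is just a long-lived key pair, sites attach reputation to the public key rather than to a real name, and a signature proves that a new review or post came from the same identity that earned (or squandered) that reputation. This is an illustration of the idea, not an actual federated-identity standard.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

# One long-lived key pair = one persistent pseudonymous identity.
# The display name (ThroatWarbler Mangrove1717, whatever) is cosmetic;
# the public key is what stays consistent from one site to the next.
identity_key = Ed25519PrivateKey.generate()
public_key = identity_key.public_key()

# A stable identifier any site could hang a reputation score on.
identity_id = public_key.public_bytes(Encoding.Raw, PublicFormat.Raw).hex()

# Signing a review ties it to the identity without revealing who is behind it.
review = b"Great plumber. Fixed the leak in an hour."
signature = identity_key.sign(review)

# Anyone can check that the review came from that identity;
# verify() raises InvalidSignature if the review or signature was tampered with.
public_key.verify(signature, review)
print(f"Review verified for pseudonymous identity {identity_id[:16]}...")
```

The hard part, of course, is everything around that key pair: recovery when it’s lost, revocation when it’s stolen, and the protections for vulnerable users described below.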

If people knew they’d be held personally responsible for the things they say and do online, I think we’d all be in a much better place right now. We’d be wading through far less bullshit, and dealing with a fraction of the fake news we’re now drowning in. And that applies to online reviews as well.

Of course, the ideal ID system would need to be able to protect the identities of vulnerable populations (like trans or gay people) without giving a free pass to bad actors (like criminals and professional trolls). And it would need to be resistant to potential abuse by law enforcement. I’m not saying it would have been easy; I’m saying that it would have been worth the effort, had we thought about it back then.

But did Al ever ask? No. And look where we are now.

Do you trust reviews you read online? Or anything else? (Besides this newsletter, of course.) Share your skepticism in the comments or email me: [email protected].

[1] New Yorker writer James Surowiecki coined that phrase in his 2004 book of the same name, and everyone went ‘Whoa, the Wisdom of Crowds, what a great concept.’ At the time I thought, ‘If this man had spent more than five minutes on the Internet he’d have written a book called The Stupidity of Mobs.’

[2] Don’t know about you, but I’ve had a couple of truly excruciating experiences with moving companies. Buyer beware squared.

[3] I would also have said, ‘Geez Al, lighten up a little bit, you’re bringing everybody down. Tell a few jokes once in a while, and do something about that hair. You look like you just stepped out of a Brylcreem commercial.’ 

[4] This would actually be a great use for blockchain technology, which could ensure that identities remained consistent, unalterable, and anonymous. Yes, I am a nerd. 
