How to Evaluate Online Reviews


Online reviews have become the default way we make purchasing decisions. Restaurant, plumber, laptop, hotel — we check the reviews first. Which makes sense in theory. In practice, review ecosystems are messy, manipulated, and often misleading.

I’m not saying reviews are useless. They’re not. But reading them well is a skill, and most people haven’t developed it. Here’s how to extract actual value from online reviews without getting fooled.

The Fake Review Problem

Let’s start with the uncomfortable truth: a meaningful share of online reviews are fake. Estimates vary widely by platform and product category, but analyses of major platforms have put the share of fabricated or incentivised reviews (meaning the reviewer got a discount, free product, or payment in exchange for a positive review) as high as 30-40% in some categories.

Amazon has this problem. Google has this problem. Yelp, TripAdvisor, and pretty much every review platform has this problem. The economics are simple: a product with 4.5 stars sells dramatically better than one with 4.0 stars. That difference is worth thousands of dollars, which makes investing in fake reviews profitable.

How to spot fake reviews:

  • Vague, generic language. “Great product! Exactly what I needed. Would definitely recommend!” tells you nothing and could apply to literally anything.
  • Clusters of positive reviews on specific dates. If a product suddenly gets 30 five-star reviews in three days after months of silence, those are probably purchased.
  • Reviewer profiles with only five-star reviews. Real people leave a mix of ratings. Someone who’s given 50 products five stars is likely a paid reviewer.
  • “Verified Purchase” doesn’t guarantee authenticity. In common reimbursement schemes, the reviewer buys the product normally and the seller refunds them off-platform once a positive review goes up. The review shows as “verified”, but it was arranged.
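Two of the flags above, date clusters and all-five-star profiles, lend themselves to simple automated checks. Here is a minimal Python sketch; the thresholds (a three-day window, ten reviews, a twenty-review history) are illustrative guesses, not calibrated values:

```python
from datetime import date

def suspicious_date_cluster(reviews, window_days=3, min_count=10):
    """Flag a product if many five-star reviews land in a short window.
    reviews: iterable of (reviewer_id, rating, date) tuples."""
    five_star_dates = sorted(d for _, rating, d in reviews if rating == 5)
    for i, start in enumerate(five_star_dates):
        # Count five-star reviews within window_days of this one.
        in_window = [d for d in five_star_dates[i:]
                     if (d - start).days <= window_days]
        if len(in_window) >= min_count:
            return True
    return False

def suspicious_profile(reviewer_ratings, min_reviews=20):
    """Flag reviewers whose entire history is five-star ratings."""
    return (len(reviewer_ratings) >= min_reviews
            and all(r == 5 for r in reviewer_ratings))
```

Real detection tools combine many more signals than this, but the shape of the check is the same: look for statistically unlikely patterns rather than judging any single review.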

Tools like Fakespot and ReviewMeta can analyse Amazon reviews and flag suspicious patterns. They’re not perfect, but they add a useful layer of analysis.

The Selection Bias Problem

Even genuine reviews are biased. The people most likely to leave reviews are those who had extremely positive or extremely negative experiences. The vast middle — people who had a perfectly fine, unremarkable experience — rarely bother to write anything.

This creates a bimodal distribution that doesn’t reflect reality. A restaurant might have mostly five-star and one-star reviews when the typical experience is a solid three or four stars. The reviews make it look polarising when it’s actually just decent.

How to adjust for this: Pay less attention to the extremes and more attention to the patterns in three and four-star reviews. These tend to be the most honest and balanced. They come from people who liked the product or service enough to acknowledge its strengths but were thoughtful enough to note genuine shortcomings.

Read the Negative Reviews First

This is counterintuitive but incredibly useful. Negative reviews tell you more than positive ones because they reveal specific problems. A one-star review that says “terrible product, waste of money” is useless. But a two or three-star review that says “works well for daily use but the battery drains fast when using the camera” is genuinely valuable information.

When reading negative reviews, ask yourself: is this problem relevant to me? A review complaining that a laptop is too heavy matters if you travel. It doesn’t matter if it sits on your desk. A review criticising a restaurant’s noise level matters if you want a quiet dinner. It’s irrelevant if you’re going with a large group.

The most useful negative reviews describe specific scenarios where the product or service fell short. Ignore the emotional rants and focus on the detailed critiques.

Star Ratings Are Nearly Meaningless

Here’s a hot take: the overall star rating of a product tells you almost nothing useful. On Amazon, the difference between 4.2 and 4.4 stars is functionally meaningless. Both indicate “generally good with some complaints.” The number of reviews matters more than the rating — a 4.3 with 2,000 reviews is far more reliable than a 4.8 with 15 reviews.
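One way to make “the number of reviews matters more than the rating” concrete is a Bayesian (damped) average, which pulls a product’s raw average toward a prior mean: with few reviews the pull is strong, with thousands it barely moves. A rough sketch, where the prior mean and weight are illustrative assumptions rather than values any platform publishes:

```python
def bayesian_rating(avg, n, prior_mean=4.0, prior_weight=50):
    """Damped average: treat the product as if it started with
    prior_weight imaginary reviews at prior_mean, then add the real ones.
    prior_mean and prior_weight are illustrative, not platform values."""
    return (avg * n + prior_mean * prior_weight) / (n + prior_weight)
```

Under these assumptions, a 4.3 backed by 2,000 reviews scores about 4.29, while a 4.8 backed by only 15 reviews drops to about 4.18: the well-attested product outranks the thinly-reviewed one, which matches the intuition above.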

Google Maps restaurant ratings have experienced massive inflation. In most cities, anything below 4.0 stars is considered bad, which means the useful range is compressed into a 4.0-5.0 band. A 4.2 restaurant and a 4.6 restaurant might be close in actual quality, just with different review profiles.

What to look at instead: Read 10-15 recent reviews. Look for consistent themes. If multiple independent people mention the same positive or negative thing, that’s probably real. If one person complains about something nobody else mentions, it’s probably an outlier.
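The theme-spotting step can be approximated by counting how many distinct reviews mention each concern. Here is a rough Python sketch; the keyword list is a hypothetical example standing in for whatever themes you actually care about:

```python
def recurring_themes(reviews, keywords, min_mentions=3):
    """Count how many distinct reviews mention each keyword.
    Themes raised independently by several reviewers are more
    trustworthy than a complaint that appears only once."""
    counts = {}
    for text in reviews:
        lowered = text.lower()
        for kw in keywords:
            if kw in lowered:
                # Increment at most once per review, per keyword.
                counts[kw] = counts.get(kw, 0) + 1
    return {kw: c for kw, c in counts.items() if c >= min_mentions}
```

Doing this mentally while skimming 10-15 recent reviews is usually enough; the point is to weight repeated, independent mentions over any single opinion.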

Recency Matters

A restaurant with glowing reviews from 2022 might have changed chefs, owners, or management since then. A software product praised in old reviews might have gone through updates that changed everything. A hotel renovated two years ago might still have old negative reviews dragging down its rating.

Filter by most recent when possible. What’s happening now matters more than what happened two years ago.

Platform-Specific Issues

Each review platform has its own quirks:

Google Maps: Heavy positive bias. Businesses actively solicit five-star reviews from happy customers. The overall ratings trend high.

Amazon: Rampant fake reviews, especially for electronics and beauty products. Use third-party tools to filter.

Yelp: Controversial filtering algorithm that hides many genuine reviews. Some businesses report that positive reviews get filtered while negative ones remain visible.

TripAdvisor: Generally reliable for hotels, less so for restaurants. Rankings can be manipulated through review campaigns.

Trustpilot: Companies can invite customers to leave reviews, which skews positive. Unsolicited reviews tend to be more critical.

The Best Strategy

When making a significant purchase or choosing a service:

  1. Check reviews across multiple platforms, not just one
  2. Read 10-15 recent reviews, focusing on three and four-star ones
  3. Look for consistent themes rather than individual opinions
  4. Run products through Fakespot or ReviewMeta if buying on Amazon
  5. Search for independent reviews from blogs, YouTube, or Reddit
  6. Ask people you know in real life — personal recommendations still beat online reviews

Online reviews are one data point, not the final answer. Use them as a starting point, apply critical thinking, and make decisions based on patterns rather than individual opinions. They’re a flawed tool, but used carefully, they’re still useful.

Just don’t trust the five stars at face value. Almost nothing in life is actually five stars.