Why Are Some Yelp Reviews Not Recommended?
Business owners are often frustrated to find positive reviews from their customers filtered out by Yelp. Why does this happen, and what can be done about it?
If you’re a business owner or manager, there’s a decent chance that you’ve spent some time obsessing over your Yelp reviews. (If you’re not paying attention to your reviews, you should be—digital PR is a pretty significant facet of Internet marketing.) And if that’s the case, then you’ve probably fumed over every positive review that’s been condemned to the purgatory known as Yelp’s “not currently recommended” section.
For those who aren’t aware, if you scroll down to the bottom of a business’s Yelp page, you’ll see light gray text that says “X other reviews that are not currently recommended.”
“Not recommended reviews” are reviews that have been filtered out by Yelp and not counted.
If a Yelp visitor chooses to dig deep and read them, they can. But these reviews are hard to find, and they don’t contribute to the business’s Yelp rating or review count.
According to Yelp, the algorithm declines to recommend a review when it believes the review is fake, unhelpful, or biased. However, some reviews are filtered simply because the reviewer is inexperienced, not attuned to the tastes of most of Yelp's users, or for other reasons that have nothing to do with whether the user's account of their experience is accurate. Yelp says that roughly 25% of all user reviews are not recommended by the algorithm.
While Yelp at least admits that reviews may be filtered out simply because the reviewer isn’t a frequent Yelp user, there’s still a lot that’s unclear. That’s a problem, because the vagueness of this process could provide sufficient cover to conceal bias or unethical behavior on Yelp’s part.
What actually determines whether a review is flagged by Yelp’s filtering algorithm?
If you take some time to scroll through a few Yelp pages, you’ll see reviews left by people who have written 1 review and don’t even have a profile image, while reviews from more established Yelp users end up in the dustbin.
Why is the review above, written by a user with 52 friends and 5 reviews, filtered out, while the one below makes the cut?
I’ve had to deal with Yelp issues when working with clients, and have written about Yelp in the past (see my article on “Dealing With Fake Reviews on Yelp”). In contemplating the many frustrating issues with Yelp, I’ve long wondered if it would be possible to determine whether a given review would be more or less likely to be filtered out by Yelp’s algorithm.
The core of that question is, what does Yelp consider to be the critical components of a review’s trustworthiness? I decided to try and find out.
What Yelp doesn’t want you to know, and for good reason.
There is a key limiting factor in any analysis of visible versus filtered reviews—you cannot look at the user profile of someone whose review is not recommended. It’s not clickable. So any comparison of the two classes of reviews can’t incorporate in-depth profile data—time spent on Yelp, their “Things I Love” list, whether a person has chosen a custom URL for their Yelp profile, etc.
There’s a reason for that: Yelp doesn’t want people to know how their algorithm works. If we knew exactly how it worked, then we could game it. This significantly hampers the ability of an outsider to penetrate the machinations of Yelp’s algorithm.
However, there are a few things of which we’re pretty certain, but which we can’t analyze in a meaningful way.
First, do not ask customers to write a Yelp review while they’re at your business. Yelp’s site and mobile phone application can easily determine your physical location when you submit a review. If you submit a review while you’re at the business’s location, Yelp will be able to tell, and it’s extremely likely that they’ll filter the review. A good way to get around this is to send a follow-up email to the customer a few days after you assist them, asking them how their experience was, and to leave a review for you if their experience was positive.
This brings us to our second point: Don’t provide customers with a direct link to your Yelp page if you’re asking them to review your business. Yelp can look at the referring domain name, and if they see that the person reviewing Doug’s Fish Tank Shop was referred by dougsfishtanks.com, they’ll know that the customer was specifically referred by you and filter the review. Instead, simply say, “Please visit Yelp.com, look up our business, and leave us a review.” This eliminates the suspicious linking that would otherwise lead to the review being filtered.
Lastly, the timing of reviews matters. If you hold a day-long promotion during which you offer customers some sort of deal if they leave a positive review for your business on Yelp, what Yelp is going to see is that a business which had previously received only a handful of reviews over several years is suddenly getting multiple reviews on the same day. That’s going to look just a teensy bit suspicious, and all of those reviews are going to end up on the Island of Misfit Reviews, along with all of the other filtered reviews. If you insist on having some sort of promotion, spread it out over time. Make it part of your standard follow-up communication, as suggested above, rather than some sort of special event that immediately puts Yelp on high alert.
We can’t attest to the above based on statistics and analysis, because Yelp keeps that data in the digital equivalent of Fort Knox. But based on experience, we can infer that the above is true.
As a result, my analysis is restricted to the data that’s publicly available to anyone who decides to spend a few hours crawling through Yelp with an Excel spreadsheet. Essentially, my analysis assumes that all of the reviews that I looked at were left by honest, earnest individuals who weren’t coerced by business owners or acting on an agenda.
So, with that assumption in mind, what determines whether a Yelp review is filtered?
Designing a data analysis of Yelp reviews.
Ultimately, I chose to focus on five review variables: Whether the review’s writer has a profile image, the number of friends they have, the number of reviews they’ve written, how many photos they’ve submitted, and the rating of the review. I opted to choose five random businesses in the local Sacramento area: a restaurant, auto repair shop, plumber, clothing shop, and golf course. I would collect this data for each and every visible and filtered review for these five businesses, and see if the comparisons made anything clear.
Now, there is a complicating factor for any comparison of Yelp reviews. In a phrase: power users. Many Yelp users are very low-key, leaving only a small handful of reviews and primarily using the service to read reviews left by other users. But there is a small coalition of super users who contribute a LOT of reviews to Yelp.
This isn’t an issue when it comes to looking at an average rating score. Whether you’re a super user or a novice, your ratings get equal weight (unless your review gets filtered out), and a single rating can’t swing things much, because there’s a maximum of 5 stars.
But when it comes to variables that don’t have a maximum value, things can get a little crazy. For instance, one business I looked at had 33 reviews. When I took a look at how many photos each user had previously submitted to Yelp, I found that while most users had contributed zero or very few photos, one user had submitted 9,452 photos to Yelp. Look at this graph of each user’s photo count; it’s absurd (keep in mind that the scale of the graph maxes out at 1,000 photos):
This presents a serious problem. A single person skews the average to an absurd degree—only two users’ photo counts exceed the mean. It’s like having the valedictorian in your math class. They completely wreck the grading curve.
With this in mind, for all variables other than Yelp rating, I chose to use the median. For our purposes, the median is really useful because it gives us a number where half the users in a group fall below it and half are above it. The median is the proverbial C student, smack in the middle of the demographic.
The analysis below therefore relies on the mean of user ratings, and the medians of photo counts, review counts, and friend counts. I also compared the percentage of visible reviews left by users who set profile images versus the percentage left by users who didn’t.
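To make the outlier problem concrete, here’s a minimal sketch. The photo counts are hypothetical (only the 9,452 figure mirrors the power user described above), but they show how a single extreme value drags the mean far from what’s typical while the median stays put:

```python
from statistics import mean, median

# Hypothetical photo counts for a 33-review business: most users have
# posted zero or very few photos, one power user has posted 9,452.
photo_counts = [0] * 20 + [1, 1, 2, 2, 3, 4, 5, 7, 10, 15, 26, 40, 9452]

print(f"mean:   {mean(photo_counts):.1f}")  # dragged up by the single outlier
print(f"median: {median(photo_counts)}")    # the 'C student' in the middle
```

With the outlier included, the mean lands near 290 photos even though more than half of these hypothetical users posted none; the median correctly reports the typical user.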
In each comparison, the first figure will be from the visible reviews, while the second will be from filtered reviews.
Average Yelp Rating
This wasn’t nearly as exciting as I expected (and hoped) it would be.
- Restaurant: 4.1 vs 4.1
- Auto Shop: 4.4 vs 4.1
- Plumber: 4.4 vs 4.1
- Clothing: 4.3 vs 4.9
- Golf Course: 3.2 vs 3.6
In two cases, the average score for visible reviews was greater than that of filtered reviews. In one case, they were equal, and in two cases, the filtered reviews’ average rating was greater than the visible review ratings.
This is the sort of fairly random distribution that you would expect if Yelp’s algorithm didn’t take the rating into account. Basically, Yelp isn’t stealing your 5 star reviews.
Percentage of Users with Profile Images
These days, social media has a huge impact on business and culture. Consequently, it has become imperative to understand who a user is in order to have a better understanding of their viewpoint.
With this in mind, it’s easy to see how Yelp might be more suspicious of an anonymous user who doesn’t add a profile photo than of someone who does. And the data appears to support this supposition.
- Restaurant: 71% vs 50%
- Auto Shop: 67% vs 62%
- Plumber: 49% vs 22%
- Clothing: 88% vs 33%
- Golf Course: 88% vs 60%
There is a very clear trend here. The difference at the auto shop is pretty small, but in every case visible reviews were more likely to have profile images associated with them. Aside from the auto shop, there was a gap of 21 points or more in profile-photo usage between visible and hidden reviews.
Within my data sample, the overall percentage of visible versus filtered reviews with profile images was 71% versus 45%. The pretty clear takeaway from this is that the presence of a profile image does have an impact on review filtering.
Number of Yelp Reviews Posted
The number of reviews posted by a Yelp user does appear to significantly impact the visibility of their reviews.
- Restaurant: 6 vs 1.5
- Auto Shop: 7 vs 1
- Plumber: 7 vs 2
- Clothing: 10 vs 2
- Golf Course: 36 vs 2.5
The difference here is pretty stark. The plumbing business had the smallest gap, and even then the median visible reviewer had posted 3.5 times the number of reviews as the median filtered reviewer.
Looking at the raw data reinforces the conclusion that review count is strongly factored into Yelp’s algorithm: the five highest review counts among the 66 filtered reviews I looked at were 59, 20, 19, 9, and 9. That means only three of the 66 filtered reviews (about 4.5%) came from users with ten or more reviews to their name. Once a user’s review count is in the high single digits, their reviews are almost guaranteed to show up (unless they’ve done something to make Yelp really cranky).
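As a quick sanity check on that arithmetic: since every filtered reviewer outside the top five had fewer reviews than the fifth-highest count of 9, counting within the top five captures everyone at or above a ten-review threshold (the threshold itself is my reading of the data, not anything Yelp publishes):

```python
# Five highest review counts among the 66 filtered reviews in the sample.
top_counts = [59, 20, 19, 9, 9]
total_filtered = 66

# Everyone outside the top five had at most 9 reviews, so this count is
# complete for any threshold of 10 or more.
at_least_ten = sum(1 for c in top_counts if c >= 10)
print(f"{at_least_ten / total_filtered:.1%}")  # → 4.5%
```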
In our personal experience, we have seen reviews which had been filtered for months or years suddenly released from purgatory without explanation. Based on the data above, it seems likely that the reviews were unfiltered when users finally posted enough reviews to make Yelp happy.
The takeaway here is to encourage your customers not just to leave reviews for you, but to be more active in reviewing other businesses in their local community as well. Once they get past a total of about 6 or 7 reviews, it’s very likely that all of their reviews will survive the algorithm’s wrath.
Number of Yelp Friends
It appears that the number of Yelp friends that a user has also impacts the visibility of their reviews, but the correlation is a bit noisy when you dig deeper.
- Restaurant: 15.5 vs 2
- Auto Shop: 1 vs 0
- Plumber: 0 vs 0
- Clothing: 7 vs 0
- Golf Course: 7 vs 0
Looking at the medians, there’s definitely a gap. However, in the raw data, 10 of the 66 filtered reviews were written by users with 20 or more Yelp friends, and 7 of those users had more than 35 friends. A pretty significant chunk of the filtered reviews were written by social butterflies.
It appears that while friend count does have some impact, it’s not nearly as determinative as the other factors described above. The takeaway is that having Yelp friends helps, but can be outweighed by other factors.
Number of Photos Posted
On the surface, the number of photos posted by Yelp users doesn’t appear to have a profound impact on review visibility...
- Restaurant: 4.5 vs 0.5
- Auto Shop: 7 vs 0
- Plumber: 0 vs 0
- Clothing: 0 vs 0
- Golf Course: 2 vs 0
Obviously, there are no instances in which users with filtered reviews had a higher median photo count than those with visible reviews. But for two businesses, the medians were both 0, and the golf course comparison isn’t terribly compelling either.
However, the raw data tells a very interesting story: very few filtered reviews were posted by users with significant photo counts. Of the 66 filtered reviews, the top five photo counts were 26, 21, 6, 5, and 2. That’s a drastic fall-off: 94% of the filtered reviews were posted by users who had submitted 2 or fewer photos to Yelp.
The takeaway here is that while a lot of Yelp users don’t post photos, posting even a small handful of photos has a pretty good likelihood of getting a user’s reviews out of purgatory.
The Final Analysis of Our Little Yelp Experiment
To compress the couple thousand words above into something short and sweet, here’s what I think. First, I don’t see evidence that a review’s star rating has any impact on whether it is filtered.
Secondly, the other factors in play all definitely have some sway on whether a review is filtered. If I were to rank these four variables in terms of importance, taking into account a user’s time investment (it’d be great if every user wrote 10 reviews, but that takes a lot of time), this would be my ranking:
- Profile Image
- Photo Submissions
- Number of Reviews
- Number of Friends
Setting a profile image and uploading a couple photos of a business requires very little time, and the data indicates that these have a significant impact on the likelihood of a review being visible. After that, the quantity of reviews is very important, but the magical threshold where you’re almost guaranteed to not be filtered is fairly high—around 6 to 9 reviews. The number of friends matters as well, but doesn’t outweigh the factors above (and for users who aren’t inclined to socialize on Yelp, it’s going to be tough to convince them to do otherwise).
The purpose of this analysis was to provide some actionable advice for business owners. So, if you’re managing a business and you want all of your reviews to show up, the data suggests that you don’t need to target Yelp super users. You just need to encourage your loyal customers to not just write a positive review, but also to take a couple minutes to add a profile image and take and submit a couple photos of your business. Then, maybe nudge them to leave reviews for other businesses as well to get their review count up. These little extra actions can significantly heighten the odds that their review of your business will show up.