People use Untappd in a variety of ways. Some check in to every beer, while others only go after uniques and use it more like a beer journal. Some people never rate the beers they're having, and others give every beer a star rating. In our most recent sample of over four million check-ins, 83.5% of check-ins included a rating; the rest were just check-ins.
Does the percentage of check-ins that include a rating bear any relation to the quality of the beer? I grouped beers by the percentage of their check-ins that carried a rating, with a minimum of 1,000 check-ins for a percentage group to qualify, then calculated the average rating for each group and plotted the results. As you can see, users tend to rate good beers more frequently.
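The grouping described above can be sketched in a few lines of plain Python. This is a minimal illustration, not the actual analysis code: the record format, beer names, and ratings below are made up, and the real dataset applied much larger minimums (1,000 check-ins per group, 50 per beer).

```python
from collections import defaultdict

# Hypothetical check-in records: (beer_id, rating or None when the user
# checked in without rating). Real data would come from Untappd exports.
checkins = [
    ("beer_a", 4.0), ("beer_a", 3.5), ("beer_a", None), ("beer_a", 4.25),
    ("beer_b", 2.5), ("beer_b", None), ("beer_b", None), ("beer_b", 3.0),
]

def rating_stats(checkins, min_checkins=1):
    """Per beer: (percent of check-ins that carry a rating, average rating)."""
    totals = defaultdict(int)
    rated = defaultdict(list)
    for beer, rating in checkins:
        totals[beer] += 1
        if rating is not None:
            rated[beer].append(rating)
    stats = {}
    for beer, n in totals.items():
        if n < min_checkins or not rated[beer]:
            continue  # too few check-ins, or never rated at all
        pct = round(100 * len(rated[beer]) / n)
        avg = sum(rated[beer]) / len(rated[beer])
        stats[beer] = (pct, avg)
    return stats

def bucket_by_pct(stats):
    """Group beers into whole-percent buckets and average their ratings."""
    buckets = defaultdict(list)
    for pct, avg in stats.values():
        buckets[pct].append(avg)
    return {pct: sum(v) / len(v) for pct, v in sorted(buckets.items())}
```

With the toy data above, `beer_a` lands in the 75% bucket and `beer_b` in the 50% bucket; plotting each bucket's average rating against its percentage gives the graph discussed below.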
We can look a little closer, though. The average rating across this sample of beers is 3.55. This is slightly lower than the average of the raw sample because I stripped out beers with fewer than 50 check-ins for sample-size reasons, but it's a good baseline.
You'll notice on the graph above that the 3.55 average rating lines up pretty well with the 83.5% rated mark. Above 83.5%, beers seem to jump in quality. They also jump in check-in density, which isn't shown on the graph.
The five busiest percentage groups all fall between 85% and 89%, with 88% leading the way at over 500,000 check-ins. This doesn't look like a steady progression, though; it's more of a jump. Above the 83% average, the blue dots all leap into the next tier, suggesting that drinkers tend to rate better beers more often and then keep drinking those beers, pushing each beer's percentage of rated check-ins even higher. This is a logical conclusion: good beers are delicious, so people drink and rate them.
Also interesting is the dropoff among lower-rated beers. Quality falls significantly as fewer and fewer users rate a beer, until it drops off the graph altogether. This is the inverse conclusion: users don't rate bad beer as often, and even when they do, they don't keep drinking it.
The interesting outliers are the beers rated 99% and 100% of the time. The 99% beers didn't even have enough check-ins to make the graph, and the perfect 100% beers show a distinct drop in quality. Perhaps it's a factor of beer hype: people seek out highly regarded beers, end up disappointed when one doesn't live up to the reputation they had in their mind, but still want to weigh in with a rating. More likely, it's simply that it's impossibly hard for a beer to be rated nearly every time.
The 100% line on the graph represents nearly 57,000 distinct beers, while most of the other percentages cover only two to three thousand, so it appears to describe a different subset of beers. Most of them receive only a bare minimum of check-ins, making the results wildly unpredictable, because often they're being rated by just one or two people. Any beer good enough to keep being purchased will eventually drop into a lower percentile; it's really hard to maintain a perfect game.
The conclusion here isn't groundbreaking, but it's still an interesting one as we continue to learn what we can get out of Untappd check-ins: users tend to rate, and keep drinking, good beers more often than bad ones. The beer with the highest percentage of rated check-ins, with a minimum of 100 check-ins, is RJ Rockers Light Rock Ale, with a BAR rating of 9.08; it's the only such beer rated 100% of the time. At the opposite end, rated only 61.3% of the time, is Hogs Back Brewery TEA (Traditional English Ale), coming in with an astounding -5.57 BAR.