
Last Year's Version

Larry Koestler, January 16, 2014

Spurred on by Eno's post from the other day, I got to thinking about whether one can accurately judge something like a DIPA to be better or worse than a previous year's iteration.

In particular, I've seen a fair number of people claim that this year's Lagunitas Sucks (and by this year's I'm referring to 2013, since it came out in December) isn't as good as last year's, but how can anyone really remember that? You last drank 2012 Sucks one year ago, and unless Lagunitas magically furnished you with a fresh bottle of a 12-month-old DIPA, you can't ever do a side-by-side. I suppose the people who took detailed notes on the previous year's iteration can back their claims up, and on a batch-by-batch basis for a year-rounder, comparison is certainly fair game. But playing the "not as good as last year's" game with annuals (I'm certainly guilty of this myself) -- unless you have otherworldly palate recall or the brewery specifically announced changes to the recipe (or something went very, very wrong during the brewing process of a given annual, resulting in infection/off flavors) -- is really just subjective noise (and yes, I realize I am also describing all beer reviews).

That doesn't mean that people can't/shouldn't do it. Beer geeks want what they can't have. And typically when you're particularly passionate about a given hobby you also enjoy ranking items within that hobby (otherwise you wouldn't be reading this). Nothing beats that first time when you finally track down the beer you've been moving mountains to get a hold of. In my experience, people rarely seem to find a beer they hold in high esteem to be as good or better on subsequent visits -- and you could probably apply that statement to just about anything, not just beer -- and so we often find ourselves with unrealistic and unmet expectations.

There are a few other factors at play here as well: this post began as a conversation with my fellow Beergraphers, and Eno noted that part of the reason he may have perceived this year's Sucks to be "not as good" is the rapidly shifting beer landscape. There have never been more choices available to the craft beer drinker, and while not all of them are great, a lot of them are.

The perfect case study for this particular conundrum is Pliny the Elder. Say you are newish to craft beer, live on the east coast and quickly get sucked into a Beer Advocate rankings-page hole. One of the first things you'll see is that Pliny the Elder is the third-highest-ranked beer in the world. So maybe you work out a trade for some Pliny. Unfortunately, Pliny -- like many of the highest-rated beers -- has an extremely small geographical footprint, and if you manage to swing a deal for a bottle, you have little control over how fresh the bottle is, as well as how well it held up during a cross-country voyage its brewer never intended it to make. And you drink it and think "this is pretty good, but third-highest-rated beer in the world good?" And that is in small part due to having to trade to acquire it, but increasingly due to the fact that there are SO many brewers out there now making REALLY great DIPAs that (a) can probably be obtained without having to take a trip to UPS/FedEx, and (b) are brewed close enough to where you live that you are ideally experiencing them as freshly as possible. Pliny continues to rank well because it's still a wonderful DIPA (or so I've been told; I've yet to have it on tap), and by virtue of being among the first of its kind it likely won't be plummeting in the ratings any time soon, since BA's algorithm prioritizes (as it should) number of ratings. Though I suspect things may look a bit different at the top in another five years as more and more breweries enter the fray, distribution widens, the number of people switching to craft beer continues to increase, and palates evolve.

Let's look at the data to see if we can get a sense of how beer drinkers actually do rate subsequent iterations of a given beer. This is far from a perfect analysis, as we are subject to the whims of Untappd users with regard to both the creation of unique vintage profiles for certain beers (Untappd has its own set of guidelines on this, although they're not always strictly adhered to) and user behavior ("Do I check in to the vintage, or into the beer's main check-in page?").

Here's Lagunitas Sucks:

[Chart: Untappd ratings for the Lagunitas Sucks entries]

There are a couple of logistical issues here -- for starters, by Untappd standards Sucks shouldn't have separate entries for vintages (despite being an annual release) since Lagunitas doesn't indicate a year on the label and the recipe hasn't changed. Also, unless you are a fiend for categorization, you're probably checking into the main check-in profile for Sucks since it's the first one that turns up when you search Untappd for "Sucks." The ABV is also incorrect for 2013 and 2014 (it's 7.85%). And lastly, in my view "Lagunitas Sucks (2014)" shouldn't have been created until the end of this year, as the bottles that are on shelves now were brewed in 2013. However, the "Lagunitas Sucks (2013)" entry likely contains check-ins from last January, so we have a fairly muddled data pool. If we go strictly by what's in front of us here, it does appear as though folks are enjoying the latest version of Sucks slightly less.
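For what it's worth, here's a minimal sketch of how one might roll those three entries up into a single check-in-weighted rating instead of eyeballing them separately. The ratings and check-in counts below are made-up placeholders, not the actual Untappd numbers, and the structure simply assumes you've pulled an average rating and a check-in count for each entry.

```python
# Illustrative sketch: pooling a beer's main Untappd entry with its vintage
# entries into one check-in-weighted average rating. All numbers here are
# made-up placeholders, not real Untappd data.

entries = [
    # (entry name, average rating, number of check-ins)
    ("Lagunitas Sucks",        3.9, 50000),   # main check-in profile
    ("Lagunitas Sucks (2013)", 4.0, 4000),    # vintage profile
    ("Lagunitas Sucks (2014)", 3.8, 1500),    # vintage profile
]

total_checkins = sum(count for _, _, count in entries)
pooled_rating = sum(rating * count for _, rating, count in entries) / total_checkins

print(f"Pooled average across {total_checkins} check-ins: {pooled_rating:.2f}")
```

A check-in-weighted pool at least keeps the lightly used vintage entries from swinging the overall number, though it obviously can't untangle which year any given main-profile check-in actually refers to.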

Here's another beloved annual release, Sierra Nevada Ruthless Rye:

[Chart: Untappd ratings for the Sierra Nevada Ruthless Rye entries]

Ruthless appears to be bucking the "last year's version was better" trend.

And finally, I thought I'd take a look at Stone's universally beloved Enjoy By series.

[Chart: Untappd ratings for the Stone Enjoy By series]

And here's a chronological graph showing the variations in Enjoy By's reception:

[Chart: Enjoy By reception over time, via BeerGraphs]

Despite the recipe never changing, almost every iteration of Enjoy By has a different Style+ (the average of all 14 editions is 106) and a different BAR (though the latter is due in part to Stone's calculated limited-distribution marketing plan, which appears to be going out the window with the 02.14.14 release). This is mostly due to the logistics of how you enjoyed your Enjoy By -- if you bought a bomber, how quickly did it get to your market and when did you crack it open; was it on tap; at what point during the ~35-day "enjoy by" period did your local bar decide to tap it, etc. -- along with the inevitable, if unwarranted, comparison to previous batches.
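As a rough illustration of how one might summarize that edition-to-edition variation, here's a small sketch that averages Style+ across the editions and reports the spread. The per-edition values are placeholders chosen for illustration only; the one real figure from this post is that the 14-edition average works out to 106.

```python
# Illustrative sketch: summarizing Style+ variation across Enjoy By editions.
# These per-edition values are placeholders, not the real BeerGraphs numbers;
# the only figure taken from the post is the 14-edition average of 106.
from statistics import mean, pstdev

style_plus = [109, 104, 107, 102, 111, 105, 108, 103, 106, 110, 101, 107, 104, 109]

print(f"Editions: {len(style_plus)}")
print(f"Average Style+: {mean(style_plus):.0f}")
print(f"Spread (std dev): {pstdev(style_plus):.1f}")
```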

I think we'll get a better sense of how people's perceptions of a given beer might change on a year-by-year basis once BeerGraphs has enough data to start doing beer player pages.

As far as this latest edition of Lagunitas Sucks goes for me? I am hard-pressed to find anything about it that I like less than last year's iteration, and I can be excessively tough on IPAs and DIPAs, as the two styles comprise ~99% of my beer consumption. I am loving this year's Sucks so much that I wish it were available to me year-round, as it would easily be a top-five "available to me for purchase in Texas" DIPA.

More from me: My Top 18 Beers of 2013 and an even screedier screed on freshness

Larry Koestler is a craft beer evangelist and freshness zealot who spends his free time scolding others for purchasing old IPAs. You can follow him on Twitter and Instagram, and connect with him on Untappd and Beer Advocate.
