“Overall, the average score and average price are the same as in 2014’s Top 100: 93 points and $47 — an excellent quality-to-price ratio.”
That the magazine’s editors could write this speaks to how screwed up scores are and to how little the Spectator understands about the relationship between quality and value. A few thoughts:
• A $47 wine should get 93 points, if only because it costs $47. What’s the point of buying it otherwise? I could just as easily buy a $35 wine that got 90 points, which offers a better dollar-per-point ratio (a concept that, as I write this, makes my stomach turn).
• If I owned a winery and spent the millions of dollars necessary to make $47 wine and I didn’t get at least 93 points, the winemaker’s job would be in jeopardy. Baseball managers who don’t win get fired; why not winemakers?
• True value is a $10 wine that gets 88 or 90 points, a dollar-per-point ratio of .11, vs. the .51 for the $47 wine (sorry — couldn’t help myself). These are the wines that score-driven consumers have been taught to buy, and I hear from them all the time. “Parker gave that $12 wine 90 points. Do you know where I can find it?”
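For anyone who wants to check the stomach-turning arithmetic, the ratio above is just price divided by score. A minimal sketch, using the post’s own example prices and scores:

```python
def dollar_per_point(price: float, score: int) -> float:
    """Price divided by score: lower means more value per point earned."""
    return price / score

# The Spectator's $47 / 93-point average vs. a $10 value wine at 90 points.
print(round(dollar_per_point(47, 93), 2))  # -> 0.51
print(round(dollar_per_point(10, 90), 2))  # -> 0.11
```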
• No score can guarantee whether you’ll like the wine. No. 21 on the list, with 93 points, is the Cloudy Bay sauvignon blanc from New Zealand. It’s a nice wine, but certainly not my favorite New Zealand sauvignon blanc and certainly not the 21st best wine of 2015 if I were doing the ranking.
• And, in one of those peculiarly Spectator leaps of logic, the rankings list scores and boast about them, but the wines aren’t ranked by scores. Rather, they are chosen for “quality, value, availability and excitement.” Excitement? Did Fred Sanford judge the wines this year?