Scores for wines: it’s getting silly


On balance, I’d rather have scores for wine than not. In a critical assessment of wine, given that we have such an impoverished language for tastes and smells, a score helps a reader know exactly how good the taster considered the wine to be. That’s helpful. But the 100 point scale is beginning to get a bit silly.

This 100 point scale is now the default scale for assessing wines. Some brave souls persist with other systems, but pretty much everyone else scores out of 100.

While it would be nice to make use of the whole scale, everyone calibrates against Robert Parker. But there’s been some subtle grade inflation, such that these days, for decent wine, the scale really begins at about 85, with most of the action taking place in a narrow band from 88 to 95. The source of this grade inflation? Well, guess which rating gets used on a shelf talker? The highest, of course, so there’s a subtle pressure on the major publications to be more generous with their points and so benefit from the free advertising that comes with having their endorsement used in a retail setting.

It has got a bit silly. What sort of world is it where 89/100 is considered a fail? But it is for aspiring winegrowers: to be taken seriously, you need a 90+ score. And while a score of 90 used to sell wines, now you really need to be comfortably in the 90s for your wines to shift, unless they’re cheap.

This means all the scoring action is compressed around the high end of the scale. For the most serious wines, we’re talking about ratings on effectively a six-point scale, from 95 to 100.

Is there any point following critical ratings of Bordeaux 2009 for the top properties? Given the uncertainty of tasting young barrel samples that may not be representative, the fact that critics have to score them as a range, and how compressed the scoring space is at the top end, I guarantee that all the first growths and most of the super-seconds will get very similar scores (95-98, 96-99 or 97-100, for example), yielding very little useful information at all.

What of the future? Will all top wines rate within just a few points of each other? Will some critics end up extending the 100 point scale? Will some resort to using half points, to buy them some extra room to manoeuvre? It’s all a bit nuts.


16 thoughts on “Scores for wines: it’s getting silly”

  1. Oh dear – this looks like a variation on the same theme as the recent Parker discussion! Frankly, whenever I see points mentioned (especially of the /100 variety) I get worried. I’ve tasted enough high-Parker-scoring wines in my time to know that if he (or do I mean He?) scores something very highly, I’m probably not going to like it that much. And to be honest, Jamie (and I am very definitely not trying to be unkind, here – or even score points ;-)) you seem to be scoring your tasting notes ever higher these days. When I am looking wines or growers up on the Internet, Google often takes me to your write-ups, some from a good few years ago. I should add that I find them very useful and valuable. But the evidence (albeit very anecdotal) seems to suggest that your scores have become more generous, to the tune of a good 2 or 3 points, over the last few years.

    Take the Seresin wines, all of which you scored between 93 and 95 points. Are all of them REALLY that good – so good as to be just a few notches short of the greatest wines ever produced in the history of winemaking? I’m not saying they aren’t (I haven’t tasted them) but your scores don’t leave much room for error. Furthermore, they might be worth 93-95/100 Goode points, but are they worth 93-95/100 Parker points (not that I care) or 18.5-19 Robinson points? I’m not saying that your points are any more or less worthy than theirs – just that you probably have totally different palates, not to mention favourite wine styles (or countries/regions).

    I’m no Bordeaux fan, so a 100-point wine with the greatest, most lip-smackingly enticing tasting note would probably not make me shell out for the wine – unless, of course, I had tasted it myself and liked it.

    On the other hand, a 95-point Pinot may (and I stress MAY) be of more interest to me, although the points are still pretty superficial, as far as I’m concerned. The tasting note is all-important, to the complete exclusion of the points. For instance, you refer to one of the Seresin wines (a 93-pointer) as “Quite structured with some chocolatey richness and olive notes. Perhaps even a bit Syrah-like?” Now I may often find a fine old northern Rhone Syrah to be quite “Burgundian” in style (or even vice versa) but chocolate and olives in a Pinot tasting note would not make me want to rush out and buy. I want Pinot to taste of Pinot (i.e. myriad fruity and secondary/tertiary things, none of which would normally include chocolate and olives).

    I could go on (and I know I do, sometimes) but, in short, I say down with points – and up with tasting notes. Of course, it wouldn’t be much to the advantage of those cash-rich, time-poor businessmen who buy their wines based solely on points. Then again, who cares?

  2. Jamie, are scores on the 100 point scale relative or absolute? What I mean is, should a consumer expect a £10 wine that has been awarded 90 to be of similar quality to a £30 wine that has been awarded the same score?

    Enjoying the updates, as ever.

  3. Leon, I think the elevated scores round here reflect the fact that I’m being more selective, as I taste more, and only putting really good stuff up. The Seresin Pinots earned their points – they were superb – truly world class.

    Matthew, it’s absolute, otherwise it would be a total disaster zone. In terms of absolute quality, you’d expect a £10 90 pt wine to be on a par with a £30 90 pt wine.

  4. I think Imbibe have done a good job of putting /100 scores back where they should be, e.g. scoring 60 upwards for drinkable wines.

  5. First off, I sell wine here in Oregon. I tell my customers that if they find a critic, publication or wine steward whose taste matches theirs, they should use it – and, if possible, taste the wines themselves. I would also like to see a list of wines tasted but not enjoyed; I realize that, especially in print, space is tight. I get to taste about 80-100 wines a week as reps bring them by. It is always great fun to taste something that really stands above the crowd. It has been very interesting being here close to many of the Oregon wineries and also having about 35 New Zealand producers on my shelves. So far the only Seresin Pinot available here is Leah. I have been in my store for about 20 years; it is a grocery chain store, and I get to do the buying for my store. I love it. I also bought Jamie’s book The Science of Wine. It certainly makes for interesting reading.

  6. You know that the California wines I like are generally scored in the 85-88 point range by James Laube. If he rates a wine higher than 90, I know it’s too big and over-extracted for my taste. Wines I call elegant, he describes as “slightly thin”. So I just go by my own private Idaho when it comes to this stuff.
    Another very funny example of “ratings gone wild” and why I don’t believe any of them:
    The Spectator rated the 2007 Vieux Telegraphe Chateauneuf-du-Pape 95 points, and Decanter rated it as one of the worst of the whole vintage… so the question is, whose rating inflation are we talking about?

  7. After playing around with lots of different scoring systems for my own notes, I settled on a simple five-point scale:
    1=rubbish
    2=poor
    3=OK
    4=good
    5=great

    I would only ever buy a wine again if it scores a 4 or a 5. The problem with any other scale (even a 10 point one) is that everything gets compressed and you never use most of the available points.

  8. “…Some brave souls persist with other systems…”

    I think your use of the term ‘brave souls’ implies the 20-point users and those using other systems are somewhat out on a limb. That’s hardly the case…in that corner we have Jancis Robinson, anyone who scores for a Decanter tasting (at least I believe that to be the case…the Decanter Awards are certainly scored out of 20) and perhaps the best-quality printed magazine currently available, The World of Fine Wine, which also scores out of 20. In fact, you write for that publication, don’t you Jamie? Do you consider Neil Beckett and his tasting panels – including the likes of Tom Stevenson, Michael Edwards, Stephen Brook, Michael Schuster (just looking at Issue 26) and so on – to be “brave souls”?

    A point system is a method of succinctly expressing an opinion, good or bad, but says nothing else about the wine – for that you need a tasting note. Shifting sheep-like towards the 100-point system implies the opposite, that only the final score matters, and it also implies that the score is ‘transferable’ in some way, suggesting that we must all use the same system because one person’s 96-point wine is the same as the next person’s 96-pointer. But that’s plainly not true. Reading The World of Fine Wine panel tastings shows that even experienced tasters differ in their scoring of a wine, depending on their style preferences, their likes and dislikes. With this in mind I always find it amusing when other wine critics comment along the lines of “…tasted 200 wines from Bordeaux with X and Y today, our scores were remarkably similar, shows what great tasters we are…”. I can hear the mutual back-slapping all the way up here in Edinburgh.

    In addition, regarding the ‘default’ 100-point system, I think anyone who underestimates their readers’ ability to understand a 5-star, 20-point or any other system just as well as a 100-point system is very much wide of the mark. I’m all for individuality of thought and independence of opinion, both with regard to the opinion of the wine and how that opinion is expressed/summated.

    On the issue of grade inflation, I am in full agreement. I have long joked that the 100-point system is now only an eight-point system, these points being (1) 89 and below (the failures), (2-7) 90-95 points and (8) 96 and above (unaffordable wines). In that respect it is certainly no better than a 20-point system scoring from 11 to 20 (as Clive Coates did) which with half-points gave 19 point levels.

  9. Jamie, I tried to “re-invent” and extend the 100 point scale in my book “Washington Wines & Wineries” (UC Press). I made a strong case for rating wineries, not wines, and expanded those ratings to begin at 50. Only one winery was rated 100 points, and only a handful rated 90+. I found that readers liked the system, the critics were kind, and the wineries hated it. Although only 25% of the state’s 500 (at the time) wineries scored 50 or better – meaning 50 was a decent score, 60 a good score, 70 excellent and 80 really stellar – the feedback was that wineries could not explain to a potential tasting room customer why they “only” got a 75. So most wineries refused to carry the book in their tasting rooms, which is where most sales occur. Needless to say, it cost me. I will not use the system again, but I don’t regret giving it the old college try.

  10. I sometimes find those scores useful. While in a store perusing, say, the Piedmont selection of producers unknown to me for a last-minute purchase, I will pick the 88-pointer, which is usually a lot more balanced and drinkable than the 92s and above. I have learned to adjust to those numbers.

  11. I recently came up with a wine rating system on my wine blog. The top end of the scale is “I keep on drinking it”. I have found it considerably more useful than most rating systems I have read by wine writers who earn a heck of a lot more money rating wine than I do.
    Alternatively, I revert to the coliseum method of rating wines – thumbs up or thumbs down. This too is surprisingly effective.
