jamie goode's wine blog: Are wine competitions reliable? Data from the USA

Tuesday, September 01, 2009

Are wine competitions reliable? Data from the USA

An interesting paper from The Journal of Wine Economics has analysed the reliability of judging at wine competitions in the USA. You can download it for free here.

From the abstract:
An analysis of over 4000 wines entered in 13 U.S. wine competitions shows little concordance among the venues in awarding Gold medals. Of the 2,440 wines entered in more than three competitions, 47 percent received Gold medals, but 84 percent of these same wines also received no award in another competition.
There's some good analysis of data in the paper. The conclusions are extremely discouraging:
(1) There is almost no consensus among the 13 wine competitions regarding wine quality, (2) for wines receiving a Gold medal in one or more competitions, it is very likely that the same wine received no award at another, (3) the likelihood of receiving a Gold medal can be statistically explained by chance alone.
We can conclude that there's something fundamentally wrong with these US competitions. Is it a US-specific problem, or does it also apply to competitions in other countries? Is it that the judges used simply aren't good enough tasters in this sort of environment?
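
To make the 'chance alone' conclusion concrete, here's a quick back-of-the-envelope simulation. This is my own sketch, not taken from the paper: the per-competition Gold and no-award probabilities and the number of entries per wine below are purely illustrative assumptions, not figures from the study.

# A minimal sketch, not from the paper: assume each competition independently
# awards Gold with probability P_GOLD and "no award" with probability P_NO_AWARD,
# regardless of the wine. All the constants below are illustrative guesses.
import random

random.seed(1)

P_GOLD = 0.12      # assumed chance of a Gold at any one competition
P_NO_AWARD = 0.55  # assumed chance of getting nothing at all
ENTRIES = 5        # assumed number of competitions each wine enters
WINES = 10_000     # number of simulated wines

def judge():
    """Return a random outcome for one entry at one competition."""
    r = random.random()
    if r < P_GOLD:
        return "gold"
    if r < P_GOLD + P_NO_AWARD:
        return "none"
    return "other"  # bronze/silver/commended etc.

at_least_one_gold = 0
gold_but_snubbed_elsewhere = 0
for _ in range(WINES):
    results = [judge() for _ in range(ENTRIES)]
    if "gold" in results:
        at_least_one_gold += 1
        if "none" in results:
            gold_but_snubbed_elsewhere += 1

print(f"wines with at least one Gold: {at_least_one_gold / WINES:.0%}")
print(f"...of which also got no award at another venue: "
      f"{gold_but_snubbed_elsewhere / at_least_one_gold:.0%}")

With made-up inputs like these, roughly half the simulated wines pick up a Gold somewhere, and almost all of those also get nothing at another venue, even though every 'wine' is identical and the 'judging' is pure coin-flipping. The exact percentages depend entirely on the assumed numbers; the point is simply that the pattern described in the abstract doesn't require any real quality signal.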

[The same author has also published a paper looking at judge reliability at one of these US competitions.]


17 Comments:

At 6:42 PM, Blogger Nerval said...

As a former judge in several European competitions, I can say that, as a taster, I have found them extremely disappointing.
Why?
1. Too many wines are tasted in a series, making the process very tedious.
2. 'Flights' of wines are too repetitive, and after 20 one-year-old unoaked Chardonnays they all taste the same.
3. Not enough time for judgment (see point 1): sessions are excessively packed, with typically only 45-60 seconds for assessment by the end of a session, precisely when the wines are becoming perceptually too similar.
4. Most wine competitions use the OIV competition sheet, which is utter nonsense: the final score is a sum of elements such as 'limpidity' and 'frankness' of the nose.
5. Juries leave much to be desired: they are made up of people who meet for the first time, often mixing oenologists (mostly interested in technical correctness), vintners, and journalists (often interested in personality at the expense of correctness). I have sat on juries with people who had no professional tasting experience whatsoever (members of wine clubs or similar); at some of the more provincial competitions there is effectively an 'anybody-can-judge' policy to boost international attendance.
6. At many competitions there's no discussion whatsoever among jurors, resulting in wines that are scored 65 by one juror and 95 by another.
7. Wines are served in ISO glasses, making assessment of the bouquet very difficult (and also affecting performance on the palate).
8. At many competitions in Europe, the average level of the wines presented is really low. I've judged at Best of Riesling, arguably one of the leading events, and out of 200 Rieslings tasted there were 3 producers of any note; the rest was anonymous plonk. Perhaps New World competitions are an exception to this, as is the IWC.
9. Nobody will agree, but blind tasting (an iron rule at all competitions) is pointless. Often at a contest you have a series of wines from France, Austria, Moldavia, Thailand and Australia, made from the same grape and similarly priced. The French wine wins statistically by 0.5 points. What's the use of such a verdict?

My conviction is that competitions are useless to the educated consumer. The recommendation of a magazine or a single taster is much more valuable, in that wines are tasted in more depth by individuals whose preferences are made clear to the reader. A good wine writer puts wine in context. A competition makes all wines taste the same.

 
At 6:57 PM, Blogger Jamie said...

Nerval, thanks for that insightful comment - I have to say that I agree with you.

 
At 7:58 PM, Anonymous Anonymous said...

I've judged in New Zealand and Australia, as well as in the U.S. I'm not at all surprised by the conclusions of the study, though I am a bit surprised that anyone takes U.S. competitions seriously. The Australian show system works quite well, and the New Zealand shows I've judged are equally well run with excellent and well-trained judges (okay, maybe not counting me).

 
At 8:55 PM, Blogger Andrew Halliwell said...

I have to say that I'm pretty skeptical about wine competitions in general, and Nerval's comments and the research mentioned by Jamie more or less confirm my view.

I've come round to thinking that if you enter enough competitions with the same wine, assuming it's at least ok, you're bound to win something. But what does a bottle proudly displaying the only bronze it has managed to scrape really mean?

 
At 9:05 PM, Anonymous Anonymous said...

I would never use the fact that a wine has a medal when making my choice. I prefer to trust the wine bloggers/reviewers who I have learnt over time have similar tastes to mine.
I don't know how many wines are tasted at the various awards or by whom.

 
At 2:15 AM, Blogger Claude Vaillancourt said...

How do we know that a judge or a wine critic is a good enough taster? How many well-known wine critics or writers would agree to take a test to assess how good they are at tasting wine? I work in a laboratory, and on a regular basis we need to verify the accuracy and precision of our analytical instruments with standards and controls. It is the only way to trust the results. The problem is that smell and taste are very unreliable senses, and wine is a product that varies over time and from one bottle to another.

 
At 8:40 AM, Anonymous Charles Metcalfe said...

As Co-Chairman of the IWC (International Wine Challenge), I agree with some points made and disagree with others.
At the IWC, panels discuss everything before awarding scores (like the Australians and New Zealanders, unlike the OIV system, which is purely arithmetical).
We use glasses (as it happens, from Schott-Zwiesel) that are brilliant for assessing wines (and very robust!).
We have 2 rounds of tasting. In the first 5 days, we taste all the wines, and sort them into 3 categories: wines worthy of a medal, wines that are commendable, and wines that get nothing. In the last 4 days, we retaste the wines deemed worthy of medals in the first round, and decide what they get.
We ask our tasters to taste about 100 wines a day, and never give them a flight of more than 15 wines. Over the 26 years the IWC has existed, we have found that mixing flights of different types over the day keeps the tasters engaged and makes it easier to concentrate.
We have 20 panel chairs who judge for the 9 days of the IWC. They are chosen for their tasting ability, and ability to arrive at a score for each wine by discussion, not by bullying their panels.
The 4 permanent Co-Chairs plus each year's Guest International Co-Chair check all the scores. In the first round we taste only the wines that are Commended or thrown out, to see if any have been unfairly scored. In the second round, we retaste everything. If we want to change any score, we have to seek approval from another Co-Chair.
This means any wine that wins an IWC medal has been tasted at least 3 times, and sometimes as many as 6. The Trophy-winners come from among the Golds awarded, and are judged on a separate, 9th day by panels made up only of Co-Chairs and panel leaders. As far as I know, no other wine competition in the world is as thorough as this in its judging process.
We tell the tasters everything about the wine except who made it and the price it is sold for. Country, region, sub-region, grape varieties, vintage, sugar levels are all revealed. This is the best way to give our tasters a chance to judge the wine according to what it should taste like.
We choose our tasters carefully, and rank them as full Judges and Associates according to our experience of how good they are. Some tasters we do not re-invite because we don't consider them good enough.
Throughout the history of the IWC, we have welcomed suggestions of how we could improve our process. It's by listening to these, and changing things when someone comes up with a good new idea, that we believe we have evolved to be the best, most thorough and fairest wine competition in the world.

 
At 8:42 AM, Anonymous Charles Metcalfe said...

Sorry, I should have said that the Trophies are awarded on a separate, 10th day. (Just as well our system is NOT arithmetical!)

 
At 9:01 AM, Blogger Jamie said...

Thank you Charles, for that useful perspective. I should add that I am one of the panel chairs at the IWC that Charles refers to, and I agree with him that the IWC is the most rigorously judged competition out there. I have some confidence that the research paper's findings would not apply to the IWC.

 
At 10:11 AM, Anonymous andyincayman said...

As an "expert consumer" (think Jamie would call me a hobbiest) rather than wine professional or general consumer I personally don't take much notice of wine awards.

My main issue is that I find the wines that win uninteresting (I cannot say I have tasted them all - if I had, I would have to be very rich or an alcoholic!).

This is probably due to the averaging process that happens with panels and multiple tastings (even with sophisticated processes), but also partly due to interesting wines not being entered for fear of being marked down.

Where they are useful, however, is as a learning tool: award winners are very likely to be good examples of a specific wine region/grape variety, so they serve as a good introduction.

 
At 12:31 PM, Anonymous Andrew said...

Competitions are only as good as their entrants, and there is no competition in the world that has the world's best wines as entrants. Thus all wine competitions everywhere are a 'best of the rest'.

 
At 1:15 PM, Anonymous Alex Lake said...

I've nothing much to add, but would just like to say what a great set of comments.

One thing that would be interesting in a context like the IWC would be to enter the same wine 2 or 3 times to see if it gets a consistent ranking. Maybe they do that already...

 
At 1:20 PM, Anonymous Anonymous said...

Interesting that Jamie agrees with Nerval's remarks criticising European wine competitions, then suddenly remembers he's heavily involved in the IWC, reads Charles Metcalfe's comment, then backtracks, and says that the IWC is different.

Fancy that!

 
At 3:05 AM, Blogger BeyondTheMargin said...

The plethora of gold medals that adorn relatively common wines demonstrates that wine appreciation is subjective and largely determined by individual preferences and the ability to discern the subtleties of each wine. While I don't claim to be a professional, I do find some "gold medal" wines to be quite ordinary, and I question their legitimacy.

http://www.beyondthemargin.net/2009/09/santa-barbara-county-vineyards.html

 
At 3:48 AM, Blogger kevin said...

Alex asked about the same wine being entered several times to check consistency. I know of several occasions where the same wine has been entered in a show under different labels only for one to get gold and the other no award. I know of one instance where one got the trophy and the other no award.

We are all familiar with so-called "show wines": wines that stand out in a line-up but aren't very pleasant to drink.

There are all sorts of problems, such as serving temperatures, breathing time, etc., that favour some wines and disadvantage others.

If a wine doesn't have an immediately obvious nose it is at a real disadvantage, as it can't make up those points against a wine that has obvious aromatics.

One factor that has always concerned me is those wines with limited length of palate that don't get found out by the swirl and spit approach.

The comment often heard is that the best wines don't enter. Perhaps they do; perhaps it is just the best-known wines that don't enter.

 
At 4:16 AM, Blogger ned said...

Wine competition is, on the surface, a totally comprehensible concept; the only problem is that it is ACTUALLY ridiculous.
Nerval's points, even if only half true, are damning enough.
All these various things (competitions, scoring, etc.), originally designed to aid the consumer, have now been co-opted by the trade and can no longer be trusted by consumers as being in their interest first.

 
At 9:07 AM, Anonymous Anonymous said...

An IWC sticker would for me be a negative point for a wine rather than a positive.

 
