A lot of wine critics rate all of the wines they taste out
of 100.
This is obviously a bit daft. Rating them out of 10 or 20,
maybe - or it could be useful when you're tasting a lot of wines at once - but
tasting one wine in isolation and scoring it out of 100? Nah, it's daft.
If you're lucky enough to have an incredible wine experience
- the kind of thing where the world around you lights up as you take a sip, the
perfume and texture of the wine being absorbed into your every sinew, a feeling
of poignancy rising up through your body as the wine slides down - then that
experience is beyond the realm of a number. It'd be disrespectful to assign it
a rating. It'd be like rating your friends or loved ones. Or your favourite
places in the world, or your most memorable life experiences. The most powerful
things in life, the most moving, the most unique to you, the most sacred. Can
you rate them out of 100?
Robert Parker and the anti-flavor elite
American critic Robert Parker is the
most successful at rating wines out of 100. His scores can dictate the selling
price of fancy wines. But a lot of British writers in particular don't seem to
be his biggest fans. They criticise him for liking a style of wine they don't
like (he favours rich, full wines with relatively sweet fruit; they don't).
He, in turn, amusingly refers to them as "the
anti-flavor wine elites".
Wine scores: for consumers, or for critics themselves?
I may be wrong, but I get the impression that some British
wine writers' hearts aren't in it when they award 100-point scores - they do it
anyway, because not to award scores would be to miss out on the
opportunity to share in Parker's success. Critics rightly make a big play of
having the consumer's welfare at heart, but really - understandably - they're
out for themselves too. They're not vinous Mother Teresas - one-person
charities who've given up their lives and livelihoods to help consumers find
good wines. They're earning a living. (Evidence of this can be seen in the
countless articles and debates about the decline of wine columns and wine
writers' salaries.) They're trying to help consumers find good wines, which is
great, while also trying to make a good living out of it themselves. Where that
balance of priorities lies will of course differ from critic to critic. But if
I were reluctantly rating wines out of 100 because it felt like the done
thing, while having little confidence it made much sense for consumers, I'd
stop doing it.
Can the consumer who's mildly interested in wine reliably
compare the ratings of different critics? And can they multiply a critic's
rating out of 20 by five to get an equivalent 100-point score? No, and no.
Sine Qua Non wine: very poor or perfect?
Sunday Express wine writer Jamie Goode has written an
interesting blogpost
about a wine called Sine Qua Non Shot in the Dark, which he reckons is an
86-point wine, whereas Parker thinks it's a perfect 100-pointer.
Now if you're an average consumer with no more than a
passing interest in wine, you might reasonably assume 86 means it's still a
pretty good buy, even if it's not perfect (a score of 70 at a British
university typically earns first-class honours). But in his post, Jamie says
"this is actually a poor wine". So 86 means poor - yet I know that wines
scoring 89 and upwards are very good. So a lot of ground must be covered in 87
and 88.
Jamie Goode's post raises some good questions about
objectivity and subjectivity. And it got me thinking: what proportion of a
critic's 100-point score relates to the objective quality of the wine,
and what proportion is about how much the critic enjoyed the wine?
Because you might accept Shakespeare was a genius, but hate his plays. In which case, what score for Romeo & Juliet?
Does 86 really equate to poor? I thought 86 counted as poor in this case mainly because it was a £475 bottle?
I used to be a member of the wine gang. I approached Tom Cannavan about their 100 point scoring system, as I felt it was terribly confusing for the consumer. They medal their wines gold, silver, bronze - bronze starting at 85. The view here being that if you pay £5 for a bronze, it's a decent enough wine - but if you pay £50 for a bronze - not so. Their medals go up in increments of five or so from 85.
The striking thing he told me is that the 100-point score is really a 30-point score. If there is a technical fault with the wine, it doesn't score at all. If there is no technical fault with the wine, you then start at 70 and work your way up. They've had a few sub-85-point wines on their site that they've suggested would be good for knocking back at a party or an early lunch, but nothing you'd want to cellar.
It seems strange that critics are really only scoring out of 15 - and that even though they regularly taste blind, once they know what the wine is, the price can then affect their judgement of it.
Surely this also allows winemakers and supermarkets to pass off "average" wines by just putting the numbers on - like tailoring quotes on film adverts. How many people buy wines on medals - and how many assume 85 is a great score? If you were to compare it to your exam history, 85 out of 100 is the stuff of brainiacs - not an average result whose merit depends on the price you pay.
Thanks for the comment, Chris.
I agree - the fact that scores only seem to fall within that really narrow band does seem odd.
Like I suggest in the post, I reckon scoring wines out of 100 when you're tasting, say, 60 of them within a couple of hours is understandable, but I just think scores aren't necessarily very helpful for the majority of consumers, who don't realise you can't compare scores from different critics like-for-like.