One thing that struck me when studying wine sensory evaluation last year was that, to do a tasting properly, replicated judgings surely ought to be employed. No large competition has yet found itself sufficiently resourced to do this. But now that many 2012 wine award results are out, we can decide which wines are best, right? Well, no. Awards can be inconsistent between competitions, and formal studies have been written concluding that luck plays a large part. Still, as a bit of fun I thought I’d look through the 2012 results for English wines to see if any patterns emerge, and also at just how inconsistent the results are. But simply slamming the results together and stuffing them through a stats package is not so simple. Different producers enter different competitions, so the data is very sparse. And how is one to judge, for example, an IWSC ‘silver outstanding’ against a Decanter ‘silver’ or a UKVA ‘gold’? Should all competitions be judged equal, or are some more equal than others? For example, I did not include UKVA trophies in the list, but simply categorised them as golds. There’s no right answer here.
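One crude way to tackle the medal-equivalence question is to map each competition’s award levels onto a common numeric scale. To be clear, the scores below are entirely my own illustrative assumptions, not any competition’s official equivalence:

```python
# Hypothetical mapping of each competition's award levels onto a common
# 0-100 scale. All values here are illustrative assumptions only.
AWARD_SCORES = {
    ("IWSC", "gold"): 95,
    ("IWSC", "silver outstanding"): 90,
    ("IWSC", "silver"): 85,
    ("IWSC", "bronze"): 75,
    ("Decanter", "gold"): 95,
    ("Decanter", "silver"): 85,
    ("Decanter", "bronze"): 75,
    ("Decanter", "commended"): 65,
    ("UKVA", "gold"): 90,   # trophies categorised as golds, as in the text
    ("UKVA", "silver"): 80,
    ("UKVA", "bronze"): 70,
}

def score(competition, award):
    """Return a common-scale score, or None if no mapping is known."""
    return AWARD_SCORES.get((competition, award.lower()))

print(score("IWSC", "Silver Outstanding"))  # → 90
```

Of course, any such table just moves the arbitrariness into the numbers, which is rather the point of the paragraph above.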
Well, I had a stab at coming up with a basic model to include some of the relevant factors, and I’ve written up a full list of all the consolidated results for 2012. It’s important to note that I do not have access to data on wines which were entered but did not receive any award or commendation: so where there’s a blank it probably means the wine was not entered, but it might mean it didn’t make the cut.
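The consolidation step itself is straightforward to sketch: lay the sparse results out as a wines-by-competitions table, where an absent entry means “probably not entered”. The example rows below are invented for illustration:

```python
# Invented example rows (wine, competition, award) -- illustrative only.
results = [
    ("Camel Valley Rosé 2010", "IWSC", "gold"),
    ("Camel Valley Rosé 2010", "UKVA", "gold"),
    ("Furleigh Bacchus Fumé 2010", "Decanter", "silver"),
]

# Build a sparse wines-by-competitions table. An absent key probably
# means the wine was not entered, but the data cannot distinguish
# "not entered" from "entered but unplaced".
table = {}
for wine, competition, award in results:
    table.setdefault(wine, {})[competition] = award

print(table["Camel Valley Rosé 2010"])  # → {'IWSC': 'gold', 'UKVA': 'gold'}
```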
Most impressive for me on the list are the Camel Valley Rosé sparkling 2010, Furleigh Bacchus Fumé 2010, Furleigh Classic Cuvée 2009 and the Denbies Late Harvest 2011, which all have high medals in multiple competitions. With Furleigh also taking top UKVA honours for two further wines, you’ve got to think that Ian Edwards is doing something right!
I’d heard it said that the number of medals given by the UKVA is rising excessively, out of proportion to international competitions. The graph below shows the facts. The percentage of wines with gold medals is above the long-term average (but so, in my opinion, is quality). Perhaps more questionable is the high percentage of silvers over the past few years. There’s no right answer to this, and it’s clear that results from one competition should only be compared with other results from the same competition. UKVA awards through time (% of wines per category):
It’s interesting to see some numbers too, comparing the awards in 2012 to the longer term average (2003-2012 inclusive):
(And yes all the analysis includes Welsh wines too – sorry Jac)
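For what it’s worth, the year-versus-long-term comparison above is just per-category percentages. A minimal sketch, with invented medal counts rather than the real UKVA figures:

```python
# Invented medal counts for illustration -- not the actual UKVA data.
def category_percentages(counts):
    """Percentage of awarded wines falling in each medal category."""
    total = sum(counts.values())
    return {cat: round(100.0 * n / total, 1) for cat, n in counts.items()}

awards_2012 = {"gold": 10, "silver": 45, "bronze": 45}
long_term_avg = {"gold": 8, "silver": 36, "bronze": 56}

print(category_percentages(awards_2012))   # → {'gold': 10.0, 'silver': 45.0, 'bronze': 45.0}
print(category_percentages(long_term_avg))  # → {'gold': 8.0, 'silver': 36.0, 'bronze': 56.0}
```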