
Does The Wisdom Of The Crowd Provide Better Wine Reviews Than The Experts?

Vinted on September 5, 2013 under commentary

[ WARNING: following is one of my lengthy diatribes. If you're the lazy and impatient busy type, skip to the summary! ]

Do you believe that fine wines are multi-faceted?

What I mean is, do fine wines change over time, present different shades and complexities of aromas and flavors?

Well… duh

If you agree that fine wines are complex beasts, then I’m about to show you why it should logically follow that wine experts may not provide the best reviews of those wines.

Because if you also happen to believe in the truth-enlightening powers of scientific and statistically relevant data, then you cannot continue to hold onto the stubborn belief that traditional wine expert opinion always offers a superior summation of a wine to that provided by aggregate reviews in outlets such as CellarTracker.com. At least, you can’t do it without being Spock-raising-a-quizzical-eyebrow-at-you illogical. By the way, if you don’t believe in those truth-enlightening powers, then I’ve got some creationist “textbooks” to sell you, but let’s not get off track, okay?

Anyway… evidence actually supports the view that individual wine expert opinion is inferior to the wisdom of crowds when it comes to reviewing wines.  It’s not that single-shot expert opinion in this field is somehow irrelevant or useless, but that it is less valuable than the opinion offered by an educated, engaged, passionate, and diverse group of people (which may or may not contain experts in their ranks).

Don’t believe me? Well, then, put down that copy of Wine Spectator for a second and hear me out. Because while the view that crowd-sourced wine reviews have merit has been called “propaganda” by wine writers as celebrated as Matt Kramer, looking less passionately and more logically at the act of reviewing suggests that it is the Kramers of the wine world who are spouting the propaganda when they dismiss the wisdom of the crowd.

It’s not that expert views are without merit, but The Crowd has taken it on the chin almost as much as the experts have lately. Hell, even Dan Berger, whom I deeply respect – and with whom I’ve judged wine competitions this year – recently got in on that act when he semi-bashed reviews from blogs in his Napa Valley Register column. But much of that type of Crowd criticism is about to seem a lot less pointed, I think.

First, let’s look at Princeton Professor of Economics Burton G. Malkiel, whose A Random Walk Down Wall Street is about as old as I am and is as close to a classic tome as one can get in the investment world canon. Malkiel summarized the findings behind the wisdom of crowds in that book’s later editions when he wrote the following tidbit (emphasis mine):

“In general, research shows that groups tend to make better decisions than individuals. If more information is shared, and if differing points of view are considered, informed discussion of the group improves the decision-making process.”

Note that we’re not talking about herd mentality or groupthink here, but about high-quality decision-making. OK, so groups are capable of making decisions that trump those made by individuals; so what? What does this have to do with wine reviews? Aren’t those reviews discrete, single-point decisions about a wine’s quality? Doesn’t group decision-making have more to do with the boardroom than the wine cellar?

Not really.

Wine reviewing – fine wine reviewing, anyway – is not actually a single-shot decision, is it? Sure, we pretend that it is, with pithy tasting notes, grades, numeric scores, puffs, stars, smiley faces, and the like. But the fact of the matter is that fine wine changes; in the bottle, in the glass, in our mouths. It’s not static. What might be a 91 one minute could evolve into a 97 the next day, or the next week, or over the next fifteen years. Sure, we can make quick quality judgments, but unless someone spends several hours (or days) with a fine wine, a single data point in judging something as tinged with subjectivity as an overall quality impression arguably isn’t superior to multiple data points taken at different times, under different circumstances, and perhaps by different people.

In other words, reviewing a wine implies a decision process that ideally would be based on as many data points as possible. Think of it this way: would you rather buy a wine tasted in five or ten minutes, sampled in an “artificial” situation, and then graded by an expert; or one that many people tasted in different circumstances, the kind of circumstances that most closely mirror your own? Would you buy an expensive item on Amazon.com based on one high-quality lab review, or on dozens of reviews based on real-world usage of that pricey product?
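If you like seeing the arithmetic behind that intuition, here’s a minimal, purely illustrative sketch in Python. The numbers are made up and the helpers (true_quality, one_tasting) are hypothetical; nothing here comes from any real review database. It simulates a wine whose quality drifts over a year, then compares one careful expert score from a single sitting against the average of a noisier crowd tasting bottles on many different occasions:

import random

random.seed(42)

def true_quality(day):
    # Hypothetical curve: the wine's "real" quality climbs from ~91 toward ~97
    # as it evolves over a year in bottle and glass. Made-up numbers.
    return 91 + 6 * min(day, 365) / 365

def one_tasting(day, noise):
    # One tasting = the true quality on that day, plus personal/contextual noise.
    return true_quality(day) + random.gauss(0, noise)

# One expert, one sitting: a careful palate (low noise), but a single moment in time.
expert_score = one_tasting(day=0, noise=1.0)

# A crowd: noisier individual palates, but many bottles opened on many occasions.
crowd_scores = [one_tasting(day=random.randint(0, 365), noise=3.0) for _ in range(200)]
crowd_average = sum(crowd_scores) / len(crowd_scores)

# The wine's average quality across the whole drinking window, for comparison.
window_average = sum(true_quality(d) for d in range(366)) / 366

print(f"expert, day 0:  {expert_score:.1f}")
print(f"crowd average:  {crowd_average:.1f}")
print(f"window average: {window_average:.1f}")

The expert’s single score isn’t wrong; it just pins down one moment. The crowd’s average, for all its individual noise, lands close to how the wine actually behaves across the whole window in which you’re likely to drink it.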

Ben Ramalingam’s Aid on the Edge of Chaos, a blog about complexity sciences and international aid (they intersect, apparently; who knew?), deftly touched on the topic of single-point versus (for lack of a better term) multi-point decisions back in 2011, when Ramalingam talked about James Surowiecki’s now famous book Wisdom of Crowds (emphasis is mine):

“…in so-called single-shot decisions, experts are almost always more accurate than the collective across a range of conditions. However, for decisions… where individuals should be able to consider the success of previous decision outcomes… the collective’s aggregated information is almost always superior.”

Still not sure this fits into wine reviewing? Consider this: what is a fine wine qualitative assessment if not a decision in which previous outcomes are important? A great wine might have been tighter and less penetrable than a walnut’s ass in its youth, but blossomed into a multi-layered beauty ten years down the line, when initially only the faintest hint of that future pleasure would have been detectable in the glass. And at some point, a previously glorious wine will be ready for the vinegar bottle, no matter how amazing it once was. Even small amounts of previous context can be important to our perception of a fine wine’s development.

Interestingly, the “almost” part in Ramalingam’s “the collective’s aggregated information is almost always superior” is actually somewhat incorrect. Turns out, the data apparently never support the view that single opinions trump those of the crowd, particularly when it comes to predictions. And please don’t tell me that I have to explain how expert wine reviews are predictive (how many come with a proposed drinking window, again?).

Take a look at Acumen VP Terry Ketchersid’s view on the book Think Twice by Michael Mauboussin. Think Twice cites work done on crowd wisdom by Scott Page, the Leonid Hurwicz Collegiate Professor of Complex Systems, Political Science, and Economics at the University of Michigan. To quote Ketchersid (emphasis is mine):

“…if you have a group of incentivized, or otherwise motivated individuals, with diverse backgrounds, the crowd will always predict more accurately than the average person. Yes that is always, not sometimes. Of interest, the collective is often better than many of the experts in a particular field. In the words of the author, ‘This is not good news for experts and it is deeply humbling for all decision-makers.’”
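For the statistically inclined, the result Page formalizes is usually called the “diversity prediction theorem”: the crowd’s squared error equals the average individual’s squared error minus the diversity (the spread) of the individual predictions. The theorem is Page’s; the worked numbers and variable names below are mine, invented purely for illustration, and the little Python sketch just checks the identity:

# Scott Page's diversity prediction theorem, checked on made-up numbers:
# (crowd error)^2 = average individual squared error - diversity of the predictions
truth = 94.0                                  # the "right" answer, e.g. how the wine actually shows
guesses = [90, 97, 92, 95, 88, 99, 93, 96]    # individual predictions, invented for illustration

crowd = sum(guesses) / len(guesses)
crowd_error = (crowd - truth) ** 2
avg_individual_error = sum((g - truth) ** 2 for g in guesses) / len(guesses)
diversity = sum((g - crowd) ** 2 for g in guesses) / len(guesses)

print(crowd_error)                        # 0.0625
print(avg_individual_error - diversity)   # 0.0625 -- the identity holds every time
print(avg_individual_error)               # 12.0 -- the average individual does far worse

Notice what that implies: the crowd beats the average individual whenever there is any diversity at all in the guesses. The only way a lone expert comes out ahead is by being dramatically better than everyone else, every single time.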

 

The bottom line: Any view holding that the opinion of wine experts is substantially better or more useful than that of the crowd (the right crowd, anyway) looks increasingly incorrect in light of both scientific research and public opinion.

Should wine experts (including folks like me, I should add) be concerned about this stuff?

Yeah, they probably should. After all, the track record of wine experts in adapting to obvious trends and avoiding sticking their heads firmly into the wine world sand isn’t so impressive. As I’ve written here before, the adage that “when everyone is a critic, no one is a critic” probably isn’t totally correct. Not only can the crowd offer up valuable quality assessments when it comes to fine wine, it is also capable of self-policing and of deciding which expert opinions it deems most valuable.

I don’t know about you, but all this has me deeply humbled as a wine reviewer. Not only that, but the predilection of younger wine drinkers to make buying decisions based on Googling CellarTracker.com’s aggregate reviews now seems much less like young-whipper-snapper-upstart shenanigans to me and much more like logical, efficient-market consumer behavior.

Look out, traditional wine world; the data just aren’t totally on your side here…

Cheers!
