Blind Taste Tests and Rational Thinking

A few years ago I received a copy of the book The Wine Trials, which used blind taste tests to show that there is little correlation between the price and prestige of a bottle of wine and how experienced wine drinkers judged its taste. And study after study shows that those results differ dramatically from how people judge taste when they know the identity of the wines they’re drinking. I’m going to argue here that we should approach the evaluation of claims and arguments the same way.

When people know that the wine they are drinking is expensive and from a prestigious winery, they report enjoying the wine much more than they do if they taste it without knowing where it came from or how much it cost. And the same is true of cheap, non-prestigious wines; they are much more likely to report liking those wines if they don’t know that they’re cheap and from a winery they’ve never heard of. In other words, we evaluate these things much differently if we have a predetermined expectation.

I would submit that we do much the same thing when evaluating ideas, claims and arguments. If we hear those ideas expressed by someone we have already determined that we agree with, we are much more likely to agree with them without actually thinking about it. Conversely, if we hear those ideas expressed by someone we disagree with, we are much more likely to reject them out of hand, without giving them any due consideration. This is why what I often call the argumentum ad labelum is so common — it’s a means of dismissing a claim or argument rather than engaging it.

I see this kind of thing all the time (and sometimes even in myself, much to my own frustration). If we are atheists, we will often go along with a really bad argument made by another atheist even if the flaws in that argument should be obvious. Religious people do the same thing, of course, and probably more often. This is one way that confirmation bias operates, as well as a handful of other forms of thinking that undermine our rationality. And usually we are very good at spotting this sort of thing when done by someone else but blissfully unaware of it when we do it.

But to be truly rational, we should evaluate each and every claim and argument the same way, by asking logical questions: Is this claim supported by the evidence? Is it logically consistent from premise to conclusion? Are there alternative explanations that are more parsimonious? Is there evidence to the contrary that is being ignored?

This is why law professors love to pose hypothetical questions, because they help uncover inconsistencies in the way we think about situations. They help us to avoid special pleading. If I wrote a story about an atheist group that is told they cannot protest near a Muslim festival, you may well have one reaction to it; if I told you instead of an identical situation with a Christian group being told the same thing, you may well have a very different reaction. And again, I’m not just pointing fingers at others here — I do this too, as much as I try to avoid it.

There are people who are really good at avoiding this sort of thing and people who are very bad at it, with most of us probably falling somewhere in between. For me, my model in this regard is my friend Jeremy Beahan, who co-hosted my radio show for two years and is one of the hosts of the Reasonable Doubts podcast. He is probably the most consistently rational person I know and he has caught me engaging in the behaviors above many times. But I have so much respect for his ability to think critically even about issues that he feels strongly about that I don’t react defensively; when he catches me at it, I take that seriously and try to rethink my position.

We all like to think that we are supremely rational creatures, and we often convince ourselves of that by pointing to others who are thinking quite irrationally (as I do every day on this blog). But being relatively more rational than someone who is highly irrational does not mean that we are avoiding the kinds of easy, tribalistic thinking noted above that undermine our ability to think critically, especially on issues that we are passionate about. None of us can avoid them entirely, I imagine, but as rationalists and skeptics we should try our best to cultivate habits of thinking that help us avoid them as much as possible. And when someone points them out to us, we should react reasonably rather than defensively.

  • brucegee1962

    The hypothetical test case should be a habit that everyone gets into, particularly with emotionally charged issues. I was reminded of that with the Trayvon Martin case. Before you can be sure about your own reaction, you need to run through the same scenario with the races reversed to see if you would feel the same way. The same goes for any news story that gets you riled up. Having a good imagination is, I think, one of the key elements needed for developing a good moral sense.

  • Knowing that we’re imperfect helps us guard against inconsistencies we otherwise wouldn’t acknowledge.

  • Kevin Dugan

    I recently read Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts by Carol Tavris and Elliot Aronson (Mar 2008).

    The book spoke largely to this point: how most of our reasoning is motivated by self-justification. I found it a very challenging read, not that it was in any way difficult, but that it opened my eyes to a whole world of internal and social interaction I’d been unaware of. Arguments that come from ‘enemy’ sources or challenge our beliefs are naturally discounted and quickly forgotten. Those we hear from ‘friendly’ sources or that confirm our existing beliefs we tend to accept uncritically. It also talks about how we actually re-write our memories to produce a more consistent narrative in which we play the besieged hero(ine).

    It has been especially useful in this campaign season, with my brother and mother firmly Foxzombies. I’ve worked extra hard to equip myself with hard data on the candidates and issues, rather than simply spouting positional rhetoric. It’s also helped me to point out that candidates are VERY good at recreating their memories of the past to justify changing positions.

  • Say what you will about the many, many flaws of biomedical research and the peer-review publication process, one of the positives is that reviewers of submitted papers are blinded to the identities of the authors.

    Of course, in many cases they’re not really blind. A long-awaited, much-discussed Phase 3 clinical trial report is already an open book. And if you name a drug in development in a certain field, I can predict with a fairly high degree of certainty who is going to be one of the lead authors of a paper.

    But by taking the names off the paper, it drops one barrier to solid critique.

    Of course, anonymous posting would be analogous…except people’s avatars become known, and those whose opinions have been respected before get … well … respect.


  • sheila

    Sadly, the people who most need to do this are the very ones least likely to do it. A bit like fact-checking people you agree with every now and then.

    I make the effort because I’m determined not to wind up like the tea party, screaming that everyone else is closed-minded and never, ever checking facts or logic.

  • anandine

    While we’re on the subject of fooling ourselves, memory turns out to be very malleable. Each time you remember something, you reconsolidate the memory in your brain, and the next time you remember it, you pull up the reconsolidated version, but it may not have been reconsolidated exactly as it had been before. If you remember an event involving a bare-headed person, and someone asks what color his hat was, the next time you remember the event, you may remember the person wearing a green hat.

    This is good for therapists, who can more or less effectively direct memory reconsolidation to ameliorate traumatic memories, but it is bad for anyone who wants to trust his own memory or anyone else’s. It means you can never — never — really trust your memory. One might say, “If I didn’t know as much as I do about the malleability of memory, I would be sure that X happened, but instead I can only say I have a clear memory of it happening.”

    It may also explain some of why people confess to crimes they didn’t commit after lengthy interrogation. Stephanie Crowe’s brother said that eventually he began to think he had done it after being questioned all night about her murder.

  • F

    Too often, the very argument gives away the “label”, so this may be difficult in some cases.

  • This is why the argument from authority and ad hominem arguments are mirror-opposite fallacies. They both appeal to the source of the argument rather than the argument itself. The argument from authority assumes that the prestige of the originator justifies the argument, while ad hominem assumes that the depravity of the originator damns the argument.

    But yes, it’s very, very difficult to establish a policy of ignoring originators altogether, and probably not very advisable given that it would require us to (for example) give each World Nut Daily article the benefit of the doubt and carefully analyze it individually before suggesting that it might be bullshit. The better stance to take would be to assume that such articles are bullshit based on prior track record, but also be prepared to substantiate that claim if you’re actually going to make it.

    Our standard for arguments from authority should be even higher, I’d think, given how easy it is to conclude that good, rational, intelligent people are also right in whatever they say. On the contrary, their being good makes us less willing to judge them, and their being rational and intelligent can simply make them better at making a bullshit argument seem sensible to themselves and others.

  • Jordan Genso

    I agree with Ed, but I have to admit that I didn’t read the OP.

  • Michael Heath

    If one doesn’t have any formal training in critical thinking and rhetoric, then they’re much less able and inclined to objectively analyze what someone writes and says purely from a quality standpoint, independent of the observed advocates’ conclusions. They simply don’t have the developed ability to consciously and dispassionately parse the quality of the argument from its conclusions. I rarely observe liberals criticizing their allies’ bad arguments, or, worse yet, criticizing a bad argument when it comes from a liberal but isn’t necessarily a liberal-friendly argument.

    What we do observe is motivated reasoning; we effectively seek out flaws in our opponents’ arguments while failing to apply the very same standard to those of our allies. This is a primary reason I relentlessly advocate that we teach critical thinking as a stand-alone curriculum topic from K-PhD. The more cognizant we become of the quality of an argument independent of its conclusions or its source, the more inclined we are to intervene in bad arguments from our own tribe — arguments which frequently employ the very same remedial rhetorical and logical fallacies we see from conservatives.

  • It would be awesome if there was a service that could take a decision and break it down into simpler, yes/no questions that didn’t let us know which ultimate answer we were heading toward, such as which president to vote for.

  • So you are saying that my atheist card doesn’t give me infinite knowledge in all that is logic and reason? Fuck, that is what I signed up for. I feel cheated…

  • davidworthington

    Back in the 1960s Robert Newman wrote a book called “Evidence” (I forget the subtitle). He doesn’t reject source credibility; instead he provides mechanisms for testing the source. One I remember is predictive record: how well somebody has predicted policy implications in the past is a marker of how reliable they will be in the future, because it suggests the care, willingness to qualify, and attention to nuance of a source.

    I doubt it’s still in print, but it was a good book for learning how to evaluate sources. As I recall, he wrote it in response to the Red Scare of the 1950s.

  • eric

    This is why law professors love to pose hypothetical questions, because they help uncover inconsistencies in the way we think about situations.

    I think this is also why higher education generally correlates with less religiosity. Knowing you will have to defend your arguments against a committee or peer group is likely to make you think about all the flaws your argument might have — if only so you can be ready to address them when your peer group brings them up. When you do this thought-exercise over and over, it can become second nature to try and play devil’s advocate against yourself. You start ‘red teaming’ the beliefs you might hold outside of your academic field…and voila.

    Of course there are lots of exceptions: academics who never learn how to really successfully red team their own arguments — who only consider superficial criticisms, or who never grok when a response they may personally consider adequate will be unsatisfactory to their peers. There are also academics who maintain high conceptual walls and just never apply these techniques to their other beliefs.

    But, in general, practicing critical thought is going to make you better at doing it, and practicing it on one subject is going to result in it bleeding over into the rest of your life. Which is exactly the case with posing hypothetical questions. The more practice you have at it, the better you’re going to get at forming hypotheticals that are analogous and useful to reality. And once you learn how to do that properly in one specific subject area, it’s likely going to bleed over into the rest of your life.

  • Someone needs to open a wine bar in which the whole idea is to lie about the cost of all the wine. The menu is printed with florid-sounding titles and reviews and is, in fact, the most expensive thing in the place. Everything else is good cheap wine and solid cheap food.

    But for a mere $400 a head you can enjoy taking your date out for a rare, dusty bottle of Chateau-Neuf De Rod Laver or Nuys Saint-Woggawogga to accompany your 3-star Texas chili made with real truffles and hand-squeezed Kobe beef.

    A week after you pay the whopping bill, you get a discreet refund mailed to you. Your date never needs to know.

  • I think you have just explained religion.

    Religion is a very expensive belief. It is one that people have spent countless hours (or, really, years) on and much money (their tithing). Therefore, like the expensive wines, they value it more highly because of what it has cost them. They don’t want to believe that their “investment” in religion was wasted.

  • valhar2000

    This is the reason I dropped out of the “Atheist Movement”. It was great when they were criticizing people I already did not like, but eventually things stopped being so rosy. And, more worryingly, I started to notice the traits that I did not like about the movement in myself. I can only hope I’ll succeed in purging them.

  • =8)-DX

    Very spot-on, bravo! I often find myself in this position. One thing I think is worse is that often I think rationally but am unable to follow up the result of the rational thought-process with actual behaviour.

    For us humans, emotion is our prime enemy, while at the same time it makes us the empathetic, touchy-feely assholes we should be proud to be.

  • It was great when they were criticizing people I already did not like, but eventually things stopped being so rosy

    Hey, you should try Fox News. They specialize in preaching to the choir there, if that’s all you want to hear.

  • octopod

    Sometimes knowing the source actually gives you more information, though.

    (1) One does not explain one’s reasoning all the way back to its antecedents. People who are reasoning from completely different underlying priors can come to the same conclusion, and whether you believe them to be correct in that case depends on whether you agree with their unspoken priors — i.e. on the source, not the argument.

    (2) Credibility is built up over time.

  • Oh, knowing the source gives you plenty of useful information. It just can’t give you justification for any argument’s validity or lack thereof.

  • Marcus re: restaurant idea.

    FUND IT. 🙂

  • baal

    Scientists don’t always get it right but (at least in bench science that I was a part of) you get social points for showing that a person overlooked or screwed up in some way. This leads scientists to always be looking for ways to back up their assertions and for keeping their assertions limited to available evidence.

    I still do this to some degree and have had coworkers express annoyance that my ‘official statements’ are always narrowly drawn. They usually want a more holistic statement.

    Also, there is a huge difference between the validity of an argument you recite in your head vs actually saying it out loud in full sentences. While you’d think the two would be the same, the latter is a much harder test to meet.

  • Michael Heath

    Gretchen writes:

    knowing the source gives you plenty of useful information. It just can’t give you justification for any argument’s validity or lack thereof.

    Sure it does. We rely on the process deployed to publish conclusions all the time. It’s one thing for some guy on the Internet to claim in a comment post to an article that the world isn’t getting warmer*, quite another for Nature to publish an article asserting the same if it came from a meta-study or a group paper from all the science groups who track global temperature trends.

    *I was on the Volokh Conspiracy the other day to get their take on Michael Mann’s defamation suit against the National Review and AEI. The commenters in Adler’s take are convinced Mann’s findings have been disproved.

  • DiscordianStooge

    I’ve found I’m much quicker to question information from sources I actively distrust (e.g. conservative talk shows and the Daily Mail), even if it seems legit up front, and much more likely to blindly trust a source I like.

    And while I’m sure I let many things slide if I agree with or like the source, I get way more pissed when I see bad arguments or falsehoods from “my side,” because I expect it from the “other guys,” but if people are making bad arguments about things I support, it makes my job of convincing others harder.