A few years ago I received a copy of the book The Wine Trials, which used blind taste tests to show that when experienced wine drinkers don't know what they're drinking, there is no correlation between a wine's price and how highly they rate its taste. And study after study shows that those results differ dramatically from how people judge taste when they know the identity of the wines they're drinking. I'm going to argue here that we should approach the evaluation of claims and arguments the same way.
When people know that the wine they are drinking is expensive and from a prestigious winery, they report enjoying it much more than they do if they taste it without knowing where it came from or how much it cost. The same is true of cheap, non-prestigious wines: people are much more likely to report liking those wines if they don't know that they're cheap and from a winery they've never heard of. In other words, we evaluate these things very differently when we come in with a predetermined expectation.
I would submit that we do much the same thing when evaluating ideas, claims and arguments. If we hear those ideas expressed by someone we already agree with, we are much more likely to accept them without actually thinking about them. Conversely, if we hear those ideas expressed by someone we disagree with, we are much more likely to reject them out of hand, without giving them any due consideration. This is why what I often call the argumentum ad labelum is so common — it's a means of dismissing a claim or argument rather than engaging it.
I see this kind of thing all the time (and sometimes even in myself, much to my own frustration). If we are atheists, we will often go along with a really bad argument made by another atheist even when the flaws in that argument should be obvious. Religious people do the same thing, of course, and probably more often. This is one way that confirmation bias operates, along with a handful of other cognitive habits that undermine our rationality. And usually we are very good at spotting this sort of thing when someone else does it but blissfully unaware of it when we do it ourselves.
But to be truly rational, we should evaluate each and every claim and argument the same way, by asking logical questions: Is this claim supported by the evidence? Is it logically consistent from premise to conclusion? Are there alternative explanations that are more parsimonious? Is there evidence to the contrary that is being ignored?
There are people who are really good at avoiding this sort of thing and people who are very bad at it, with most of us probably falling somewhere in between. My model in this regard is my friend Jeremy Beahan, who co-hosted my radio show for two years and is one of the hosts of the Reasonable Doubts podcast. He is probably the most consistently rational person I know, and he has caught me engaging in the behaviors above many times. But I have so much respect for his ability to think critically even about issues he feels strongly about that I don't react defensively; when he catches me at it, I take it seriously and try to rethink my position.
We all like to think that we are supremely rational creatures, and we often convince ourselves of it by pointing to others who are thinking quite irrationally (as I do every day on this blog). But being relatively more rational than someone who is highly irrational does not mean that we are avoiding the kinds of easy, tribalistic thinking noted above that undermine our ability to think critically, especially on issues we are passionate about. None of us can avoid them entirely, I imagine, but as rationalists and skeptics we should try our best to cultivate habits of thinking that help us avoid them as much as possible. And when someone points them out to us, we should react reasonably rather than defensively.