A couple of months ago, I wrote a post here arguing that pure rational thinking, freed from all tradition-bound constraints, is not going to save the world – or the climate. Specifically, I claimed that Reason™ – the rah-rah, yay-for-Science!, jingoistic rallying cry of the atheism-industrial complex – shouldn’t be the torch-bearer for our hopes about the future of humanity and the planet. I stand by everything I said in that post, but I got a lot of pushback from angry readers – especially folks who thought I was arguing in favor of irrationality, precisely when we need reasonable problem-solving like we never have before. The debate was sometimes heated, but also pretty fun. So I thought I’d stir the hornet’s nest again. Y’all, Reason™ is not gonna save the world.
Let’s start by defining what rationality is – or isn’t. When we say a decision is “rational,” we could mean that it’s instrumentally rational: an appropriate practical decision given the goals of the person making it. Throwing a baseball to second base for a risky double play is, quite often, instrumentally rational. It’s what makes sense, given the situation and goals.
But this isn’t the definition of rationality that’s implicit in most popular discussions of science and reason. When partisans of Reason™ – such as our friends the New Atheists – opine that people need to “think rationally,” they’re actually advocating for something more like epistemic rationality: a mode of thinking that uses careful, explicit reason and discursive logic in order to produce knowledge. One implication here is that, because this sort of epistemic rationality depends on reason, all the steps in each chain of inference must be fully available to conscious analysis. Thus, a chain of reasoning is epistemically rational only when:
- Its premises are empirically or logically accounted for;
- It doesn’t contain any hidden assumptions, intuitive leaps, or implicit inferences; and
- Each step from premises to conclusion is explicit, meaning that it could be translated into public, symbolic code like math or language.
Clearly, a lot of religious thinking doesn’t meet these criteria. Religions are full of intuitive claims, inarticulate sentiments, and weird bodily states that can’t be easily translated into symbolic representation. The medieval theologian St. Thomas Aquinas embodied this tension when – after a long career using Aristotelian logic to flesh out Catholic doctrine – he had a profound mystical experience near the end of his life. Despite being legendarily prolific, Aquinas afterwards refused ever to write again, claiming that “I have seen things that make my writings like straw.”
It’s a powerful story, but transcendent, life-changing mystical experiences that can’t be symbolically expressed are, by definition, not epistemically rational. And many secular rationalists argue that such things don’t count as knowledge. For these critics, a world full of people enslaved to private, un-shareable mystical intuitions would be a lot like…well, the Middle Ages: a disease-ridden, economically sclerotic world of petty theocracies, superstition, and extremely poor dental hygiene. Who wants that? (Almost) nobody, that’s who.
Your Brain Is a Racist
And so Very Smart writers such as Michael Shermer or Richard Dawkins have made their popular careers advocating explicit, reason-based, epistemic rationality, and casting suspicion on gut-level, authority-based, or associational thinking that cuts corners and relies on intuitive leaps. In singing the praises of this kind of rationality, Shermer, Dawkins, and their compatriots are advocating a somewhat caricatured view of what cognitive psychologists call System 2 thinking. As Daniel Kahneman explained to nearly the entire world in Thinking, Fast and Slow (it was a very popular book), System 1 cognition is intuitive, quick, holistic, and only partly conscious. It’s what you use when you, say, quickly size someone up at a party. System 2, meanwhile, is effortful, analytic, slow, and logical. It’s what you use to work out a calculus problem or a geometric proof. With System 2 thinking, you explicitly articulate your logical steps – the foundation of epistemic rationality.
Now, the high priests of Reason™ aren’t entirely wrong to tout the wonders of System 2. Besides wanting to avoid theocracy, there are lots of good reasons to be wary of easy, intuitive System 1 thinking. For starters, our tribal evolutionary past makes us intuitively mistrustful of others who look or speak differently than we do. Hence, many of us – including even Minnesotans, who are extremely nice – are racist by default. So one way to challenge our own racism is to override our intuitive System 1 biases by consciously switching over to System 2:
System 1 Brain: This guy has an accent and has strange hair and weird-colored skin! He’s not of our tribe! We MUST PERSECUTE HIM! BWARHGHGHGH!
System 2 Brain (Closes eyes, takes a breath, counts to 10): Dude, for the hundredth time. You’re reacting on the basis of our inbred evolutionary biases. This guy is not a threat. He’s our boss.
System 1 Brain (Begins to calm down): Really?
System 2 Brain (Sighs): Yes.
So System 2 can be helpful for second-guessing our intuitive prejudices. It’s also wonderful for science, which is, in essence, the systematic use of explicit, effortful reasoning to challenge commonsense intuitions and thereby secure large government grants. This leads a lot of Spokespeople of Science and Reason to believe that System 2 is always the better choice, because System 1 is all bias and incoherence, and System 2 is all reason and good sense. We should always be questioning our assumptions, they intone gravely – always working things out rationally. But this isn’t just a misreading of cognitive psychology. It’s plain wrong.
Habits, Big and Small
Why is it wrong? Here’s a question. Have you ever tried to master a sport?
Great. Explain how to throw a baseball, please. In words. Rationally. Accounting for every step and micro-movement.
You can’t. At least, not usefully. Why? Because you practice your throw largely by doing it, over and over, until the muscle memory has become ground into your organism like grooves in a Smothers Brothers record. By the time you’ve mastered the skill, it flows unconsciously, automatically. You couldn’t enumerate every aspect of it verbally if you wanted to.*
This little athletic example (woo Royals!) highlights something crucial about human knowledge and decision-making: a lot of it isn’t explicitly conscious. It’s implicit, buried under layers of association, habit, or instinct. It might be what cognitive psychologists call “procedural” knowledge – knowledge of how to do something. Or, more broadly, it could be what physical chemist Michael Polanyi famously called “tacit knowledge” – experiential insight that can’t be made fully explicit.
This sort of knowledge – tacit, procedural, subconscious, intuitive – is exemplified in habits, which are essentially tools for outsourcing our decisions so that they don’t need to be conscious. A good throwing motion is a kind of habit. But so are many of our everyday actions. Once you’ve developed the habit of going to the gym every Monday, Wednesday, and Friday, you don’t need to decide to go anymore. You just go. Once you’ve built the habit of flossing every night (this one took a really long time for me), you don’t decide to floss anymore. You just brutally attack your gums, mindlessly, and hope the dentists are telling the truth about it being good for you.†
So habits, by definition, aren’t “rational” in the way we’ve been describing. Remember, epistemic rationality has to be consciously reasoned out – intuitive leaps and automatic responses are no-nos. All the logical steps have to be plain to see. Explicit. Representable. This is exactly the opposite of habits, which are basically programmed, unconscious responses. Their entire function is to bury our choices under layers of association and automatic behavior, thus freeing our minds to focus on other, more pressing things.
One reason habits are so important is that our working memory – the thin slice of immediate, conscious attention that holds ideas together so we can rationally process them – can’t deal with very many chunks of information at once. The adage that we can only think about seven things simultaneously isn’t quite true, but it’s not far off, either. Mentally working out a chain of logic with more than a handful of interacting elements is a stretch even for the smartest tech billionaires. This means that rational, System 2 thinking about social subtleties or athletic skills is practically impossible, because even the simplest social interaction or baseball fielding decision contains so many informational dimensions – so many possible variables – that it would completely flood our working memory if we tried to reason it out explicitly.
So we don’t. We avoid rationally processing most of the decisions that come our way each day, and we rely on intuition and gut feelings for a lot of life. We do this because we have to. If we tried to reason out every possible choice consciously, our little brains would melt.
Religion: The Habits of Societies
This brings us to religion, which, following Edmund Burke, you can think of as a kind of living record of the habits of societies or civilizations. Religious people tend to be more intuitive thinkers, more responsive to authority, and more likely to reproduce – all traits that help traditions persist. So religion is – among countless other things – a set of ways of thinking and doing that stay remarkably unchanged as they’re passed down from generation to generation.
For example, why do we celebrate Christmas? Well, largely because we always have, because we remember celebrating it as kids. Why do we pray toward Mecca? Well, maybe because the Prophet told us to. But also – and more to the point – because our parents did, and their parents, and their grandparents before them.
Religions, then, are (in part) accretions of cultural habits and practices that have somehow stuck, that have become preconscious habits instead of consciously reflective decisions. This holds true equally well at the cognitive, behavioral, and societal levels. Anti-theists and scions of Reason™ use this observation as the basis for accusing religion of being mindless, authoritarian, and irrational. But if they accuse religion of these things, then they have to accuse the entire swathe of human habit of the same sins: brushing our teeth every day automatically, for example. To be really logically consistent, the Yay-for-Science! types who admonish us never to take anything on authority, and to reason everything out rationally for ourselves, would have to consciously decide, every morning, whether it made sense to brush their teeth. Then they’d have to decide, rationally, whether it was a good idea to go to work. They’d need to agonize effortfully over whether to take their lunch break, and whether they should greet their spouse with a kiss that evening.
These examples are ridiculous. Nobody puts that much effort into their everyday choices – if we did, our lives would fall apart. But that’s the point. Habit qua habit isn’t good or bad. It’s necessary. And this holds at the level of societies as well as individuals. We build habits into our cultural ways of doing things so that we can direct our conscious attention towards more pressing problems. Religions – among the most conservative of human institutions – are bearers of much of this habit. As such, they’re epistemically “irrational” – just as all habits are. But this doesn’t make them necessarily instrumentally irrational, and that’s the key difference.
Of course, there are big problems facing our little civilization, and in many ways religious people can be, on average, behind the times. In the United States, they’re less likely to accept evidence for global warming, and much less likely to accept Darwinian evolution – which is the theory on which our entire understanding of the biosphere is based. That’s bad news.
The answer, though, isn’t to blithely insist that we have to be rational about everything. We don’t, and we can’t. Instead of casting religion and its epistemically irrational intuitions aside, we need to be realistic about the need for societies to develop large-scale habits, including behaviors and rituals that are passed down from generation to generation. The sheer amount of cognitive load that such cultural habits save us is immeasurable. Forget that they’re irrational. They’re necessary.
The best scenario, then – for both individuals and societies – is to maintain an open conduit between System 1 and System 2, between intuitive habit and analytic questioning. If your environment changes, you might need to change your habits. This requires being open to new information, and perhaps an effortful – that is, explicit and conscious – period of building the new habit. The old habits will resist the new, at least at first. But this is how it should be. If habits morphed at the drop of a hat, they wouldn’t be habits. If religions changed with every passing fad, they’d fall apart (this could be the epitaph of many liberal Protestant churches).
By keeping the door open between our epistemically rational, analytic capacities and our intuitive, gut-level, epistemically irrational habits, we make it possible for information to flow to where it’s needed. By sneering at religion and habit, by proudly insisting that we should be paragons of Reason™ who never take anything on authority and reason everything out for ourselves, we’re saying that we don’t need any habits – that there’s no need for information to flow from the analytic arm to the holistic one, and that, in fact, we should lop off the holistic arm altogether. But if we did that, we’d only bleed to death.
*For those of you who were too cool for sports in high school, this also applies to video games.
† Thanks to Dave Barry for this joke, which I stole.