In a previous post about a wingnut claiming that the Obama administration is allowing Muslims to skip TSA checks at the airport, Dr. X left one of the most lucid and important comments I’ve seen on my blog in more than 8 years of doing this. It was in response to this comment by Michael Heath:
I’m still mulling over Christopher Hitchens’s conclusion that “religion poisons everything”. However, the fact that religious indoctrination systemically taught to children develops people who’ll believe irrational assertions lacking any evidence, well that’s one compelling premise which supports Mr. Hitchens’s conclusion.
And this is Dr. X’s full response:
There is abundant evidence from psychological research that irrationality is built into us, and that pre-rational heuristics govern our beliefs far more than rationality. Ideologies, social bonds and group identifications, not training, determine the ability of most of us to process evidence in dealing with matters related to our sense of social alliances. It’s easy to see how this powerful tendency was selected in human beings, and there is no reason to assume that vulnerability to bias is trained into people or that we can be trained out of bias in some general sense during childhood. My mother loved me when I was born, not because of any inherent quality in me that made me more worthy of love. She regularly acted with disregard for her own interests to protect me. So much of our survival is based on pre-rational, preconscious tendencies and heuristics, while reason is a rather lowly step-child in social relations, and that would generally hold true for group identity.
Reasoning is only a shaky overlay on non-conscious, pre-rational processes. One problem is that reasoning can actually be used quite effectively to support pre-determined views. In people who are more intelligent, the tendency is all the more pronounced. Smart people can talk themselves into a lot of things that aren’t true and sound very thoughtful and rational while they do it. A less intelligent person will be more comfortable with blunt denial. A brighter person will erect complex intellectual systems of justification without awareness that their opinion was already formed. It isn’t that those systems and reasons are necessarily wrong; quite often they’re right. The point is that the intellectual wouldn’t have gotten there if a strong tribal identification was standing in the way. Quite often, smart people who can earn an A+ in Logic 101 go into false territory and cannot be talked out of it because of an existing group identification and the perception that the enemy holds an opposing position.
I think wingnuts are especially crazy at this time because of the power of their group identity and the perception of a serious threat to the group. The evolved adaptive response to this situation is to become nuts in support of the tribe and nuts in contempt for the opposition.
I also think that the history of the American South in relation to race and to Washington, the capital of the n-loving conqueror, is at the core of the wingnut identity. Republicans have ruthlessly and successfully exploited a fusion between downtrodden (read threatened) Southern, white group identity and an assortment of geographical, educational and religious markers, as well as a variety of cultural tics and habits that extend the identity well beyond the American South. In each of these areas, the idea has been promoted that white people, especially non-urban white men, are members of a unified tribe that is in a fight for its life, under attack from outside enemies. A person who identifies with that group and takes on that sense of life or death threat will not join the discussion as a reasoning, evidence-processing participant. They’re at war with the mental equivalent of Hitler or Stalin or pick your historical enemy who was beyond a reasoning and good-faith discussion.
Why do some of us decouple from our early group identities and change our beliefs? I think there are many circumstantial and internal reasons that it can happen. I used to think I reasoned my way out of my early tribal alliances, but I’m now convinced that reason was only introduced to the extent that my tribal bonds were fraying for other reasons.
Religion, rather than poisoning everything, is IMO usually a group identity not unlike any other group identity. Its impact becomes poisonous when that identity feels threatened, but that’s not because religious people aren’t taught to reason. It’s because of group identification and evolved responses to group threat.
I agree with almost all of this. He brilliantly expresses the kind of cognitive shortcuts that we all take at times, shortcuts that skip over rational thinking to reach conclusions that aren’t supported by the evidence. And please note the key fact: We all do this. I know we’d like to think that we’re perfectly rational and immune to such things while those whose views we oppose are infinitely irrational, but the fact that virtually everyone thinks that about themselves is, in fact, evidence for the argument made by Dr. X here.
But that isn’t the end of it. While it’s true that all of us have beliefs that we cling to for pre-rational reasons, particularly group identity, we should not draw two potentially false conclusions based on that fact: A) that everyone is equally irrational; or B) that we therefore can’t use the tools of reason to reach more or less definitive conclusions about what is true and what is not. Note that I’m not saying that Dr. X is reaching those conclusions, only that these would be obvious — and wrong — conclusions to reach.
So is religion different from other pre-rational ideologies and group identifications? Does religion poison everything, or is religion just another form of tribal and ideological identification that short-circuits our ability to think rationally? The answer, I think, lies somewhere in between. Yes, people can and do engage in the same kind of evidence-ignoring faulty reasoning in the service of non-religious ideologies, but I believe religion is significantly more damaging than the alternatives, for reasons that I think Dr. X would agree with as well.
First, because religion claims to be based on supernatural revelation that is unquestionable. While it’s true that almost every “worldview” (I hate that word but can’t think of a better replacement at the moment) can be clung to and defended irrationally, religious ideologies are considerably worse in this regard because they begin with the very premise that one cannot question those claims because they come directly from God, who will punish you if you do not believe (especially, in many cases, if you stop believing).
We can contrast this with a secular and skeptical worldview, which begins with the premise that we should apply the tools of reason to better understand the world and which has no threat of an afterlife to compel one not to question. While it’s certainly true that skeptics, both individually and in groups, can and do engage in the kind of ostracism that can happen in any community in response to those who challenge the positions taken by the majority, that response is at least not built into the belief structure. Tribal thinking is possible in any group, of course, but skepticism, properly understood and put into practice, demands that it be avoided as much as possible; religion, on the other hand, positively encourages such irrationality with multiple levels of punishment for those who question, both immediate and for all eternity.
Next, to the question of how, if irrationality and tribalism are so common in humans, many of us manage to escape our tribes and develop a new set of beliefs. Dr. X speaks to his own experiences when he says, “I used to think I reasoned my way out of my early tribal alliances, but I’m now convinced that reason was only introduced to the extent that my tribal bonds were fraying for other reasons.” My experience was considerably different, largely because I didn’t have such tribal bonds.
As I’ve mentioned many times, I was raised by an atheist and a Pentecostal. That made for some odd situations, but in retrospect I’m happy about it. It meant that I didn’t have a default belief. It meant I had to think about it in a way that most people never do, because I had these two starkly different worldviews in the same home. As a teenager, I became a very devout Christian and was one of the leaders of the local Youth for Christ group. But by the time I was 17 or so, I was asking questions that didn’t have good answers, and by 18 or 19, I’d become a skeptic and a rationalist.
At no time during all of that did my atheist father try to talk me out of religion or into non-belief. When I asked him about that later he said, “I didn’t think I had to. I’d raised you to think for yourself and I thought you’d figure it out on your own.” One of the phrases he used a lot, and still uses to this day, is “do your own therapy.” By that he means, ask yourself why you feel the way you do, why you think the things you think, what is motivating you to believe or act in a certain way. And I think that is the essence of true skepticism, to not only question what others believe but what you believe as well. Is it warranted? Does the evidence support it? Are you really being rational here or are you taking a shortcut? If you haven’t taken the time to really do a thorough analysis, don’t reach firm conclusions and don’t make grand statements about that subject.
In recent years, the voice of my father in my head has been accompanied by the voice of one of my good friends, Jeremy Beahan (host of the Reasonable Doubts podcast, producer of my old radio show). Jeremy is one of the most truly rational and skeptical people I’ve ever met. He avoids logical fallacies more consistently than almost anyone I know, and he’s called me out more than once when I’ve oversimplified or engaged in tribalism rather than being genuinely skeptical. I wish I could say that I therefore avoid doing so all the time, but I can’t. I still catch myself at it sometimes and, I’m sure, don’t catch myself at other times when I should.
Jeremy also introduced me to an incredible book that I’ve mentioned many times on this blog, Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts by Carol Tavris and Elliot Aronson. This book is absolutely vital to understanding the kinds of pre-rational heuristics that Dr. X is talking about above. And once we understand the human tendency toward those kinds of thinking patterns, we are better able to catch ourselves doing it and apply the tools of reason to avoid them as much as possible.
So what lessons might we draw from this? Yes, religion operates much like any other system of group identity. But it’s also unique in its inoculation against rational challenge and its ability to prevent members of the group from questioning their core beliefs. And yes, we all behave in similar ways at times. But the only worldview that offers a means of transcending those pre-rational thought patterns to any significant degree is skepticism. We need people around us who challenge us not to conform but to think. George Carlin put this perfectly when he suggested that we shouldn’t just teach people to read but should teach them to question what they read. None of us will ever be Mr. Spock, but some worldviews really do tend to make us more rational than others.