You wake up one morning to a phone call. On the other end, a friend’s voice excitedly tells you that he knows where to find a leprechaun’s pot of gold in a nearby park. Do you want to come help him dig for it? If you are a normal person, you will roll your eyes, hang up, and (hopefully) go back to sleep. But according to new research from the University of Iowa, if you have damage to a specific area of your neocortex related to doubt and skepticism, you might jump up and start looking around for your shovel. And guess what – if so, you’re also more likely to be a religious fundamentalist.
We all know how important it is to critically evaluate the truth claims we encounter, whether from friends, media outlets, or religious leaders. If you’re too gullible, you might get fleeced by scam artists, choose poor romantic partners or friends, or spend a lot of hard-earned money on miracle weight-loss drugs. And yes, you might also be more likely to believe religious claims that mainstream culture finds odd or even delusional – for example, fundamentalist claims about a six-day creation or the literal inerrancy of the Bible.
Drawing from recent research in neuroscience, Erik Asp, Kanchna Ramchandran, and Daniel Tranel of the University of Iowa College of Medicine formulated and tested a model that shows how brain function is related to gullibility, beliefs, and religious fundamentalism. Imaging studies have shown that critical evaluation of beliefs and truth claims is related to the ventromedial prefrontal cortex (vmPFC) – the part of your brain more or less directly over the space between your eyes – which suggested that damage to this area should lead to reduced doubt and increased gullibility.
Asp and his colleagues formulated a model they dubbed the “False Tagging Theory” to describe how the vmPFC drives the brain’s doubting process. Perhaps counterintuitively – but importantly – they assumed that the brain initially believes all claims it encounters. In fact, according to their model it’s not even possible to understand anything you hear unless you first provisionally believe it. That’s right: you’ve believed everything anyone has ever told you, if only for a fraction of a second, from the amount you owed for that takeout Chinese dinner in 2005 to the existence of Santa Claus. The difference comes when the vmPFC evaluates the ramifications and implications of that belief and determines whether to “flag” that belief with a doubt marker. Obviously false or manipulative claims receive the “doubt tag” and are then emotionally experienced as feeling untrue.
The model, hinging as it does on the assumption that we first need to provisionally believe something in order to evaluate its truth, predicts that if something goes wrong with the vmPFC then we should experience problems with the doubt-tagging process. Our initial affirmative beliefs will persist, and we’ll continue believing outlandish or unlikely claims.
Asp, Ramchandran, and Tranel tested their model by recruiting a group of patients from the University of Iowa Hospitals and Clinics who had suffered damage to the ventromedial prefrontal cortex. They compared these patients to others who had suffered damage to unrelated areas of the brain, as well as to a third group of patients who had recently undergone non-neurological medical emergencies. By choosing control groups that were also recovering from recent medical trauma, the researchers hoped to ensure that any effects they might find among their target patients wouldn’t just be the result of having experienced a profound medical event.
The participants, who were similar across groups in age, gender, and religious affiliation, filled out a survey that measured how fundamentalist their religious beliefs were. Another survey gauged the extent to which their personalities could be described as “authoritarian,” while a third questionnaire asked whether patients’ beliefs in four specific Christian doctrines had changed since their injury.
As the investigators expected, participants who had experienced damage to their vmPFCs showed significantly higher levels of fundamentalist belief than the participants in the two control groups. They also reported greater increases in religious belief since their injury, as well as higher levels of authoritarianism. Neither the participants with damage to unrelated areas of the brain nor the patients with non-neurological injuries or illnesses showed comparable results. Interestingly, the vmPFC patients also didn’t show any deficits in IQ or tests of executive cognitive function. The only area of cognitive functioning that seemed to have been affected was the ability of the brain to label certain beliefs as doubtful or false.
The fact that the vmPFC patients showed higher authoritarianism also fits into the researchers’ model. Authoritarianism is a psychological construct that theorists and researchers have used since the 1950s to describe a personality that prefers strong authority figures, tends to accept leadership uncritically, and can be brusque and punitive in dealings with social inferiors. Asp and his colleagues predicted that patients with damage to the ventromedial prefrontal cortex would express more authoritarianism because authoritarians are averse to questioning beliefs handed down from social superiors and feel uncomfortable when they’re not sure what to believe. Critical doubt, in other words, isn’t a strong suit for authoritarians. The fact that the patients with damaged vmPFCs did, in fact, show elevated levels of authoritarianism gave Asp and his team yet more support for their model.
So what do these results mean for religion and belief? Is religious belief merely the product of a malfunctioning “doubt mechanism” in the frontal cortex? Of course not. Any sentence that begins with “Religious belief is merely…” is guaranteed to offer an incomplete picture at best, and sophomoric, intolerant posturing at worst. Religious belief is extremely complex, and it’s impossible, even foolish, to try to reduce it to one particular brain function or another (or lack thereof).
But it’s also true that some religious beliefs may be easier than others to incorporate into a worldview that jibes with modern science and culture. Unfortunately, fundamentalist religious claims don’t tend to fit into this category – a fact that, in America, is helping to drive a dangerous rift between conservative religious believers and their (often) secular, scientifically educated counterparts. Part of what it means to study religion critically is to confront the fact that not all religious phenomena are necessarily good, and that some religious beliefs may be worth criticizing from an evaluative perspective. This is a tricky line to draw, and the results presented by Asp and his colleagues probably don’t warrant calling up your fundamentalist Christian uncle to accuse him of brain damage. (Actually, that might be a bad idea even if they did.)
What this research does do, though, is raise the possibility that some religious beliefs may be more vulnerable than others to the cognitive processes that lead to doubt. Presumably, the more logically fragile and evidentially unsupported religious claims are, the more belief in them will differ between people with high- and low-functioning vmPFCs. More theologically subtle or sophisticated beliefs, for instance those propounded by the medieval mystic Meister Eckhart – who insisted that God was not a person, but instead an underlying reality that couldn’t be expressed in words – might not be so vulnerable to these doubting mechanisms. This doesn’t mean that one kind of religion is automatically better than another. It just means that, like everything else our brains encounter, religion is something worth examining critically.