Chris Mooney reports on a new study that I don’t find surprising in the least. The study, called Motivated Numeracy and Enlightened Self-Government, finds that our political beliefs can even undermine our ability to do basic math when we encounter data that does not fit those beliefs. It’s really quite a clever study, as Chris explains:
The study, by Yale law professor Dan Kahan and his colleagues, has an ingenious design. At the outset, 1,111 study participants were asked about their political views and also asked a series of questions designed to gauge their “numeracy,” that is, their mathematical reasoning ability. Participants were then asked to solve a fairly difficult problem that involved interpreting the results of a (fake) scientific study. But here was the trick: While the fake study data that they were supposed to assess remained the same, sometimes the study was described as measuring the effectiveness of a “new cream for treating skin rashes.” But in other cases, the study was described as involving the effectiveness of “a law banning private citizens from carrying concealed handguns in public.”
The result? Survey respondents performed wildly differently on what was in essence the same basic problem, simply depending upon whether they had been told that it involved guns or whether they had been told that it involved a new skin cream. What’s more, it turns out that highly numerate liberals and conservatives were even more—not less—susceptible to letting politics skew their reasoning than were those with less mathematical ability.
The math here is really pretty basic. In the skin cream version of the problem, there were, naturally, two groups: those who used the skin cream and those who did not. In the group that used the skin cream, 223 had the rash get better and 75 had it get worse; in the group that did not use the skin cream, 107 had the rash get better and 21 had it get worse (the researchers actually swapped the labels around for different groups of participants). Everyone was asked which of two conclusions those results supported: that people who used the skin cream were more likely to get better, or that they were more likely to get worse. Because that question carries no political baggage on which anyone would have formed an opinion, it gave the researchers a baseline for each participant's ability to analyze the data and answer correctly. Then they used the same numbers, but made the study about gun control.
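To see why this problem trips people up, here is a minimal sketch of the arithmetic in Python, using the counts quoted above; the variable names and the script itself are mine for illustration, not anything from the study:

```python
# Counts from the skin cream version of the problem:
# 223 better / 75 worse with the cream, 107 better / 21 worse without it.
treated_better, treated_worse = 223, 75
untreated_better, untreated_worse = 107, 21

# The intuitive-but-wrong move is to compare raw counts:
# 223 > 107, so the cream "looks" effective.
print(treated_better > untreated_better)  # True, but misleading

# The correct move is to compare the *rate* of improvement in each group.
treated_rate = treated_better / (treated_better + treated_worse)
untreated_rate = untreated_better / (untreated_better + untreated_worse)

print(f"Improved with cream:    {treated_rate:.1%}")    # ~74.8%
print(f"Improved without cream: {untreated_rate:.1%}")  # ~83.6%
```

Comparing raw counts points one way; comparing rates points the other. In this arrangement of the data, the people who skipped the cream actually fared better, and that reversal is exactly the trap the study exploits.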
Not surprisingly, Kahan’s study found that the more numerate you are, the more likely you are to get the answer to this “skin cream” problem right. Moreover, it found no substantial difference between highly numerate Democrats and highly numerate Republicans in this regard. The better members of both political groups were at math, the better they were at solving the skin cream problem.
But now take the same basic study design and data, and simply label it differently. Rather than reading about a skin cream study, half of Kahan’s research subjects were asked to determine the effectiveness of laws “banning private citizens from carrying concealed handguns in public.” Accordingly, these respondents were presented not with data about rashes and whether they got better or worse, but rather with data about cities that had or hadn’t passed concealed carry bans, and whether crime in these cities had or had not decreased…
So how did people fare on the handgun version of the problem? They performed quite differently than on the skin cream version, and strong political patterns emerged in the results—especially among people who are good at mathematical reasoning. Most strikingly, highly numerate liberal Democrats did almost perfectly when the right answer was that the concealed weapons ban does indeed work to decrease crime (version C of the experiment)—an outcome that favors their pro-gun-control predilections. But they did much worse when the correct answer was that crime increases in cities that enact the ban (version D of the experiment). The opposite was true for highly numerate conservative Republicans: They did just great when the right answer was that the ban didn’t work (version D), but poorly when the right answer was that it did (version C).
This is a fascinating but not at all surprising result. When we have a firmly held belief about an issue, our ability to think rationally about it is often diminished, and the more passionately we hold that belief, the less rational we are likely to be when evaluating evidence that might show it to be false. Confirmation bias, motivated reasoning, defensiveness and self-justification all undermine our ability to be objective in such situations. And counter-intuitively, the smarter we are, the worse we are likely to be at staying objective.
Kevin Drum reacts to these results:
On the other hand, the effect size is pretty stunning. There’s a huge difference in the rate at which people did the math correctly depending on whether they liked the answer they got. I’d like to see some follow-ups with more subjects and different questions, but it sure looks as if we’d probably see the same dismal effect.
How big a deal is this? In one sense, it’s even worse than it looks. Aside from being able to tell that one number is bigger than another, this is literally about the easiest possible data analysis problem you can pose. If ideologues actively turn off their minds even for something this simple, there’s really no chance of changing their minds with anything even modestly more sophisticated. This is something that most of us pretty much knew already, but it’s a little chilling to see it so glaringly confirmed…
We believe what we want to believe, and neither facts nor evidence ever changes that much. Welcome to planet Earth.
And this is not limited to politics, of course. We do the same thing with our religious beliefs (or beliefs about religion) and even in our personal relationships. It also suggests, as Mooney notes, that merely educating people or showing them the facts will not make them think more rationally about an issue:
For study author Kahan, these results are a fairly strong refutation of what is called the “deficit model” in the field of science and technology studies—the idea that if people just had more knowledge, or more reasoning ability, then they would be better able to come to consensus with scientists and experts on issues like climate change, evolution, the safety of vaccines, and pretty much anything else involving science or data (for instance, whether concealed weapons bans work). Kahan’s data suggest the opposite—that political biases skew our reasoning abilities, and this problem seems to be worse for people with advanced capacities like scientific literacy and numeracy. “If the people who have the greatest capacities are the ones most prone to this, that’s reason to believe that the problem isn’t some kind of deficit in comprehension,” Kahan explained in an interview.
In short, ideology usually trumps rationality — and the smarter we think ourselves to be, the more likely we are to fall victim to it.