Post hoc rationalisation – reasoning our intuition and changing our minds

November 14, 2013

Post hoc rationalisation is what most of us end up doing when we reason. We start with a gut instinct, a potentially irrational or arational judgement arising from cognitive faculties bound up with our whole person, our physical and visceral reactions included, and only afterwards construct reasons for it.

For those of you who have not watched this utterly brilliant TED talk from David Pizarro, do so now. It shows how we make moral judgements based on intuition and then rationalise that ‘decision’ after the fact: gut reaction first, reasons thought up later.

The excellent Jonathan Haidt, a philosophical psychologist, has done a lot of research in this area. As The New York Times says in a review of his book The Righteous Mind:

To the question many people ask about politics — Why doesn’t the other side listen to reason? — Haidt replies: We were never designed to listen to reason. When you ask people moral questions, time their responses and scan their brains, their answers and brain activation patterns indicate that they reach conclusions quickly and produce reasons later only to justify what they’ve decided. The funniest and most painful illustrations are Haidt’s transcripts of interviews about bizarre scenarios. Is it wrong to have sex with a dead chicken? How about with your sister? Is it O.K. to defecate in a urinal? If your dog dies, why not eat it? Under interrogation, most subjects in psychology experiments agree these things are wrong. But none can explain why.

The problem isn’t that people don’t reason. They do reason. But their arguments aim to support their conclusions, not yours. Reason doesn’t work like a judge or teacher, impartially weighing evidence or guiding us to wisdom. It works more like a lawyer or press secretary, justifying our acts and judgments to others. Haidt shows, for example, how subjects relentlessly marshal arguments for the incest taboo, no matter how thoroughly an interrogator demolishes these arguments.

To explain this persistence, Haidt invokes an evolutionary hypothesis: We compete for social status, and the key advantage in this struggle is the ability to influence others. Reason, in this view, evolved to help us spin, not to help us learn. So if you want to change people’s minds, Haidt concludes, don’t appeal to their reason. Appeal to reason’s boss: the underlying moral intuitions whose conclusions reason defends.

Of course, this is disheartening for people like me who set out to change people’s minds through rational discourse, presenting empirical evidence in the process. What this actually appears to do is entrench people in their original beliefs rather than change their minds. As Joe Keohane states:

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

Keohane continues, explaining how this misinformation can become dangerously entrenched:

On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. He led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.

New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.

I was talking to a theologian friend of mine who has become progressively more liberal the more he researches theology, and primarily psychology, on his own terms. His ability to change his mind has little to do with listening to me disagree with him; it comes from researching ideas on his own terms and having ownership of that research, so that he himself bears the responsibility for changing his own mind. This is perhaps the strength behind the recent move for organisations to adopt coaching as a method – using Socratic dialectic to draw new thoughts and ideas out of the coachee themselves.

I remember that this same Christian friend used to deny evolution. I used to hit him with facts and reason, which was met with strong cognitive dissonance and an entrenchment of his view; I did not convince him otherwise. However, he struck up a conversation with the theistic evolutionary biologist Simon Conway Morris who, over several emails presenting the same evidence I had, managed to convince him otherwise.

The power of a ‘friend’ – i.e. someone who broadly shares the same worldview – to change one’s view cannot be overestimated. Belief is deeply psychological. If we want to convince people to change their minds, we have to pick the cognitive and psychological locks that guard them. This includes choosing the right person to do the picking.

Belief in God is one of the strongest, most pervasive beliefs there can be, with ramifications across the believer’s life. Rational evidence and argument, however, won’t cut the mind-changing mustard. More’s the pity.
