Tavris and Aronson Explain Cognitive Dissonance

Carol Tavris and Elliot Aronson, authors of the amazing book Mistakes Were Made (But Not by Me), have an article on the CSICOP website explaining cognitive dissonance theory and why dissonance is one of the key ways we undermine our own rationality.

The key motivational mechanism that underlies the reluctance to change our minds, to admit mistakes, and to accept unwelcome scientific findings is cognitive dissonance—the discomfort we feel when two cognitions, or cognition and behavior, contradict each other. Leon Festinger, who developed this theory sixty years ago, showed that the key thing about dissonance is that, like extreme hunger, it is uncomfortable, and, like hunger, we are motivated to reduce it. For smokers, the dissonant cognitions are “Smoking is bad for me” versus “I’m a heavy smoker.” To reduce that dissonance, smokers either have to quit or justify smoking. Before we make a decision (about a car, a candidate, or anything else), we are as open-minded as we are likely to be; but after we make a decision, we have to reduce dissonance. To do this, we will emphasize everything good about the car we bought or the candidate we are supporting or the belief we accepted and notice only the flaws in the alternatives.

Dissonance theory comprises three cognitive biases in particular:

1. The bias that we, personally, don’t have any biases—the belief that we perceive objects and events clearly, as they really are. Any opinion I hold must be reasonable; if it weren’t, I wouldn’t hold it. If my opponents—or kids or friends or partner—don’t agree with me, it is because they are biased.

2. The bias that we are better, kinder, smarter, more moral, and nicer than average. This bias is useful for plumping up our self-esteem, but it also blocks us from accepting information that we have been not-so-kind, not-so-smart, not-so-ethical, and not-so-nice.

3. The confirmation bias, the fact that we notice and remember information that confirms what we believe and ignore, forget, or minimize information that disconfirms it. We might even call it the consonance bias, because it keeps our beliefs in harmony by eliminating dissonant information before we are even aware of it.

Dissonance is painful enough when you realize that you bought a lemon of a car and paid too much for it. But it’s most painful when an important element of the self-concept is threatened; your post-car-purchase dissonance will be greater if you see yourself as a car expert and superb negotiator. We have two ways to reduce dissonance: either accept the evidence and change the self-concept (“Yes, that was a foolish/incompetent/unethical thing to do; was I ever wrong to believe that”) or deny the evidence and preserve the self-concept (“That study was fatally flawed”). Guess which is the popular choice?

And while we are all quick to point out such behavior when it is engaged in by those we consider our adversaries or members of another tribe (political, racial, religious, etc.), we rarely recognize it in ourselves. This is the essence of tribalism, and none of us is entirely immune to it. It also relates to what Jonathan Haidt meant when he said that our conscious minds act like press secretaries: their job is to justify the decisions we have already made or the beliefs we hold dear, not to help us reach those decisions or beliefs in a rational manner. And it helps explain the pyramid metaphor, which Tavris and Aronson discuss in their book:

The greatest danger of dissonance reduction occurs not when a belief or action is a one-time thing like buying a car, but when it sets a person on a course of action. The metaphor that we use in our book is that of a pyramid. Imagine that two students are at the top of a pyramid, a millimeter apart in their attitudes toward cheating: it is not a good thing to do, but there are worse crimes in the world. Now they are both taking an important exam, when they draw a blank on a crucial question. Failure looms, at which point each one gets an easy opportunity to cheat by reading another student’s answers. After a long moment of indecision, one spontaneously yields and the other resists. Each gains something important, but at a cost: one gives up integrity for a good grade; the other gives up a good grade to preserve his integrity.

As soon as they make a decision—to cheat or not—they will justify the action they took in order to reduce dissonance, that is, to keep their behavior consonant with their attitudes. They can’t change the behavior, so they shift their attitude. The one who cheated will justify that action by deciding that cheating is not such a big deal: “Hey, everyone cheats. It’s no big deal. And I needed to do this for my future career.” But the one who resisted the temptation will justify that action by deciding that cheating is far more immoral than he originally thought: “In fact, cheating is disgraceful. People who cheat should be expelled.” By the time they finish justifying their actions, they have slid to the bottom and now stand at opposite corners of its base, far apart from one another. The one who didn’t cheat considers the other to be totally immoral, and the one who cheated thinks the other is hopelessly puritanical—and, come to think of it, why don’t I just buy the services of a professional cheater to take the whole course for me? I really need the credits, and so what if I never learn what this class requires? I’ll learn on the job. Hey, neurosurgery can’t be that hard.

As we go through life we will find ourselves on the top of many such metaphorical pyramids, whenever we are called upon to make important decisions and moral choices: for example, whether to accept growing evidence that a decision we made is likely wrong; whether to believe a sensational rape or murder case in the media; whether to blow the whistle on company corruption or decide not to rock the boat. As soon as we make a decision, we stop noticing or looking for disconfirming evidence, and we are on that path to the bottom, where certainty lies.

We crave certainty, illusory or not. Sometimes we can rationally assign a high level of certainty to a claim or position, but often we can’t, and that makes most people uncomfortable. Personally, I try to cultivate an attitude of being okay with nuance and ambiguity, though I’m sure I often fail. The key to overcoming this as much as humanly possible (and it will always be only partial) is to develop habits of mind built on self-questioning and self-awareness, and to surround yourself with people you respect enough that, if they tell you you’re not thinking clearly about a given subject, you take them seriously and think it through rather than becoming defensive. It’s not an easy thing to do.

"Maybe the forcing of payment for past years will be for non Christians only. That ..."

FFRF Wins Lawsuit Over Parsonage Exemptions
"Why do they assume that they will be on the giving end of the tyranny ..."

White Supremacist Thinks Women Should Not ..."

Browse Our Archives

Follow Us!


What Are Your Thoughts?leave a comment