Barack Obama is a Christian. He easily passes the tests you’d give to anyone else: he uses Christian language, he goes to church, and (most importantly) he says he’s a Christian!
It’s been fact checked, as if that would be necessary. Turns out that, yes, he’s a Christian.
But you wouldn’t know it from the polls. In March 2008, before Obama was elected president, polls showed 47% of Americans accepted that he was Christian, 12% said Muslim, and 36% didn’t know. With time, this groundless confusion should dissolve away, right? Nope. Four years later, the 2012 poll showed similar results.
Another poll in Mississippi found 12% saying Christian and 52% Muslim (and 36% Don’t Know). Among “very conservative” voters, it was 3% Christian, 58% Muslim, and 39% Don’t Know. That was in 2012. In America.
This example shows that we well-educated moderns don’t always accept obvious facts. If that’s true of us, who could doubt that first-century Christians might have recorded events imperfectly? But that’s just a corollary observation. I want instead to explore how this deeply embraced misinformation gets into our heads and stays there.
The natural response for skeptics like me is to suppose that misinformed people simply don’t have the correct facts. These people are eager to know the truth, and if we provide them with the facts, the misinformation will vanish.
In some cases, this is true. A correction that doesn’t push any buttons can work. It’s easy to accept a more efficient driving route to work or a new accounting policy. In situations like politics, however—as the “Obama isn’t a Christian” example shows—things are more complicated. And here’s the crazy thing: presenting people with the correct information can reinforce the false beliefs. That’s the Backfire Effect.
One helpful article (“How facts backfire”) notes that it’s threatening to admit that you’re wrong, especially where one’s worldview is involved, as with politics and religion. The article calls the Backfire Effect a defense mechanism that avoids cognitive dissonance.
In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. Rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions.
It gets worse. I’ve written before about the critical but often overlooked difference between confidence and accuracy in memories—how a confident memory isn’t necessarily an accurate one. Studies of the Backfire Effect show that the people most confident in their grasp of the facts tend to be the least knowledgeable about the topic. That is, those most in need of correcting their beliefs are the least likely to do so.
This isn’t just an academic issue. These people are voters, and their ignorance affects public policy.
(As an aside, this is related to the Dunning-Kruger effect, in which more competent people rate their ability lower than it actually is, while less competent people do the reverse. The hypothesis is that less competent people are too incompetent to appreciate their own incompetence.)
How can we humans be as smart as we are but have this aversion to correct information? The human brain seems to seek consistency. It’s mentally easy to select confirming information and ignore the rest. Reevaluating core principles is difficult and stressful work.
Let’s not be too hard on ourselves, though. If we had to continually reevaluate everything, we’d never get out of bed in the morning. Cognitive shortcuts make sense, usually.
In part 2, we’ll conclude with a look at how to correct misinformation without triggering the Backfire Effect.
The door of a bigoted mind opens outwards so that
the only result of the pressure of facts upon it
is to close it more snugly.
— Ogden Nash
Photo credit: adrian capusan