
Maybe people CAN change their minds after all

September 10, 2021

If someone has a belief that is objectively wrong—that is, a belief that an unbiased observer equipped with all relevant facts would judge as false—giving them the correct information isn’t likely to change their mind. Indeed, they usually double down on their original, false belief. This is the Backfire Effect.

But this corrective feels so right! The other guy has come to the wrong conclusion, and once I give him the correct facts, he’ll cheerfully thank me and switch to the correct opinion, right? That sounds reasonable, but no—changing one’s opinion is painful, and this response to a person with a wrong belief only makes the problem worse.

I’ve explored approaches that minimize the Backfire Effect, but here’s an approach that should be more productive (taken from “Why Facts Don’t Change Our Minds” by Elizabeth Kolbert). More on that shortly; getting there is an interesting journey.

Study 1: biased weighing of data

The article describes a number of studies that reveal the embarrassingly inept way our minds sometimes work. In one study, half of the participants were in favor of capital punishment and half were not. Each participant was given two studies, one arguing each side of the issue. These studies were actually fabricated, but the data they presented was equally compelling mathematically. Participants reported that the study supporting their own opinion was far more compelling than the other (this is confirmation bias). Afterwards, they were asked about their views. Unsurprisingly, they were more entrenched than they’d been at the start. This is the Backfire Effect.

Why are we susceptible to poor thinking?

This human failing enables America’s new vogue of fake news and alternative facts. But since this thinking isn’t logical, why do people do it? Why are they biased toward confirming evidence, and why does presenting disconfirming evidence prompt them to double down?

Since this is pretty much universal, it’s an evolved trait, but what survival value could it have? Shouldn’t believing true things always be better?

Some researchers say that it developed in a society where humans had to work together. A cooperating society wants to encourage members who contribute, but it must punish freeloaders—possibly even to the point of exile. That’s a draconian punishment because, in a primitive society, living on your own is much harder than being part of a community.

Human reason didn’t evolve to weigh economic policy options or evaluate social safety nets, but, according to this theory, it evolved to defend one’s social status. Winning arguments is important, and self-confidence helps. Doubting your position is not a good thing. The reflective tribal member who says, “Well, that’s a good point—maybe my contribution to the group has been sub-par” risks exile.

(Another area of thought where we are surprisingly poor is probability—surprising because we seem to bump into simple probability questions all the time. I’ve written about the Monty Hall Problem here and about simple puzzles that reveal our imperfect thought process here.)

Study 2: explain your answer

In this study, graduate students were first asked to evaluate their understanding of everyday devices—toilets, zippers, cylinder locks, and so on. Next, they were asked to write a detailed explanation of how the devices worked. Finally, they again rated their understanding of these devices. Being confronted with their incompetence (yeah, how does a toilet work?!) caused them to lower their self-rating.

It’s easy to think of how we operate these simple gadgets (just the user interface) and overestimate our understanding of how they work under the hood. This encapsulation is important for progress—you don’t understand how a calculator works, but you know how to operate it. The same is true (for most of us) for a car, a computer, a cell phone, or the internet. We know how to buy hamburger or a suit, but we don’t understand the particulars of how they got to the store.

This encapsulation extends into public policy—we (usually) don’t understand the intricacies of policy proposals like cap and trade or trade deals like NAFTA or TPP. Instead, we rely on trusted politicians and domain experts to convince us of the rightness of one side of the issue (or more likely, do what needs to be done and not bother us with it).

Study 3: policy questions

That brings us to one final study, modeled on the last one. Participants were asked their opinions on policy questions like single-payer health care or merit-based pay for teachers and then were asked to rate their confidence in their answers. Next, they were asked to explain in detail the impact of implementing each proposal. Finally, they were asked to reevaluate their position. Having just struggled to explain the details of their favored proposal, they dialed back their confidence.

This finding may be relevant to our interactions with people arguing for scientific or historical claims like Creationism or the Resurrection, or for social policies like making abortion illegal or “natural marriage” (one man, one woman). Instead of pushing back, ask them to explain their position. Let them stew in their own confusion. Avoid the snarky retort (tempting, I know), which would trigger the Backfire Effect.

This research is equally applicable to ourselves. Find or create opportunities to explain how your favored policy, if implemented, would work and then ask yourself how this exercise changes your opinion. Is it still a no-brainer? Or have you uncovered obstacles that might make success more elusive?

How do they interpret your question?

When you ask someone, “Do you accept evolution?,” you may see this as a straightforward question about opinion or knowledge. For some, however, you’re asking about who they are. “I am a Christian,” they think, “and my kind of Christian rejects evolution.” Your straightforward question becomes in their mind, “Do you reject Jesus Christ as Lord and savior?,” to which the answer is, obviously, No. Other personal questions potentially fall into the same trap—questions about abortion or same-sex marriage or even climate change or COVID vaccinations.

Physician, heal thyself

Consider working on this problem within yourself, and you can begin to understand how hard it would be to do a 180 on some central tenet of who you are. And that’s how hard it is for the other person.

I’ll use myself as an example. Correcting myself on some heated issue in the news isn’t that big a deal. For example, I’ve mocked people for insisting that hydroxychloroquine or ivermectin protects them against COVID, but I’ll quickly make an about-face once the medical community does. Imagine instead that I rejected my atheism and concluded that Christianity is true. That would be a big deal, especially since undercutting Christianity in this blog has been my job for the last decade.

For some insight, let’s turn to “Facts Don’t Change People’s Minds. Here’s What Does” by Ozan Varol. To help walk back a belief you no longer hold:

The key is to trick the mind by giving it an excuse. Convince your own mind (or your friend) that your prior decision or prior belief was the right one given what you knew, but now that the underlying facts have changed, so should the mind.

Gracefully accept the new, correct view from your debate opponent and avoid the tempting “Told you so.”

Going forward, avoid a dogmatic kind of declaration. Separate yourself from the facts so that if the “facts” eventually go down the toilet they don’t pull you with them.

A possible solution, and one that I’ve adopted in my own life, is to put a healthy separation between you and the products of you. I changed my vocabulary to reflect this mental shift. At conferences, instead of saying, “In this paper, I argue . . .,” I began to say “This paper argues. . . .”

This subtle verbal tweak tricked my mind into thinking that my arguments and I were not one and the same.

Finally, remember that from their point of view your opponent’s position makes good sense. Obviously—if it didn’t, they’d hold some other position.

Now imagine you’re arguing for accepting climate change.

If employment is the primary concern of the Detroit auto worker, showing him images of endangered penguins (as adorable as they may be) or Antarctica’s melting glaciers will get you nowhere. Instead, show him how renewable energy will provide job security to his grandchildren. Now, you’ve got his attention.

The same part of the brain that responds to a physical threat responds to an intellectual one. (Oatmeal comic)

(This is an update of a post that originally appeared 3/23/17.)

Image from Wesley Eller (license CC BY 2.0)
