The Psychology Behind Why We Reject Evidence

June 28, 2016


Many of us have had the frustrating experience of debating someone whose argument was not based on any sort of solid evidence. We call out the person’s faulty reasoning and they seem totally unfazed. They end up saying something like we are “not being open-minded enough” or that they are “entitled to their opinion.” This of course is not fun because such discussions ultimately become rather circular. For example:

Person A: “Vaccines cause autism”

Person B: “Is there a scientific study to support that claim?”

Person A: “No because science is a conspiracy and scientists are all paid shills”

Person B: “That’s ridiculous”

Person A: “You just need to be open-minded”

Person B: “I am open-minded, but you don’t have any good evidence”

Person A: “I already proved my point, you are not being open-minded enough”

It’s important to be open-minded, but we should try not to be so open-minded that our brains fall out. In other words, we should always be open to new ideas, but if those ideas are not supported by solid evidence, we should reject them. What’s interesting is that Person A presents their “argument” in a faulty and circular fashion, yet it still allows them to reduce their cognitive dissonance enough to avoid serious reflection on their position.

Now, some of this is education. I don’t think our education system does a great job of teaching skepticism and helping people distinguish good sources from bad ones.

But what I think (and what I’ve been studying academically) is that many times people are motivated to reject information that conflicts with their worldview. Their worldview is bound up with their identity and how they see themselves, so they perceive conflicting information as an actual attack on their self-worth, which makes them engage their defensive biases. When such defensive biases are engaged, people are likely to maintain or even strengthen their previously held beliefs despite being exposed to conflicting information. This phenomenon is called the “backfire effect” if you’d like to read more about it.

With those defensive biases engaged, circular arguments like the one in my example don’t seem as bad, because they are enough to protect one’s worldview. Through motivated reasoning, we are biased to pay attention to information that confirms our worldview and to reject information that is not consistent with it. That is why it can be so tricky to reason with people who are experiencing the backfire effect. They are motivated to accept information that is consistent with their position, even if that information is rather flawed.

My example here is rather simplistic, but this psychological process happens to everyone and can be harder to detect when it’s more subtle. To mitigate this effect, we first need to separate a person’s identity from the information, and then get them to reflect. But doing that is difficult and is something we are still trying to figure out in the social psychology world!

There is some research suggesting that boosting one’s sense of self (called self-affirmation in the literature) can actually make people more open to identity-threatening information. This makes sense if people’s self-worth really is attacked when they are exposed to identity-threatening information. For example, having someone write about their important values can make them more open to seeing the other side of a political argument. In a future blog post, I’d like to describe this process in more detail. I’m biased because it’s what I’m studying for my PhD, but I think it’s a fascinating research area!

[Featured image from David Goehrin under Creative Commons 2.0]
