In part 1 of this series, I argued that we should make it our goal to seek the truth at all times and then follow it!
After all, if we believe that Jesus is the Truth (John 14:6), we have nothing to fear. All truth will only point us to Christ, which is the goal of the Christian life. As Paul says, “We proclaim Him, admonishing every man and teaching every man with all wisdom, so that we may present every man complete in Christ” (Col 1:28).
We would like to think that most people, especially ourselves, are open and honest. We assume that the more information we have, the better decisions we will make. What I have been contending in this series of posts is that more information doesn’t always help! In fact, it tends to make us more dogmatic and more entrenched in our convictions.
This is shameful. And, as I will contend in part 4 of this series, it does great harm to our Christian witness.
Why isn’t more knowledge necessarily helpful?
In part 1 of this series, I noted that we have a heart issue: “The heart is more deceitful than all else and is desperately sick” (Jer 17:9). This keeps us from being open-minded.
In part 2 of this series, I continued the discussion by noting our pride hinders us from being objective and honest evaluators of information. We all have trouble admitting when we are wrong.
Furthermore, I also noted that we deny the truth because we are too often unwilling to change our behavior.
In this post, I wish to look more deeply at the scientific data that supports the inference that simply having more information does not always lead us to the truth.
We are all biased
We are all biased. Our natural tendency is to look at the world through our own personal biases. For example, if I believe that person A is a good person, I will be more inclined to assume that his or her motives were good in a given situation, and I will tend to filter out contrary evidence.
This can, of course, be overcome. People do change their beliefs and their ways.
According to the book of Acts, many of the early converts to Christianity were priests and leaders in Jerusalem (Acts 6:7). This is incredible when one thinks about it.
According to the biblical account, many of the religious leaders were opposed to Jesus and His claims. Many of them even sought His death. Why, then, would those who were so opposed to Jesus while He was alive suddenly become His followers after His death? When one considers the risk that such persons were taking in coming to Christ, their conversion becomes all the more remarkable.
Too often, however, this sort of transformation marks the exception rather than the rule. Most people are unwilling to look at information objectively.
Motivated reasoning
One of the reasons why we do not look at facts objectively is what is known in the field of cognitive science as “motivated reasoning.”
Motivated reasoning is very evident when it comes to sporting events. Was the referee’s decision correct? We might want to believe that, with the advent of video review, there would be an objective answer to that question in most instances (though there will always be a measure of subjectivity in some sporting decisions).
We all know, however, that the answer to “Was the referee’s decision correct?” is most often determined by whether or not the call benefited my team.
Well, I hate to break it to you, but it ain’t just sports in which people are not objective. It is most of life!
Just last night I was watching a global newscast (BBC) in which there was a report on human rights violations by the Chinese government against the Uyghurs—a minority Muslim population that has suffered greatly for many decades in western China. The report showed “verified” video evidence of the severe mistreatment of Uyghurs. The Chinese ambassador to the UK, who was a guest on the program, was asked about the situation. His response: he simply denied the veracity of the video footage. He didn’t address the question at all, because he rejected the evidence.
Perhaps the leading researcher on motivated reasoning is Dan Kahan of Yale Law School.[1] His research has confirmed that, oftentimes, the more information we have, the less objective and the more entrenched we become.
Ezra Klein, reporting on the research of Kahan and his co-authors, Ellen Peters, Erica Cantrell Dawson, and Paul Slovic, noted that “The More Information Hypothesis isn’t just wrong. It’s backwards. Cutting-edge research shows that the more information partisans get, the deeper their disagreements become.”[2]
Kahan and his colleagues designed an experiment to discern how personal preference influences one’s objectivity.
They began by presenting individuals with data relating to a skin cream. The participants were to determine whether the skin cream was effective at curing a rash. Essentially, all one had to do to determine the effectiveness of the skin cream was to solve a simple math problem. In the data, there were two groups of people:
- Group A was given the cream and applied it to their rash. Some of them got better, while others got worse.
- Group B was not given the cream. Some of them got better, while others got worse.
In order to discern if the cream was effective, the respondents simply had to compare the numbers.
The data they were given showed that while twice as many people who received the cream got better compared to those who did not receive the cream, three times as many who received the cream got worse compared to those who did not. Simple math tells us that the cream was not an effective product: twice as many people may have been helped, but three times as many were made worse. The key is to compare the proportion who improved in each group, not the raw counts.
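For the quantitatively minded, the comparison can be sketched with illustrative numbers. (These figures are my own, chosen only to match the ratios described above; they are not the actual figures from Kahan’s study.)

```python
# Hypothetical counts matching the ratios described above:
# twice as many improved with the cream, three times as many worsened.
cream_better, cream_worse = 200, 75      # received the cream
control_better, control_worse = 100, 25  # did not receive the cream

# The right question is not "how many improved?" but "what share improved?"
cream_rate = cream_better / (cream_better + cream_worse)
control_rate = control_better / (control_better + control_worse)

print(f"Improved with cream:    {cream_rate:.1%}")     # ~72.7%
print(f"Improved without cream: {control_rate:.1%}")   # 80.0%
print("Cream effective?", cream_rate > control_rate)   # False
```

Even though a larger raw number of people improved with the cream, a larger proportion improved without it, so the data do not support the cream’s effectiveness.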
Kahan noted that the respondents’ ability to correctly determine the effectiveness of the cream correlated directly with their mathematical abilities.
Motivated reasoning applied to politicized issues
Kahan then took the exact same figures used in the cream scenario (twice as many got better, but three times as many got worse) and applied them to a politicized problem, such as gun control or climate change.
He noted that suddenly the respondents’ math skills seemed to matter less. Instead, their ideology had a significant impact on their ability to look at the data. That is, people’s math skills were no longer the most significant factor in whether or not they were able to analyze the data correctly.[3] Rather, they often chose to read the data in a manner that confirmed their prior convictions.
When it came to the skin cream, since they had no prior convictions, and since they did not particularly care whether the cream worked or not, most respondents were able to analyze the data and use basic math skills to determine if the cream was effective.
However, when it came to climate change or gun control, all bets were off. Most respondents began the analysis with a predisposition for or against the proposition and, consequently, chose to analyze the data in a manner that confirmed their prior convictions!
Why does this matter?
In part 4, I will take up this question more fully. For now, I’ll mention two significant reasons:
- Our growth into the likeness of Christ correlates to some extent with our knowledge of the truth.
- The witness of God’s people is at stake.
[1] https://scholar.google.com/scholar?q=dan+kahan+motivated+reasoning&hl=en&as_sdt=0&as_vis=1&oi=scholart
[2] https://www.vox.com/2014/4/6/5556462/brain-dead-how-politics-makes-us-stupid. Last cited June 5, 2020.
[3] For a TED Talk presentation by Kahan on this matter, see: https://www.youtube.com/watch?v=1KFtQV7SiII