The Backfire Effect refers to a phenomenon where, upon being corrected regarding a piece of misinformation, someone comes to believe that misinformation even more strongly. This is something I have brought up in the past as a reason why mere facts simply aren’t good enough for changing minds. Our beliefs are often so deeply ingrained that contradictory evidence can cause us to dig in our heels even deeper. That being said, a paper published by Wood and Porter in December in Political Behavior casts a lot of doubt on the idea.
The paper includes five separate studies with a combined sample of 10,100 people of various political affiliations, recruited from Mechanical Turk, a crowdsourcing platform used to collect large amounts of human-subject data. Participants were exposed in various ways to specific pieces of political misinformation; a random subset were then given corrections, and all participants were asked to agree or disagree with survey items related to these statements and corrections. The studies included:
- Subjects were exposed to misinformation from a single partisan figure (like Ted Cruz or Barack Obama) that fit a certain ideological narrative. They were then randomly exposed to or not exposed to a correction from a neutral government source.
- Subjects were exposed to misinformation from a partisan figure that could be a Democrat or a Republican. The misinformation from either source could be corrected with the same statement, which was given to subjects at random.
- Subjects were asked to read a mock article with misinformation in it, where random participants read entries that contained a corrective paragraph with information from a neutral source.
- Subjects were given a mock article with randomized corrections, and then asked how much they agree with survey items that are worded in various degrees of complexity.
- Similar studies on a smaller scale comparing subjects’ response on the Mechanical Turk platform to Lucid, a more nationally representative survey platform.
Across these tests, the study surprisingly found no sign of a backfire effect among participants. Upon correction, participants were more likely to revise their position and less likely to agree with the misinformation, regardless of ideological bent.
One of the original papers on the Backfire Effect (Nyhan and Reifler, 2010) is discussed in the paper. This more recent study tries to replicate its results, using Mechanical Turk to crowdsource participants instead of the 2010 study’s sample of college students (a population often used for convenience in psychological studies). The 2017 study proposes that the survey in the 2010 study was unnecessarily wordy and complicated, which may have confused participants and induced a backfire effect. The fourth study in the 2017 paper analyzes survey results as a function of complexity, varying the level of detail in the statements participants were asked to rate on a 5-point scale.
The original Nyhan and Reifler paper analyzed the backfire effect regarding the claim that there were Weapons of Mass Destruction (WMDs) in Iraq. Participants were asked to agree on a 5-point scale with the following statement:
“Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.”
Wood and Porter claim that there are so many factual assertions packed into this statement that survey-takers might be overwhelmed and retreat to their preconceived notions. They compared responses to that survey question with responses to the following one:
“Following the US invasion of Iraq in 2003, US forces did not find weapons of mass destruction.”
With the simpler question, respondents were far more likely to correct their understanding that there were, in fact, no WMDs found in Iraq. According to Wood and Porter, liberals adopted the factual correction on WMDs when it was presented this way. Per the paper’s summary:
We find that backfire is stubbornly difficult to induce, and is thus unlikely to be a characteristic of the public’s relationship to factual information. Overwhelmingly, when presented with factual information that corrects politicians–even when the politician is an ally–the average subject accedes to the correction and distances himself from the inaccurate claim.
The paper acknowledges the incongruity between the findings of the 2010 paper by Nyhan and Reifler and these studies. The authors hypothesize that this is a sampling issue: college students do not represent the United States as a whole. People are predisposed to avoid cognitive effort, and developing counterarguments to corrections requires cognitive effort.
When someone develops a counterargument to a statement, they may feel reassured in their beliefs, sensing that their stance has stood up to some scrutiny, even if they happen to be wrong. This is presumably what caused the backfire in response to corrections. Wood and Porter note that it’s well established that college students are more inclined to engage in cognitive effort than the general population, making them more susceptible to this particular phenomenon.
This does have some interesting implications. In online and public arguments, people tend to maintain their stance throughout. Perhaps their minds are changed, but they keep posturing to save face. Backing down in the face of a correction, right or wrong, often makes us feel sheepish or foolish for saying something incorrect in the first place. It’s socially very difficult to back down from a position we’ve already staked out. This effect could easily be amplified for people with much larger audiences, like celebrities and politicians, since far more eyes are on them and likely to notice when they retreat from a position. It may be difficult for a politician to issue a correction, as doing so weakens the image of their political platform (as well as their party).
As such, taking a stance in a conversation with someone else may be vastly different from taking a survey. If someone is wrong and learns so in a private setting, it’s much easier to agree with the correction, since nobody has witnessed them saying anything wrong. I suspect the results here would differ in a study where, for example, subjects were in a discussion with someone who disagreed with them. If a subject’s position is based on misinformation and it is corrected in front of another person, they may be less likely to back down than they would be in a private survey.
Ultimately, while this does seem like a thorough study and the numbers check out, I have to say that the backfire effect definitely exists as a strong phenomenon, and I’m more certain than ever at this point.
I kid, of course.
I have brought up the backfire effect in the past, and the evidence now points less strongly to its relevance. Of course, no single study is definitive, but this does call into question how reliably we can say the effect exists. That being said, I maintain that merely presenting bare facts is not always an effective way of changing hearts and minds on divisive issues.
For example, even this study seems to recognize that who we get information from matters, as subjects more readily accepted corrections of figures they disagreed with than corrections of politicians on their own side. We also know that social pressure is a major factor in how we develop our worldview. It could be that after the study, some subjects returned to the misinformation sources they trusted and reverted to their misinformed views.
Regardless of what causes people to change their positions, we should be far more cautious about invoking the backfire effect when discussing the best ways to inform and educate people.
Featured Image from Pixabay.