I wrote recently about Jonathan Adler’s conservative and libertarian argument for taking climate change seriously and adopting policies to respond to it. Here’s a recent paper that underscores how important it is to frame an argument effectively if you want people to respond positively to it. The bottom line is that evidence rarely changes anyone’s mind on a subject they feel strongly about; what can change their mind is an argument that appeals to their values and preconceptions.
David Roberts explains the study’s findings. It is tempting to think that if those who reject global warming and the need to make changes to prevent it from getting worse were just exposed to the facts, they would change their minds. But this study pretty clearly disproves that assumption:
However intuitively plausible this answer might be, it suffers from one important flaw: It is wrong. Better educated people are not less likely to be skeptics. Greater scientific literacy and reasoning ability do not incline people toward climate realism. Where skepticism exists, additional information and arguments only serve to reinforce it.
This has been evident for some time, but a fascinating new study in Nature backs it up with numbers. Yale researcher Dan Kahan and his colleagues tested the question directly: Is it true that greater numeracy and scientific literacy reduce polarization about climate science?
Kahan found that, among those with low scientific literacy, assessment of climate risk was high among “egalitarian communitarians” (those with a worldview “favoring less regimented forms of social organization and greater collective attention to individual needs”) and low among “hierarchical individualists” (those with a worldview “that ties authority to conspicuous social rankings and eschews collective interference with the decisions of individuals possessing such authority”).
So what happens as scientific literacy increases? The naive view — what Kahan calls the “science comprehension thesis,” or SCT — predicts that hierarchical individualists with high scientific literacy will more accurately perceive the risk and converge with egalitarian communitarians. But that’s not what happens…
As you can see, the SCT prediction is dead wrong — as science literacy and numeracy increase, polarization rises. Well-educated, carefully reasoning hierarchical individualists are less convinced of the danger of climate change.
The Kahan paper says:
The alternative explanation can be referred to as the cultural cognition thesis (CCT). CCT posits that individuals, as a result of a complex of psychological mechanisms, tend to form perceptions of societal risks that cohere with values characteristic of groups with which they identify. Whereas SCT emphasizes a conflict between scientists and the public, CCT stresses one between different segments of the public, whose members are motivated to fit their interpretations of scientific evidence to their competing cultural philosophies.
And that is why it’s important to speak to them in a way that triggers a positive response rather than a tribalistic shutting down. Roberts explains:
The operative concept here is “motivated reasoning.” The idea is, we begin by absorbing the values of our tribes — what is and isn’t important, what is and isn’t a risk — and use whatever numeracy and scientific literacy we possess to seek out facts and arguments that support those views. Getting smarter, in other words, only makes us better at justifying our own worldviews. It does not necessarily give us more scientifically accurate worldviews.
Kahan suggests a better way:
As citizens understandably tend to conform their beliefs about societal risk to beliefs that predominate among their peers, communicators should endeavor to create a deliberative climate in which accepting the best available science does not threaten any group’s values. Effective strategies include use of culturally diverse communicators, whose affinity with different communities enhances their credibility, and information-framing techniques that invest policy solutions with resonances congenial to diverse groups.
None of this requires being dishonest. It just requires speaking to people in a language they are likely to be open to, making the case on grounds that appeal to them rather than trigger an emotional, kneejerk reaction. That’s how effective coalitions are formed, and it’s how people change their minds. We can sit on opposite sides of a wall screaming at each other and calling the other side crazy, stupid and dangerous, and about a lot of people we’d be right. But if we want to actually change the minds of some of them, there’s a better strategy.