Social Group and Moral Decisions

June 13, 2012

Post by Ann F-R.

This is post #4 on The Righteous Mind: Why Good People Are Divided by Politics and Religion, by Jonathan Haidt. We face a dilemma: studies show that our moral thinking aims more to justify our choices than to guide us onto virtuous paths. We would rather be thought virtuous by others than actually be virtuous in deed. We would rather be confirmed in our predispositions, behaviors, and judgments than alter them when confronted with truth, facts, outcomes, or ethics. Our reputation within our group matters more to us than whether we act righteously, and more than whether our shared beliefs are verifiable, constructive, or shaky in reality. This chapter induces squirms of discomfort, when we’re being honest! The good news: by becoming more accountable, we can implement methods and practices that check and diminish these tendencies within ourselves.

Here’s a way of detecting “reputation” at work in our evaluating: Let’s say Brian McLaren writes something … or Al Mohler … or Jim Wallis … or you see something on The Gospel Coalition website … or something from Tom Wright … or from Mark Driscoll… How do you respond? Do you react to the person, regardless of what is said, or do you consider them someone worthy of conversation? What kind of thinking do you use?

Haidt on accountability: “the most important principle for designing an ethical society is to make sure that everyone’s reputation is on the line all the time, so that bad behavior will always bring bad consequences.” (p.73) Our political scene reflects the worst of us. Accountability could serve as our central emphasis, because our human patterns make it imperative. We need the “explicit expectation that one will be called upon to justify one’s beliefs, feelings, or actions to others…” If liars, slackers, and cheaters experience no consequences, systems break down. (pp. 74-5) Exploratory and confirmatory thought processes differ markedly: “Exploratory thought is an ‘evenhanded consideration of alternative points of view.’ Confirmatory thought is ‘a one-sided attempt to rationalize a particular point of view.’” (p.75)

Accountability reinforces exploratory thought only under three conditions: 1) decision makers know they’ll be held accountable to others, 2) those others’ views are unknown, and 3) those others are well-informed and committed to accuracy. When all three conditions aren’t met, “which is almost all of the time—accountability pressures simply increase confirmatory thought. People are trying harder to look right than to be right.” (pp. 75-6)

Do we feel drawn to follow along – in families, churches, offices, or politics – because following seems rewarding, safe, self-affirming, or comforting to our fears? Do financial and political periods of fear or euphoria exaggerate these tendencies? Consider 2 Corinthians 4, instead.

We are obsessed with polls because our self-esteem is directly connected to how others perceive us. The plus: seeing ourselves reflected in others’ eyes can encourage healthy changes. Conversely, the same reflection drives some people to seek out those who confirm and bolster false beliefs, poor choices, and untrue versions of reality. The latter reveals “confirmation bias,” our tendency to seek supportive evidence while ignoring or discounting contrary evidence. (p.79) The higher the education level and IQ, the more prolific and complex the arguments affirming our bias: “people invest their IQ in buttressing their own case rather than in exploring the entire issue more fully and evenhandedly.” (p.81)

When we forthrightly face what studies (and Scripture!) reveal about ourselves, we should squirm. Instead, we search for plausible deniability – avoiding direct admission of lies, deceit, or misplaced pride – because it’s more comfortable not to be “too” ethical or moral, and to cheat when no one is watching. Most people cheat, but only up to the point where they can no longer find a justification that preserves their belief in their own honesty. (p.83) “If the [elephant’s] rider were in charge of ethical behavior, then there would be a big correlation between people’s moral reasoning and their moral behavior. But he’s not, so there isn’t.” (p.82) “The bottom line is that in lab experiments that give people invisibility combined with plausible deniability, most people cheat. … [Our] inner lawyer is so good at finding justifications that most of these cheaters leave the experiment as convinced of their own virtue as they were when they walked in.” (p.83)

Has the church’s prophetic voice been raised for light and truth, uniformly? Where do we resist transparency and accountability?

“The difference between can and must is the key to understanding the profound effects of self-interest on reasoning.” (p.84) When we want to believe something (i.e., our elephant naturally leans that direction), the rider looks for justification that we can believe it. When we want to reject something as true, the rider searches for reasons to deny that we must believe it. “… if we find a single reason to doubt the claim, we can dismiss it. You only need one key to unlock the handcuffs of must.” (p.84) Haidt addresses non-scientists who deny scientific results. (Haidt’s own work will handily be refuted by us, too, if we aren’t careful to correct the elephantine discomfort that inclines us to reject it.) Our desire to be confirmed in our beliefs is so strong that it affects even sensory perceptions. “[F]or nonscientists, there is no such thing as a study you must believe. It’s always possible to question the methods, find an alternative interpretation of the data, or, if all else fails, question the honesty or ideology of the researchers.” (p.84)

Facing the scholarship of scientists such as RJS, Peter Enns, Bruce Walton, and experts in other fields, have we discerned them as fellow truth-seekers in their areas of gifting, or have we justified dismissing their work, its implications, and them – even with ad hominem attacks?
