Evil: Bad Apples or Bad Barrels?

April 29, 2016

Historically, one of the weaknesses of liberal religious traditions has been a naïve optimism. In rejecting the extreme pessimism of many orthodox religious traditions (a belief in “Original Sin,” the “total depravity” of human beings, and the corrupt nature of the world and society), many progressives overestimated the perfectibility of human nature, the possibility of building utopian societies, and the inevitability of progress “onward and upward forever.” Whereas many orthodox religious traditions have overestimated “evil,” treating it as literally real, liberal religious traditions have underestimated “evil,” treating it as something that human reason can readily overcome.

As the Russian writer Aleksandr Solzhenitsyn (1918 – 2008) wrote in The Gulag Archipelago:

If only it were all so simple! If only there were evil people somewhere insidiously committing evil deeds, and it were necessary only to separate them from the rest of us and destroy them. But the line dividing good and evil cuts through the heart of every human being…. During the life of any heart this line keeps changing place; sometimes it is squeezed one way by exuberant evil and sometimes it shifts to allow enough space for good to flourish. One and the same human being is, at various ages and various circumstances, a totally different human being. At times…close to being a devil, at times to sainthood…. From good to evil is one quaver, says the proverb. And correspondingly, from evil to good. (75)

When I look into my own heart, I know this to be true.

When I try to define evil (or any other such heavily freighted word), I try to keep in mind that the words in our human language are imperfect tools for communication that have evolved in meaning over time for contingent historical reasons. From my own tradition of Unitarian Universalism, one of our Unitarian forebears, Cyrus Bartol, quipped in his 1872 book, Radical Problems, “I spell my God with two o’s and my devil without a d” (103). Instead of a literal, metaphysical Devil, early humanists removed the initial “d” and simply called some actions “evil.” And instead of a literal, metaphysical God, they added an “o” and simply called some actions “good.”

To explore more deeply, I would like to invite you to consider the social psychologist Stanley Milgram’s infamous Obedience Experiments. (I highly recommend both the 2015 film Experimenter about Milgram as well as the book about him, titled The Man Who Shocked the World.) Stanley Milgram (1933 – 1984) was the son of Jewish immigrants from Eastern Europe (1). Both of his parents’ families arrived in this country well before World War II, but his Jewish heritage contributed to his interest in understanding how human beings could end up participating in genocide.

Another influence on how Milgram designed his work was a previous study on independence and conformity by one of his mentors, Solomon Asch. Asch’s most famous experiment invited a volunteer test subject to sit at a table with seven other participants. Allegedly, the study was about perceptual judgment, but here’s the twist: the seven other people sitting at the table were all actually working with Asch and following a script.

Asch would stand in front of all eight participants and hold up different sets of four vertical lines. Each person, in turn, had to say aloud which of the three comparison lines matched the first, reference line. There were eighteen sets of lines in total. The seven scripted participants always went first, leaving the real test subject for last.

On the first few rounds, the experiment seemed absurdly easy. But after choosing the correct match the first few times, the seven participants began to purposefully choose an incorrect line on a predetermined twelve of the eighteen trials. As person after person chose the same incorrect line, the real test subject faced a growing dilemma: should I trust my own perceptions or go along with what everyone else has said? Resisting peer pressure is hard when seven people have each said confidently that an answer that clearly seems wrong to you is right. And it turns out that, “Subjects went along with the bogus majority’s answers about a third of the time” (28). Keep that result in mind: about a third of the time, people would rather conform to a majority they are pretty sure is wrong than risk going out on a limb to support an unpopular position they are pretty sure is actually correct.

Milgram’s own obedience experiments were conducted starting in 1961 in New Haven, Connecticut. They were advertised as a study of memory. Participants were paid $4.50, which, adjusted for inflation, would be approximately $36 today (75). The money was paid at the beginning, and participants were assured that they could keep it regardless of what happened, so that they would not feel coerced to continue in order to be paid (77).

Citizens of New Haven who responded to the ad ranged from twenty to fifty years old (76). Similar to Asch’s conformity study, participants in Milgram’s Obedience Experiments were led to believe that there were two subjects being studied, when in fact everyone else was part of the experiment team. Both “participants” were asked to draw a slip of paper to determine their roles in the “memory study.” They were told that one slip said “Teacher” and the other “Learner.” However, both slips actually said “Teacher,” and the fake participant would pretend that his slip said “Learner” (78).

The Teacher was then led to a box that said “Shock Generator” with a series of switches that ran from 15 volts to 450 volts in 15-volt increments. There were additional labels that grouped every four switches into a progression that read from left to right, “Slight Shock, Moderate Shock, Strong Shock, Very Strong Shock, Intense Shock, Extreme Intensity Shock, Danger: Severe Shock.” The last two switches were simply and ominously labeled “XXX” (79).

The “Teacher” was instructed to administer a series of memory tests through a microphone to the “Learner,” who was seated in the next room out of sight behind a closed door. Whenever the “Learner” answered incorrectly, the “Teacher” was told to flip the next switch, administering a shock 15 volts higher each time. The machine did not actually shock the person in the next room, but it appeared to, through a realistic series of lights, meters, and buzzing sounds on the machine.

The preplanned script ensured a series of wrong answers. And the “Teacher” was instructed to continue administering shocks in the face of increasingly pitiful and painful pleas from the “Learner” that had been prerecorded and set to play on cue from the next room when various voltage switches were pressed. The “Teacher” was also given an actual shock of 45 volts at the beginning as a point of reference. The final shock label was 450 volts, ten times that initial amount. Disturbingly, “65 percent of the subjects continued to obey the experimenter to the end, simply because he commanded them to do so” (xvii). Many questioned whether they should continue, but when assured by the scientist in the white coat that it was vital to the experiment to continue, a significant majority set aside their conscience and better judgment and submitted to the instruction of the authority figure.

Milgram’s experiments have been repeated by other scientists with similar results (294-5). These studies consistently show:

with stunning clarity that ordinary individuals could be induced by an authority figure to act destructively, even in the absence of physical coercion, and that it didn’t take evil or aberrant individuals to carry out actions that were immoral and inhumane…. While one might think that when confronted with a moral dilemma we will act as our conscience dictates…[but] in a concrete situation containing powerful social pressures, our moral sense can readily get trampled underfoot. (xviii)

Prior to his Obedience Experiment, Milgram wrote that he did not think it would be possible—even given the much larger population of the United States—to find enough “moral imbeciles” to staff the number of death camps that were built in Germany during the Holocaust. After the Obedience Experiments, he wrote, “I am now beginning to think that the full complement could be recruited in New Haven. A substantial proportion of people do what they are told to do, irrespective of the content of the act, and without pangs of conscience, so long as they perceive that the command comes from a legitimate authority” (100).

The political theorist Hannah Arendt called this phenomenon “the banality of evil” in her book Eichmann in Jerusalem. In watching the trial of the Nazi leader Adolf Eichmann, she became convinced that Eichmann was not a uniquely “sadistic monster”; rather, the truth was much more disturbing: Eichmann was “an uninspired bureaucrat who simply sat at his desk and did his job” (269).

Relatedly, Nazi records have also shown, parallel to the psychological studies of conformity and obedience, that when orders were given to begin large-scale executions, in some cases fifty percent or so of soldiers would refuse to obey the orders at first, and would allow the other half to carry out the heinous acts. But over time, increasing numbers would participate, moving up toward ninety percent (Zimbardo 285-286).

From a more hopeful perspective, the opposite is also the case: one dissenter can inspire a cascade of rebellion against an unjust authority. But you need that crucial first person with the bravery to protest as well as the essential second and third person (and more!) to step up in solidarity in order to create a potential domino effect. Indeed, Milgram ran a variation in which there were three “Teachers” instead (two were actors). When the fake Teachers defied the experimenters and refused to continue—“one at 150 volts, the other at 210 volts—…90% of the naive subjects followed their example and dropped out at some point before [reaching 450 volts]…. No other variation Milgram conducted was as effective in undercutting the power of authority” as someone modeling dissent (108).

A related example is the TED Talk How to Start a Movement featuring “Shirtless Dancing Guy.” The video illustrates how:

      • The First Follower transforms the lone nut into a leader.

      • The Second Follower transforms two lone nuts into a crowd — and a crowd is news.

      • Momentum: as increasing numbers of people come forward, there is less and less risk of standing out or being ridiculed.

      • Tipping Point: At a certain point, if you don’t hurry, you will miss out on being part of the “in crowd” who were up there dancing first.

      • If you wait too long, you can be made fun of for not stepping up and dancing.

If there were more time, I would love to give you more details, as well as further findings, such as those from Philip Zimbardo’s Stanford Prison Experiment. (For details, I highly recommend Zimbardo’s book The Lucifer Effect: Understanding How Good People Turn Evil.) To limit myself to only a few highlights of this study: average citizens were recruited off the streets of Palo Alto, California in 1971 to be part of a two-week prison experiment. But the experiment was canceled prematurely after only a few days because it became too dangerous, volatile, and psychologically harmful. When divided at random into the roles of prisoners and guards, these average U.S. citizens—even knowing that it was all an experiment—began almost immediately to inhabit their roles in abusive and disturbing ways.

Zimbardo said that the most essential lesson that emerged from the prison experiment was the power of systems to influence our behavior. His important metaphor is that too often we blame “a few bad apples,” whereas the real problem is that “pretty good apples” are put into a “bad barrel” that causes them to spoil quickly. It is really the barrel—the system—that needs to change (Zimbardo 331).

Along these lines, one of the reasons I have chosen to be a part of the Unitarian Universalist movement is that we come from a long line of heretics who can help inspire the challenging of corrupt systems and unjust authorities. (You can likely think of examples from other traditions as well.) The word heretic simply means “to choose”—that is, to choose for oneself what to do or believe instead of what is chosen for you by someone else:

  • In 1553, our Unitarian forebear Michael Servetus was martyred—burned at the stake—for questioning the doctrine of the Trinity and the necessity of infant baptism, but his memory has been an inspiration to others ever since for courageously questioning unreasonable belief systems.
  • In 1863, our Universalist forebear Olympia Brown became the first woman to be ordained with full denominational authority: “She worked for women’s right to vote from girlhood and finally was able to cast a ballot in 1920 at age 85.”
  • In 1859, as part of the abolitionist movement against slavery in this country, of the Secret Six who helped fund and supply John Brown’s raid on the federal armory at Harpers Ferry, five were Unitarians, two of them Unitarian ministers.
  • During the Vietnam era, when Daniel Ellsberg was looking for someone to publish the Pentagon Papers, the only publisher brave enough was our UU Beacon Press. In retrospect, this choice was a courageous act of conscience and civil disobedience, but at the time Beacon Press found itself in “a spiral of two-and-a-half years of harassment, intimidation, near bankruptcy and the possibility of criminal prosecution.”

There are so many more examples I could give. But the lesson is that although there are individual evil acts, in the words of Bryan Stevenson, “Each of us is more than the worst thing we’ve ever done.” Moreover, for the most part, both good and evil acts are strongly shaped by the institutions and the systems in which we find ourselves. I know that being part of a progressive congregation locally challenges me to be better than I would be otherwise, to listen to my conscience more closely, and to be much more willing to make the first move against injustice — or to stand in solidarity with others who have made an act of resistance or protest.

As Dr. King said after studying the work of our UU forebear Henry David Thoreau, “I became convinced that noncooperation with evil is as much a moral obligation as is cooperation with good.”

So, stand up, speak up when you see someone violating “The inherent worth and dignity of every person.”

Stand up, speak up when your conscience is tugging at your heart—and you know you need to act.

Stand up, speak up that together we might build a world with peace, liberty, and justice—not just for some—but for all.

 

The Rev. Dr. Carl Gregg is a trained spiritual director, a D.Min. graduate of San Francisco Theological Seminary, and the minister of the Unitarian Universalist Congregation of Frederick, Maryland. Follow him on Facebook (facebook.com/carlgregg) and Twitter (@carlgregg).

Learn more about Unitarian Universalism: http://www.uua.org/beliefs/principles
