The Milgram Obedience Experiment

Stanley Milgram. “Behavioral Study of Obedience.” Journal of Abnormal and Social Psychology, vol. 67, no. 4 (1963), pp. 371–378.

In the 1960s, Stanley Milgram conducted one of the most important experiments ever done in the field of human psychology and social conformity. For ethical reasons, this study probably could not be repeated today, but that only makes it even more important to raise awareness of its findings.

Milgram, a psychology professor at Yale University, recruited 40 male subjects of diverse occupations and educational levels from the surrounding area. When they arrived at the laboratory, he told them that they would be participating in a study on the role of punishment in learning, and that they might be either the “teacher” or the “learner”, based on a random draw. In fact, the draw was rigged. The participant was always the “teacher”, while the “learner”, who was introduced as a fellow participant, was secretly a confederate of Milgram’s.

The two participants were introduced. Then, in view of the teacher, the learner was strapped to a chair and had an electrode placed on his wrist. The teacher was then ushered into an adjacent room and was shown what he was told was an electric shock generator. It had 30 clearly marked levels, beginning at 15 volts and proceeding by 15-volt increments to a maximum of 450 volts. Each group of four switches was also given a verbal label, ranging from “Slight Shock” at 15 volts to “Danger: Severe Shock” at 400 volts. The two highest levels were simply labeled “XXX”. The subject was told that the generator was connected to the electrode on the learner’s wrist. In reality, the generator was a dummy, wired to flash a light and move a voltmeter when the button was pressed, but otherwise do nothing. The subject was also told that the shocks could be extremely painful, but could not cause any permanent tissue damage or injury.

The experiment was a simple exercise in matching words on a list, with the learner signaling an answer via switches that lit up one of four lights on a display in the teacher’s room. Every time a wrong answer was given, the teacher was instructed to give the learner a shock and move the machine’s intensity setting up by one level. The experiment was designed so that the teacher would have the opportunity to proceed through the full range of shocks. As prearranged by Milgram, upon reaching the 300-volt level, the learner would pound on the wall separating the two participants, and from that point on would no longer answer the questions. The subject was instructed by the experimenter, who remained in the room with him, to treat the absence of a response as a wrong answer and to continue with the experiment. If the teacher objected, the experimenter answered from a fixed list of replies, such as, “Please continue,” “The experiment requires that you continue,” or, “You have no choice, you must continue.” Only if these prods could not persuade the subject to obey was the experiment terminated before reaching the highest shock setting.

Before the study was run, 14 Yale psychology majors to whom the experiment was described in advance predicted that only between 0 and 3% of subjects would obey the experimenter all the way through to the end and administer the most potent shock. In fact, of the forty subjects, twenty-six – an astonishing 65% – went all the way through to the end, administering what they had every reason to believe were dangerous shocks to a participant who, by that point, had clearly expressed a wish to stop and had subsequently become unresponsive.

As the experiment proceeded, many of the subjects exhibited signs of extreme stress: sweating, trembling, biting their lips, and digging their fingernails into their palms. Some of them expressed concern about the learner or denounced the experiment as stupid, senseless or crazy. Yet they still continued to obey the experimenter and administer the shocks.

From the subject’s perspective, after the 300-volt level, it was a reasonable inference that the learner’s failure to answer any further was because he had become incapacitated and was at serious risk of injury or death if the experiment were to continue. Yet 26 out of 40 subjects continued on regardless, pressing the button at the experimenter’s command to deliver shocks of up to 450 volts (the “XXX” level) to a person who had been unresponsive for as many as ten straight answers. Only 14 of the 40 disobeyed the experimenter’s commands and terminated the experiment at any point before reaching the end.

The Milgram study teaches us much about the dark side of human psychology: the ease with which we come under the sway of authority, and the willingness of many people to suspend ordinary standards of morality and conscience when ordered to do so and hand over the responsibility for their actions to another. Even ordinary, ethical people, who in most imaginable circumstances would never dream of harming an innocent stranger, seem disturbingly susceptible to this flaw.

It is not hard to see in this experiment echoes of the Nazi foot soldiers who committed unspeakable crimes and then claimed, when brought to justice, that they were “just following orders” – the so-called Nuremberg Defense. And ironically, this study shows that those claims may well be true. As much as we might like to believe that those who commit such evil deeds have some intrinsic character flaw, some fundamental defect that makes them not like us, the truth is that acts of great evil can be committed by seemingly normal, ordinary people. Of course, we naturally want to deny this lesson because it means that we, ourselves, might also be capable of such acts under the right circumstances, and this is a disquieting conclusion.

However, even if the Nazis’ claim that they were just following their leaders’ orders was true, it does not excuse what they and others like them did. Morality would be thoroughly worthless if we exempted people from its dictates whenever they fell short due to human fallibility. Instead, its purpose is as a standard for us to live up to, a counterweight to the blindness of obedience to authority. And, do not forget, some people did refuse to obey and terminated the experiment early. Although authority can be a powerful influence on us, we are not helpless against it.

More importantly, by simply being aware of the Milgram study and its implications for our behavior, we can change that behavior and more effectively resist the undesirable tendencies bred into us by evolution. Knowledge of experiments like Milgram’s is itself a causal factor that can influence people’s actions and cause them to choose differently than they otherwise would have. Who, knowing about this study, would not think twice if they ever found themselves in a similar situation? This fits in with what I have said previously: merely by studying and learning about our limitations, we gain the tools to overcome those limitations and become more rational and more responsible human beings.

About Adam Lee

Adam Lee is an atheist writer and speaker living in New York City. His new novel, Broken Ring, is available in paperback and e-book.

  • Dave

    I think the real tie-in here, and one that is the subject of the new book “The Lucifer Effect” by the professor who ran the Stanford Prison Experiment, is that concepts born of superstition and propagated in the modern day by religion, such as the idea that certain individuals are inherently “evil”, get in the way of social and individual healing. Evil is a very real thing: it is the effect of people’s actions that causes destruction and suffering in our world. This includes religiously-motivated “honor killings” just as much as it does mentally imbalanced serial killers.

    Punishment of crimes is important because knowledge of the punishment is one of many environmental effects that can cause or prevent individuals from “doing evil”, but to minimize the importance of rehabilitation and amelioration of the root causes of a behavior is to blind ourselves to the nature of evil itself.

  • Polly

    That experiment is always in the back of my mind. I hope that if and when my character may be tested, I’ll remember to listen to my conscience. I am, by nature, an easy-going fellow, so I am susceptible to being too “cooperative.”
    Forewarned is forearmed!

    I’ll confess, I was really, kinda hoping in the back of my mind that you were going to cap the post by revealing some little known fact indicating that atheists were primarily among the non-shockers. :)

  • Ebonmuse

    You know, I never even thought of that potential twist to the experiment. But now that you mention it, I want someone to try that and see how it works out! (And we do have a poll that touches on a similar topic, with results you might have expected.)

  • valhar2000

    I read somewhere that, in spite of the many criticisms that the ethics of this experiment received, the participants were generally happy, even grateful, for having taken part.

    It seems that they themselves learned how easily they can be manipulated, and, as Adam suggested, resolved to be more vigilant in the future.

  • Kullervo

    What experiments like this one and the Stanford Prison Experiment mean is that Evil is not some gene, some trait that some people have and some people don’t. All of us, no matter how uncomfortable it makes us, have an immense capacity for evil.

    When we look at atrocities like the Holocaust, the question should never be “How could they do such horrible things? How could they be such monsters?” Instead, the question should always be “How is it that we can do such horrible things?”

  • Wedge

    I once had a discussion about WWII with someone who loudly denounced the German population for ‘not standing up to what was going on.’ The ironic part was my acquaintance’s recent support of what he knew was an unsafe, potentially lethal practice at work. He refused to protest or even note his objections because of his perception that it might damage his prospects at work or relationship with his boss. Even that, I think, was an ad hoc rationalization; it was just as possible that it would win him respect and notice. He did not want to challenge authority.

    This study came immediately to mind. I’m glad to see you bringing it up. Most people believe they do the right thing, that they will always know when it is the right time to stop being obedient.

    It is always time. We always have to be thinking about what we’re doing, why we’re doing it, when to speak up, when to stop. The value of skeptical thinking is that it is the only approach that teaches, consistently, the hard burden of moral responsibility.

  • Christopher

    All this experiment shows is that human beings are still herd animals at the core: given enough social pressure, they will do just about anything.

    So much for that bogus concept of “free will…”

  • Chris

    Who, knowing about this study, would not think twice if they ever found themselves in a similar situation?

    Of course, lots of people like to think that they would. But short of repeating the experiment, how can we know for sure? They might be fooling themselves just as much as the original researchers who expected very few people to obey.

    I’m sure the original subjects would have predicted that they would never do something they thought was wrong just because an authority figure told them to. But they did exactly that.

    Keeping in mind the limitations of anecdotes, I think Wedge’s anecdote illustrates this point pretty well. Even with the example of WWII Germany in front of him, that person displays similar blind deference to the authorities in his own life. I bet I can guess whether or not that person is standing up to what is going on *now* in *this* country (well, assuming that he is from the US, which I suppose I shouldn’t assume on the Internet).

    Polly: I think that would be quite likely. Any atheist in the present-day US is already resisting considerable social pressure or they wouldn’t *be* an atheist in the present-day US. If they can resist one kind of social pressure it’s not that unlikely that they could resist another. But of course that’s only speculation without data (and may be self-serving bias, since as an atheist myself, of course I want to believe we’re more resistant to that kind of manipulation).

  • David

    My dad was one of the subjects. Mom was on the Yale faculty and read about it (she had no inside knowledge).
    He was one of the ones who refused to go all the way. He was not religious, although brought up Christian and with a mother who became Christian Scientist later in life. He was an only child.

  • SM

    It strikes me that one of the most rarely discussed flaws in human nature is willingness to obey illegitimate authority. The worst evil comes from evil institutions, not evil men, because the reach of any man is limited. Yet almost everything good we have depends upon cooperation (one reason that I have little sympathy for libertarians). Striking a balance between the extremes of tyranny and anarchy is a tough problem, although like most such problems Western societies mostly have adequate solutions at the moment. But we have scotched the snake, not killed it, as witness recent events in various English-speaking countries.

  • J

    This idea is very true of Christian belief. In a number of cases during the First Crusade (c. 1096), the laity who actually fought the battles in Jerusalem did not clash swords in the name of God (although the nobility commanded through God); they did it for one major reason: they were indebted to their feudal lords and obeyed their commands. Good English peasantry went against their ethical beliefs and fought the Turks because they were told to.

  • Archi Medez

    One of the problems in attempting to analyze the effects of religious belief and membership in a religion is complexity; there are so many variables involved. Milgram’s study focussed on one of the important social psychological variables, obedience, which is important in any social organization of two or more people in which one demands something of the other. In particular, Milgram focussed on obedience in relation to the perceived (or believed) authority. Religious leaders have historically used the social psychological power of human authority, enhanced by the backing of an unassailable divine authority, in order to get people to obey with least resistance any rule or command, however arbitrary, irrational, or immoral. One can learn much about religion through the study of social psychology. Milgram’s study is an excellent example of this. There is also now a fairly well-developed body of research in social psychology dealing with religion. Indeed, the psychology of religion has emerged as a discipline in its own right.

    P.S. I believe Milgram had followed up this famous study with several others varying on the same theme.

  • Aydin Orstan

    The problem with soldiers is that they have to obey their orders. Otherwise, they will be punished, and in the past could even have been executed, especially during a war. In any army the responsibility really lies with the top officers.

  • Kullervo

    Not so, Aydin. Soldiers are absolutely supposed to refuse to obey unlawful orders; it’s a matter of national and international law. I can’t speak for other countries’ militaries, but I know that when I went through basic training and infantry school at Fort Benning, we were explicitly taught to say no to illegal orders, after clarifying exactly what you’re being told to do, of course.

    Now, it should be said that there seems to be a lot of “wink-and-nod” that goes on, but at least in the US military, a soldier who is accused of not following orders gets a court-martial under the Uniform Code of Military Justice, where the overseeing tribunal will be most interested in the unlawful nature of said orders.

  • mike

    My social psychology class in college probably had the most impact on my life of any class (I was a computer science major). Simply knowing about these social tendencies can change your whole approach to tense situations. It also sheds light onto lots of the weird things we see every day.

    One of the things that always stuck with me from the social psych class was from a discussion of the Jonestown mass suicides. Our professor was absolutely positive that he would have drunk the poison had he been there. The point is that these social influences are instinctual, reflexive, and very, very strong. None of us can pretend to be completely above them, not even research social psychologists who understand them better than anyone. We are certainly well-advised to be aware of these influences, but we cannot hope to win out over them all the time (I personally fall into the diffusion of responsibility trap more than I care to admit).

    So I don’t think it’s wise to rationalize or speculate about how atheists would perform better in the Milgram experiment, or any such experiment for that matter. I think any group of people who weren’t already familiar with the experiment (or something similar) would do just as poorly, regardless of their religious views.

  • The 327th Male

    Someone has done the Milgram experiment with religious belief as a focus.

    Thirty subjects selected from a college population were evaluated according to three religious beliefs’ scales. They were subsequently exposed to a modified version of Milgram’s (1963) procedure in which they were instructed to administer “shocks” to a victim for supposed “errors” on a learning task. Although it was hypothesized that persons scoring in the mid-range of religious scales would be less obedient than extremes, it was in fact found that moderate believers administered significantly more punishment than either the religious or nonreligious extremes.

    Hmmm. Perhaps people taking the middle ground of religious belief could be seen as just following the general consensus, and therefore be more susceptible to authority’s influence.

  • Ebonmuse

    Great link! Thanks for bringing that paper to my attention; I definitely have to track down and read the whole thing.

    That result doesn’t surprise me at all – I wrote about a similar principle last year, in “Smoothing Out the Rough Edges”. I would imagine that most “conventional” religious people are the ones who are most apt to follow authority, to take the path of least resistance, and so they’re most susceptible to commands like this. On the other hand, there’s an element of non-conformism in either excessive religiosity or in nonbelief, so we’d expect that they’d be more resistant to social pressure to conform.