Reassessing Milgram’s Infamous Psychology Experiment

During the 1960s, American social psychologist Stanley Milgram carried out an experiment that would become one of the most infamous and influential in the history of psychology. Conducted within recent memory of the Holocaust — indeed, Adolf Eichmann’s high-profile war crimes trial was unfolding at the same time — Milgram’s study purportedly led to the disturbing conclusion that ordinary humans can be made to do evil things when commanded by an authority figure.

First, a breakdown of what the study entailed, courtesy of The Atlantic:

Over the next two years, hundreds of people showed up at Milgram’s lab for a learning and memory study that quickly turned into something else entirely. Under the watch of the experimenter, the volunteer—dubbed “the teacher”—would read out strings of words to his partner, “the learner,” who was hooked up to an electric-shock machine in the other room. Each time the learner made a mistake in repeating the words, the teacher was to deliver a shock of increasing intensity, starting at 15 volts (labeled “slight shock” on the machine) and going all the way up to 450 volts (“Danger: severe shock”). Some people, horrified at what they were being asked to do, stopped the experiment early, defying their supervisor’s urging to go on; others continued up to 450 volts, even as the learner [pleaded] for mercy, yelled a warning about his heart condition—and then fell alarmingly silent. In the most well-known variation of the experiment, a full 65 percent of people went all the way.

Until they emerged from the lab, the participants didn’t know that the shocks weren’t real, that the cries of pain were pre-recorded, and that the learner—railroad auditor Jim McDonough—was in on the whole thing, sitting alive and unharmed in the next room.

On its face, this finding seems to account for just about every major atrocity, especially those carried out within an institutional or organized context: the My Lai massacre during the Vietnam War, the abuse and torture of Iraqi prisoners at Abu Ghraib, and genocides from the Holocaust to Darfur.

All of these incidents — and the numerous unreported instances of government corruption, police brutality, and corporate malfeasance — involve a chain of command wherein perpetrators were directed to commit immoral acts they would not otherwise have committed on their own initiative. People have an instinct to follow anyone in a position of authority, even if it means violating the law or their own moral code.

Needless to say, the implications are disquieting. Does this mean that any of us could become a moral monster when told to by someone charismatic, persuasive, or powerful? Is every society liable to become a Nazi Germany if the cultural and political conditions are right? What does it say about human nature that we can be so easily corrupted?

Well, perhaps not much, for as The Atlantic goes on to note, there are quite a few reasons not to take Milgram’s disturbing results at face value.

“There’s a lot of dirty laundry in those archives,” said Arthur Miller, a professor emeritus of psychology at Miami University and [a] co-editor of the Journal of Social Issues. “Critics of Milgram seem to want to—and do—find material in these archives that makes Milgram look bad or unethical or, in some cases, a liar.”

One of the most vocal of those critics is Australian author and psychologist Gina Perry, who documented her experience tracking down Milgram’s research participants in her 2013 book Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments. Her project began as an effort to write about the experiments from the perspective of the participants—but when she went back through the archives to confirm some of their stories, she said, she found some glaring issues with Milgram’s data. Among her accusations: that the supervisors went off script in their prods to the teachers, that some of the volunteers were aware that the setup was a hoax, and that others weren’t debriefed on the whole thing until months later. “My main issue is that methodologically, there have been so many problems with Milgram’s research that we have to start re-examining the textbook descriptions of the research,” she said.

Granted, even with these flaws, psychologists and laymen alike would agree with the basic findings of Milgram’s work: our inherently social species is obedient to a fault and hews strongly to notions of duty and hierarchy. The few studies conducted since Milgram’s (ethical standards in research have since become far more rigorous, making a full recreation of the experiment legally and practically difficult) seem to confirm this unfortunate conclusion.

But this is beside the point, for the issue is not whether Milgram was wrong about human nature and our capacity to perpetrate moral crimes, but whether his findings are more nuanced than they appear.

In recent years, much of the attention has focused less on supporting or discrediting Milgram’s statistics and more on rethinking his conclusions. With a paper published earlier this month in the British Journal of Social Psychology, Matthew Hollander, a sociology Ph.D. candidate at the University of Wisconsin, is among the most recent to question Milgram’s notion of obedience. After analyzing the conversation patterns from audio recordings of 117 study participants, Hollander found that Milgram’s original classification of his subjects—either obedient or disobedient—failed to capture the true dynamics of the situation. Rather, he argued, people in both categories tried several different forms of protest; those who successfully ended the experiment early were simply better at resisting than the ones who continued shocking.

“Research subjects may say things like ‘I can’t do this anymore’ or ‘I’m not going to do this anymore,’” he said, even those who went all the way to 450 volts. “I understand those practices to be a way of trying to stop the experiment in a relatively aggressive, direct, and explicit way.”

It’s a far cry from Milgram’s idea that the capacity for evil lies dormant in everyone, ready to be awakened with the right set of circumstances. The ability to disobey toxic orders, Hollander said, is a skill that can be taught like any other—all a person needs to learn is what to say and how to say it.

In other words, people can just as easily be conditioned to behave ethically in defiance of legal and social pressure as the other way around. Humans may have a capacity for evil, but they have a capacity for good, too. Ideas, worldviews, cultural norms, and values all play a role. It is not as if every individual or society is equally and innately susceptible to being steered toward evil ends.

And as one analyst points out, in the case of Milgram’s study, the participants were indeed right to believe what the authorities told them:

[People] have learned that when experts tell them something is all right, it probably is, even if it does not seem so. (In fact, it is worth noting that in this case the experimenter was indeed correct: it was all right to continue giving the “shocks”—even though most of the subjects did not suspect the reason.)

Moreover, as one Cracked writer points out, Milgram apparently just fudged the results to fit his desired conclusion, which, given the cultural milieu of the time, found a receptive audience:

It turns out the 65 percent statistic only applies to one 40-person subgroup in an experiment involving 700 subjects. When you look at all his findings, you’ll see that only half the people involved thought the experiment was real. Of the people fooled, two-thirds refused to continue with the zap-attack. If you’re keeping track, that’s the exact opposite of the statistic Milgram published.

And it’s not just the results that are wrong: The whole experiment was more rigged than your student loan repayment plan. Milgram said that anyone who refused [to shock] would be classified as “disobedient,” but he started ignoring that rule as soon as his subjects stopped electrocuting dudes. In one presumably awkward instance, he had the actor playing the doctor give the “kill” order to one woman 26 times before giving up.
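To make the arithmetic behind that claim explicit, here is a rough tally, taking the quote’s round figures (roughly 700 subjects, half of them believers, and two-thirds of believers refusing) at face value:

\[
700 \times \tfrac{1}{2} = 350 \ \text{subjects who believed the shocks were real}
\]
\[
350 \times \tfrac{2}{3} \approx 233 \ \text{believers who refused to continue}, \qquad 350 - 233 \approx 117 \ \text{who obeyed fully}
\]

On those figures, only about a third of the believers, or roughly a sixth of everyone tested ($117/700 \approx 17\%$), both took the experiment to be real and obeyed to the end, a near inversion of the published 65 percent.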


So there are many reasons to question Milgram’s conclusions, from the flawed nature of the study to an inadequate explanation of its findings. But the picture grows more complicated as more recent research digs into the roots of the obedience problem.

A study cited in Nature, for example, hewed as closely to Milgram’s experiment as possible without crossing any ethical boundaries (it also exclusively involved women, including the experimenters). The results likewise showed an alarmingly high willingness among participants to obey instructions that would cause another person pain and discomfort.

This time, however, the researchers pinned the cause down to the “lack of agency” one feels when given orders; in other words, you feel less responsible and blameworthy for an action if someone above you told you to do it.

And unlike Milgram, the researchers focused not so much on the troubling human instinct to obey as on the way we treat leaders: “If people acting under orders can indeed feel reduced responsibility,” remarks one of the lead researchers, “society perhaps needs to hold people who give orders more strongly to account.” So even if the root of the problem lies in our “herd mentality,” the easiest solution may be not to change this innate characteristic but to structure our society and values so as to make leaders more accountable and better scrutinized. That may explain why most of the worst atrocities take place not in stable, liberal democracies but in conditions where law and order have broken down, power is vested in authoritarian rulers, and information is suppressed.

It remains to be seen what future research will reveal about this important aspect of human social psychology. It seems clear that we have some sort of natural disposition to organize ourselves in a hierarchical fashion, and that this makes us susceptible to the guidance and commands of those above us. Social order demands that we know our place and not rock the boat, even if it means following objectionable orders. There are no doubt ways to move past this instinct, as various individuals, institutions, and whole societies have shown, but we need to do a better job of understanding the complex intersection of social, psychological, and even evolutionary factors behind it.

Of course, I am a layman when it comes to psychology, and thus not qualified to assess the matter in any rigorous scientific way. Feel free to weigh in with your own thoughts, especially if you have a scientific background.
