This program explores psychologists’ attempts to understand human behavior within its broader social context. It also examines how beliefs and behavior can be influenced and manipulated by other people and subtle situational forces.
THE BELIEVER: I take it that one of the goals of the Stanford Prison Experiment was to build on Milgram’s results that demonstrated the power of situational elements. Is that right?
PHILIP ZIMBARDO: It was really to broaden his message and put it to a higher-level test. In Milgram’s study, we don’t know about those thousand people who answered the ad. His subjects were not Yale students, although he did it at Yale. They were a thousand ordinary citizens from New Haven and Bridgeport, Connecticut, ages twenty to fifty, and in his advertisement in the newspaper he said: college students and high-school students cannot be used. It could have been a selection of people who were more psychopathic. For our study, we picked only two dozen of seventy-five who applied, who on seven different personality tests were normal or average. So we knew there were no psychopaths, no deviants. Nobody had been in therapy, and even though it was a drug era, nobody (at least in the reports) had taken anything more than marijuana, and they were physically healthy at the time. So the question was: Suppose you had only kids who were normally healthy, psychologically and physically, and they knew they would be going into a prison-like environment and that some of their civil rights would be sacrificed. Would those good people, put in that bad, evil place—would their goodness triumph?
That situationist snippet should convince you to check out the rest of the interview! It is also worth pointing out that Sommers has a forthcoming collection entitled A Very Bad Wizard: Morality Behind the Curtain, which includes past interviews with philosophers and psychologists such as Galen Strawson, Michael Ruse, Jon Haidt, Frans de Waal, Steve Stich, Josh Greene, Liane Young, Joe Henrich, William Ian Miller, and Zimbardo. So, make sure to check it out as well once it comes out.
Situationist Contributor Philip Zimbardo has authored the preface to a new edition of social psychologist Stanley Milgram’s seminal book Obedience to Authority. This is the second of a two-part series derived from that preface. In Part I of the post, Zimbardo describes the inculcation of obedience and Milgram’s role as a research pioneer. In this part, Zimbardo answers challenges to Milgram’s work and locates its legacy.
* * *
Unfortunately, many psychologists, students, and lay people who believe that they know the “Milgram Shock” study, know only one version of it, most likely from seeing his influential movie Obedience or reading a textbook summary.
He has been challenged for using only male participants, which was true initially, but later he replicated his findings with females. He has been challenged for relying only on Yale students, because the first studies were conducted at Yale University. However, the Milgram obedience research covers nineteen separate experimental versions, involving about a thousand participants, ages twenty to fifty, of whom none are college or high school students! His research has been heavily criticized for being unethical by creating a situation that generated much distress for the person playing the role of the teacher believing his shocks were causing suffering to the person in the role of the learner. I believe that it was seeing his movie, in which he includes scenes of distress and indecision among his participants, that fostered the initial impetus for concern about the ethics of his research. Reading his research articles or his book does not convey as vividly the stress of participants who continued to obey authority despite the apparent suffering they were causing their innocent victims. I raise this issue not to argue for or against the ethicality of this research, but rather to raise the issue that it is still critical to read the original presentations of his ideas, methods, results, and discussions to understand fully what he did. That is another virtue of this collection of Milgram’s obedience research.
A few words about how I view this body of research. First, it is the most representative and generalizable research in social psychology or social sciences due to his large sample size, systematic variations, use of a diverse body of ordinary people from two small towns—New Haven and Bridgeport, Connecticut—and detailed presentation of methodological features. Further, its replications across many cultures and time periods reveal its robust effectiveness.
As the most significant demonstration of the power of social situations to influence human behavior, Milgram’s experiments are at the core of the situationist view of behavioral determinants. It is a study of the failure of most people to resist unjust authority when commands no longer make sense given the seemingly reasonable stated intentions of the just authority who began the study. It makes sense that psychological researchers would care about the judicious use of punishment as a means to improve learning and memory. However, it makes no sense to continue to administer increasingly painful shocks to one’s learner after he insists on quitting, complains of a heart condition, and then, after 330 volts, stops responding at all. How could you be helping improve his memory when he was unconscious or worse? The most minimal exercise of critical thinking at that stage in the series should have resulted in virtually everyone refusing to go on, disobeying this now heartlessly unjust authority. To the contrary, most who had gone that far were trapped in what Milgram calls the “agentic state.”
These ordinary adults were reduced to mindless obedient school children who do not know how to exit from a most unpleasant situation until teacher gives them permission to do so. At that critical juncture when their shocks might have caused a serious medical problem, did any of them simply get out of their chairs and go into the next room to check on the victim? Before answering, consider the next question, which I posed directly to Stanley Milgram: “After the final 450 volt switch was thrown, how many of the participant-teachers spontaneously got out of their seats and went to inquire about the condition of their learner?” Milgram’s answer: “Not one, not ever!” So there is a continuity into adulthood of that grade-school mentality of obedience to those primitive rules of doing nothing until the teacher-authority allows it, permits it, and orders it.
My research on situational power (the Stanford Prison Experiment) complements that of Milgram in several ways. They are the bookends of situationism: his representing direct power of authority on individuals, mine representing institutional indirect power over all those within its power domain. Mine has come to represent the power of systems to create and maintain situations of dominance and control over individual behavior. In addition, both are dramatic demonstrations of powerful external influences on human action, with lessons that are readily apparent to the reader, and to the viewer. (I too have a movie, Quiet Rage, that has proven to be quite impactful on audiences around the world.) Both raise basic issues about the ethics of any research that engenders some degree of suffering and guilt from participants. I discuss at considerable length my views on the ethics of such research in my recent book The Lucifer Effect: Understanding Why Good People Turn Evil (2008). When I first presented a brief overview of the Stanford Prison Experiment at the annual convention of the American Psychological Association in 1971, Milgram greeted me joyfully, saying that now I would take some of the ethics heat off his shoulders by doing an even more unethical study!
Finally, it may be of some passing interest to readers of this book that Stanley Milgram and I were classmates at James Monroe High School in the Bronx (class of 1950), where we enjoyed a good time together. He was the smartest kid in the class, getting all the academic awards at graduation, while I was the most popular kid, being elected by senior class vote to be “Jimmie Monroe.” Little Stanley later told me, when we met ten years later at Yale University, that he wished he had been the most popular, and I confided that I wished I had been the smartest. We each did what we could with the cards dealt us. I had many interesting discussions with Stanley over the decades that followed, and we almost wrote a social psychology text together. Sadly, in 1984 he died prematurely from a heart attack at the age of fifty-one.
[Milgram] left us with a vital legacy of brilliant ideas that began with those centered on obedience to authority and extended into many new realms—urban psychology, the small-world problem, six degrees of separation, and the Cyrano effect, among others—always using a creative mix of methods. Stanley Milgram was a keen observer of the human landscape, with an eye ever open for a new paradigm that might expose old truths or raise new awareness of hidden operating principles. I often wonder what new phenomena Stanley would be studying now were he still alive.
* * *
To read Part I of this post, click here. To read three related Situationist posts by Phil Zimbardo, see “The Situation of Evil,” Part I, Part II, and Part III.
On April 28, 2004, four years ago, our nation, and the world, was shocked by the revelation of the abuse and torture of Iraqi prisoners by American soldiers. More surprising than the fact of the abuse, for soldiers often abuse their enemies in wartime, was the nature of the “trophy photos.” Both male and female Military Police posed smilingly, giving high fives over a pyramid of naked detainees; dragging some around on dog leashes; and forcing others into sexually degrading poses. From that digitally documented depravity emerged an iconic image of torture: a helpless prisoner standing on a cardboard box, head hooded, electrodes attached to his fingers, fearing that when his body weakened and he fell off the stress box, he would electrocute himself.
Recall that the immediate response of the top military command and the Bush civilian command was to pronounce these acts the work of a “few rogue soldiers,” the moral failures of a few “Bad Apples.” General Richard Myers, head of the Joint Chiefs of Staff, added in his televised interview that he was certain such abuses were not “systemic,” but should be blamed entirely on the immorality of those few culprits. Donald Rumsfeld told the Senate Armed Services Committee, “These abuses happened on my watch. As Secretary of Defense, I am fully responsible.” Without a full-scale investigation it was not possible at that time to determine whether such abuse was limited to Tier 1A of Abu Ghraib, or was in fact more widespread. The statements were simply urgent damage control to protect the reputation of America’s military and Bush’s war on terrorism. Rumsfeld’s acknowledgment of responsibility did not extend to a recognition of accountability or personal liability, for him or subsequently for any senior military staff: i.e., those who should have borne Command Responsibility for abuses of which they should have been aware, inflicted by their subordinates nightly over three long months.
Indeed, the bad apple refrain is played over and over again whenever there is a scandal in our police departments, prisons, military, or corporate worlds. Such attribution of evil deeds to the moral disposition of those who commit them fueled the feverish search for infidels during the decades of the Inquisition in many Catholic nations around the world. Focusing entirely on personal defects in the make-up of the culprits ignores the contextual circumstances in which the abuses occurred. A proper understanding of any complex human behavior involves examination of the dynamic interplay between what the actor brings into the behavioral setting and what the social-situational forces operating in that setting bring out of the actor. Moreover, the crucial question that must be asked is: what is the nature of the system of power that creates, maintains, and justifies situations that produce evil behavior? Answering it also involves an awareness that the line between good and evil is not fixed, but sufficiently permeable to allow ordinary, even good, people to cross over and do really bad deeds at a given time in a particular setting.
As “Superintendent” of the mock Stanford Prison that I created as a simulation to be used in an experiment, I witnessed college student participants, purposely selected “good apples,” become corrupted by the situational forces operating in the “bad barrel” that I had designed. Normal, healthy students role-playing guards quickly began to abuse their prisoners, so much so that many role-playing prisoners suffering acute, extreme stress reactions had to be released. Our planned two-week study had to be terminated after only six days because the whole situation was spinning out of control. Sensing a similar scenario at work in the Abu Ghraib prison, I accepted the task of being an expert witness for the defense of the Army Reserve sergeant in charge of the MP battalion on the night shift of Tier 1A. As such, I had access to all the investigative reports issued by high-ranking generals and civilian officials, access to the infamous disks that were filled with a thousand trophy photos, and also direct access to the soldier himself by means of personal interviews, psychological testing, and a detailed investigation of his background. After reviewing all this material as well as interviewing military criminal investigators and knowledgeable officers, I concluded in the testimony I gave at his military trial that, although he was guilty as charged, the severity of his sentence should be mitigated both because of his exemplary character and because of the horrendous situational circumstances in which he and his buddies were forced to work and live.
Before his dungeon tour of duty Chip Frederick had been a remarkably patriotic, honored soldier, and a model citizen. The psych evaluation by a military psychologist concluded that he was normal on all measures; there was no evidence of any sadistic tendencies. Hardly the stuff of a bad apple.
But the situation in which he had to work could not have been worse. The prison was filthy and chaotic. It was subject to frequent blackouts and under almost constant bombardment. Prisoners were attempting to escape. There were no established standard operating procedures, and there was never any oversight or surveillance by officers. This young soldier was forced to work 12-hour shifts, 7 days a week for 40 consecutive nights without a break. When the unexpected insurgency burst out in Fall 2003, large scale arrests of suspected Iraqi men and boys swelled the prison population on Tier 1A from 200 to 1000 prisoners, without increase in the number of the 9 guards under his charge—none of whom had any mission-specific training. The massive assault of negative social psychological forces that had been at work in the Stanford mock Prison was overwhelmingly present in that all too real military prison.
How is the System implicated in these abuses? Tier 1A was under the control of Military Intelligence, to be used as their interrogation site. The CIA backed up that control, and civilian contract interrogators also conducted interrogations there. When their interrogations failed to elicit “actionable intelligence” (because most detainees had none to give), these authorities pressured the Army Reserve MPs to help them “soften up” the detainees, to “take the gloves off,” to do whatever was necessary to get them to spill the beans. Through the intentional absence of surveillance by senior officers during the night shift, coupled with praise of the MPs for breaking prisoners who did talk and the assurance of deniability for any specific abuses, the System that operated that dungeon provided an open-ended license for torture and abuse.
Many of the official investigative reports indict the system for the failure or absence of leadership, for conflicting leadership, and for the recruitment of these untrained MPs to abuse prisoners. The report by Brig. Gen. Antonio Taguba went further to identify specific officers whom he found guilty of dereliction of duty. Because he openly blamed the system, General Taguba was forced to resign prematurely; essentially he was fired for doing his job too conscientiously.
Chip’s sentence: dishonorable discharge, an 8-year prison term, and forfeiture of 22 years of retirement income; he was also stripped of the 9 medals and awards he had earned. Charles Graner got 10 years and Lynndie England 3 years; there were lesser sentences for the other MPs staffing that little shop of horrors in Tier 1A. Blame was therefore deflected onto the grunts, enabling the big shots running any system to get away with murder and allowing the vital message to go unheard: that behavioral contexts breeding abuse, inhumanity, and criminal action must be changed. Errol Morris’s film, “Standard Operating Procedure,” released on the anniversary of the exposure of abuse at Abu Ghraib, confirms my thesis.
Such transgressions do not occur where military discipline is clear and oversight is practiced, where there is censure for violations and praise for honorable conduct. Rather, most systems of governance are veiled in secrecy; transparency is their enemy. Evil is a slippery slope that starts with small transgressions and escalates gradually as human character is transformed by the power of social situations, while most good people observe and do nothing, thereby becoming guilty of the evil of inaction.
* * *
For a collection of posts by or about Situationist contributor, Phil Zimbardo, click here. For other Situationist posts discussing The Lucifer Effect, click here. To buy the paperback version of the book, click here.