The Situationist

Posts Tagged ‘Stanley Milgram’

Group Influence

Posted by The Situationist Staff on June 2, 2011

From the instructional video series Psychology: The Human Experience:

This video on group influence explains individuality, group behavior, and deindividuation.


Posted in Classic Experiments, Conflict, History, Ideology, Morality, Social Psychology, Video | 1 Comment »

Video on the Original Milgram Experiment

Posted by The Situationist Staff on April 26, 2010

From Wikipedia:

The Milgram experiment on obedience to authority figures was a series of social psychology experiments conducted by Yale University psychologist Stanley Milgram, which measured the willingness of study participants to obey an authority figure who instructed them to perform acts that conflicted with their personal conscience. Milgram first described his research in 1963 in an article published in the Journal of Abnormal and Social Psychology, and later discussed his findings in greater depth in his 1974 book, Obedience to Authority: An Experimental View.

The experiments began in July 1961, three months after the start of the trial of German Nazi war criminal Adolf Eichmann in Jerusalem. Milgram devised his psychological study to answer the question: “Was it that Eichmann and his accomplices in the Holocaust had mutual intent, at least with regard to the goals of the Holocaust?” In other words, “Was there a mutual sense of morality among those involved?” Milgram’s testing suggested that it could have been that the millions of accomplices were merely following orders, despite violating their deepest moral beliefs.

After the jump, you can watch an outstanding video, including some original footage, about the experiment.  (We’ve placed it after the jump, because it plays automatically.)

Posted in Choice Myth, Classic Experiments, Social Psychology, Video | 5 Comments »

The Situation of Stanley Milgram’s Obedience Experiments

Posted by The Situationist Staff on April 3, 2010

Nestar John Charles Russell is publishing an article, titled “Milgram’s obedience to authority experiments: Origins and early evolution.”  Here’s the abstract.

Stanley Milgram’s Obedience to Authority experiments remain one of the most inspired contributions in the field of social psychology. Although Milgram undertook more than 20 experimental variations, his most (in)famous result was the first official trial run–the remote condition and its 65% completion rate. Drawing on many unpublished documents from Milgram’s personal archive at Yale University, this article traces the historical origins and early evolution of the obedience experiments. Part 1 presents the previous experiences that led to Milgram’s conception of his rudimentary research idea and then details the role of his intuition in its refinement. Part 2 traces the conversion of Milgram’s evolving idea into a reality, paying particular attention to his application of the exploratory method of discovery during several pilot studies. Both parts illuminate Milgram’s ad hoc introduction of various manipulative techniques and subtle tension-resolving refinements. The procedural adjustments continued until Milgram was confident that the first official experiment would produce a high completion rate, a result contrary to expectations of people’s behaviour. Showing how Milgram conceived of, then arrived at, this first official result is important because the insights gained may help others to determine theoretically why so many participants completed this experiment.

* * *

You can download the article for free here. (Thanks to Situationist friend, Brandon Weiss, for sending us this link.)

For a sample of related Situationist posts, see “Milgram Replicated on French TV – ‘The Game of Death’,” “A Shocking Situation,” “Zimbardo on Milgram and Obedience – Part I,” “The Case for Obedience,” “Replicating Milgram’s Obedience Experiment – Yet Again,” “Jonestown (The Situation of Evil) Revisited,” “Milgram Remake,” and “The Milgram Experiment Today?”

Posted in Abstracts, Classic Experiments, Social Psychology | 2 Comments »

Virtual Milgram

Posted by The Situationist Staff on May 14, 2009

From ICT Results:

Despite advances in computer graphics, few people would think virtual characters or objects are real. Yet placed in a virtual reality environment most people will interact with them as if they are really there. European researchers are finding out why.

In trying to understand presence – the propensity of humans to respond to fake stimuli as if they are real – the researchers are not just gaining insights into how the human brain functions. They are also learning how to create more intense and realistic virtual experiences, opening the door to myriad applications for healthcare, training, social research and entertainment.

“Virtual environments could be used by psychiatrists to help people overcome anxiety disorders and phobias . . . by researchers to study social behaviour not practically or ethically reproduced in the real world, or to create more immersive virtual reality for entertainment,” explains Mel Slater, a computer scientist at ICREA in Barcelona and University College, London, who led the team behind the research.

Working in the EU-funded Presenccia project, Slater and his team, drawn from fields as diverse as neuroscience, psychology, psychophysics, mechanical engineering and philosophy, conducted a variety of experiments to understand why humans interpret and respond to virtual stimuli the way they do and how those experiences can be made more intense.

For one experiment they developed a virtual bar, which test subjects enter by donning a virtual reality (VR) headset or immersing themselves in a VR CAVE in which stereo images are projected onto the walls. As the virtual patrons socialise, drink and dance, a fire breaks out. Sometimes the virtual characters ignore it, sometimes they flee in panic. That in turn dictates how the real test subjects, immersed in the virtual environment, respond.

Panic and pain . . . virtually

“We have had people literally run out of the VR room, even though they know that what they are witnessing is not real,” says Slater. “They take their cues from the other characters.”

In another instance, the researchers re-enacted controversial experiments conducted by American social psychologist Stanley Milgram in the 1960s that showed people’s propensity to follow orders even if they know what they are doing is wrong. Instead of using a real actor, as Milgram did, the Presenccia team used a virtual character to which the test subject was instructed to give progressively more intense electric shocks whenever it answered questions incorrectly. The howls of pain and protest from the character, a virtual woman, increased as the experiment went on.

“Some of the test subjects felt so uncomfortable that they actually stopped participating and left the VR environment. Around half said they wanted to leave, but said they did not because they kept telling themselves it wasn’t real,” Slater says.

All had physical reactions, measured by their skin conductivity, perspiration and heart rate, showing that, at a subconscious level, people’s responses are similar regardless of whether what they are experiencing is real or virtual. The plausibility of the events enhances the sense that what is happening is real. Plausibility, Slater says, is therefore more important to presence than the quality of the graphics in a VR environment.

For example, when a test subject was made to stand on the edge of a virtual pit, staring down at an 18-metre drop, their level of anxiety increased if they could see dynamically changing shadows and reflections of their virtual body even if the graphics were poor. In other experiments, the researchers made people believe that a virtual hand was their own – replicating in VR the so-called “rubber hand illusion” – or that they were looking at themselves from another angle, creating a kind of out-of-body experience. In one trial, they even gave male test subjects a woman’s body.

Help with phobias and paranoia

By understanding what makes people perceive virtual objects and experiences to be real, the researchers hope to create applications that could revolutionise certain psychiatric treatments. Patients with a fear of spiders or heights, for example, could be exposed to and helped to overcome their fears in virtual reality. Similarly people who are shy or paranoid about public speaking could be helped by having to face virtual people and crowds.

“One application we are working on is designed to help shy men overcome their fear of meeting women by making them interact with a virtual woman,” Slater says.

The technology is also being used for social research which, much like the Milgram experiments, would not be practical or ethical to conduct in the real world. One experiment due to be run at University College, London, will use a virtual environment to study how people respond to violence in public places, such as a bar fight between football hooligans.

Besides healthcare and research, more immersive VR would also help in training, potentially greatly improving the results of flight or driving simulators. Slater also envisions VR environments being used to train people to use prosthetic limbs and wheelchairs through mind control before trying them out in the real world. A brain-computer interface (BCI) developed for just such a purpose was tested in the Presenccia project and in a similarly named predecessor called Presencia, which received funding under the EU’s Sixth and Fifth Framework Programmes for research, respectively.

* * *

Though immersive VR is likely to have many applications in healthcare, research and training, the biggest market is probably entertainment. With the cost of VR technology coming down, people could eventually be exploring virtual worlds and interacting with virtual characters and other people through VR rooms in their homes akin to the “holodecks” seen in Star Trek, Slater says.

* * *

You can read the story and watch a related video at ICT Results, here.  To read some related Situationist posts, see “Virtual Worlds, Learning, and Virtual Milgram,” “The Positive Situation of Crowds,” “Virtual Bias,” “Are Video Games Addictive?,” and “Resident Evil 5 and Racism in Video Games.”

Posted in Classic Experiments, Illusions, Video | 1 Comment »

The Situational Effect of Groups

Posted by The Situationist Staff on April 17, 2009

In his Guardian article, “Hands up if you’re an individual,” Stuart Jeffries offers a brief summary of some social psychology classics.  Below, we have included excerpts.  After reviewing Milgram’s famous experiments on obedience, Jeffries writes:

* * *

This was one of the classic experiments of group psychology, though not all have involved duping volunteers into believing they had electrocuted victims. Group psychology has often involved experiments to explain how individuals’ behaviours, thoughts and feelings are changed by group pressures.

It is generally thought to have originated in 1898 when Indiana University psychologist Norman Triplett asked children to spin a fishing reel as fast as they could. He found that when the children were doing the task together they did so much faster than when alone. Triplett found a similar result when studying cyclists – they tended to record faster times when riding in groups rather than alone, a fact that he explained because the “bodily presence of another contestant participating simultaneously in the race serves to liberate latent energy not ordinarily available”.

More than a century later, social psychology explores how other people make us what we are; how unconscious, sometimes ugly, impulses make us compliant and irrational. Why, for example, do I smoke even though I know it could be fatal? How can there be such a gap between my self-image and my behaviour (this is known as cognitive dissonance)?

Why do high-level committees of supposed experts make disastrous decisions (for example, when a Nasa committee dismissed technical staff warnings that the space shuttle Challenger should not be launched, arguing that technical staff were just the kind of people to make such warnings – this is seen as a classic case of so-called “groupthink”)?

Why do we unconsciously obey others even when this undermines our self-images (this is known as social influence)? What makes us into apathetic bystanders when we see someone attacked in the street – and what makes us have-a-go heroes? What makes peaceful crowds turn into rioting mobs?

Group psychological studies can have disturbing ramifications. Recently, Harvard psychologist [and Situationist contributor] Mahzarin Banaji used the so-called implicit association test to demonstrate how unconscious beliefs inform our behaviour. [Sh]e concluded from [her] research that the vast majority of white, and many black respondents recognised negative words such as “angry”, “criminal” or “poor” more quickly after briefly seeing a black face than a white one. . . .

* * *

The nature of conformism has obsessed social psychologists for decades. In 1951, psychologist Solomon Asch did an experiment in which volunteers were asked to judge the correct length of a line by comparing it with three sample lines. The experiment was set up so that there was an obviously correct answer. But Asch had riddled a group with a majority of stooges who deliberately chose the wrong answer. The pressure of the majority told on Asch’s volunteers. He found that 74% conformed with the wrong answer at least once, and 32% did so all the time.

What impulses were behind such conformism? Social psychologists have long considered that we construct our identities on the basis of others’ attitudes towards us. Erving Goffman, in The Presentation of Self in Everyday Life (1959), analysed social encounters as if each person was engaged in a dramatic performance, and suggested that each such actor was a creation of its audience.

Through such performances of self we internalise role expectations and gain positive self-esteem. We cast other individuals and groups in certain roles. Such behaviour may make some of us unconscious racists, but it also lubricates the wheels of social life.

French psychologist Serge Moscovici developed what is called social representation theory, arguing that shared beliefs and explanations held by a group or society help people to communicate effectively with one another. He explored the notion of anchoring, whereby new ideas or events in social life are given comforting redescriptions (or social representations). For example, a group of protesters against a motorway might be described demeaningly by the road lobby as a “rent-a-mob,” while the protesters themselves might anchor themselves more flatteringly as “eco-warriors”.

* * *

Social psychologists have also been long-obsessed by the psychology of crowds. In 1895, French social psychologist Gustave le Bon described crowds as mobs in which individuals lost their personal consciences. His book, The Crowd: A Study of the Popular Mind, influenced Hitler and led many later psychologists to take a dim view of crowds.

After the war, German critical theorist Theodor Adorno wrote of the destructive nature of “group psychology.” Even as late as 1969, Stanford psychologist [and Situationist contributor] Philip Zimbardo argued that a process of deindividuation makes participants in crowds less rational.

Most recent crowd psychology has not been content to brand crowds necessarily irrational. Instead, it has divided into contagion theory (whereby crowds cause people to act in a certain way), convergence theory (where crowds amount to a convergence of already like-minded individuals) and emergent norm theory (where crowd behaviour reflects the desires of participants, but it is also guided by norms that emerge as the situation unfolds). . . .

In the age of MySpace, Facebook and online dating, group psychologists are now trying to find out what goes on when we present ourselves to the world online, how we are judged for doing so and how groups are formed online. Other social psychology touches on such voguish areas of research as social physics (which contends that physical laws might explain group behaviour) and neuroeconomics (which looks at the role of the brain when we evaluate decisions and interact with each other), but the age-old concerns remain part of our zeitgeist.

* * *

You can read the entire article here.  For a sample of Situationist posts examining the interaction of individuals and groups, see “The Situational Benefits of Outsiders,” “Racism Meets Groupism and Teamism,” “‘Us’ and ‘Them,’” “The Maverickiness Paradox,” “Four Failures of Deliberating Groups – Abstract,” “Team-Interested Decision Making,” “History of Groupthink,” “Some (Interior) Situational Sources of War – Part I,” and “March Madness.”  To read some of the previous Situationist posts describing or discussing classic experiments from social psychology and related fields, click here.

Posted in Choice Myth, Classic Experiments, Conflict, Implicit Associations, Situationist Contributors, Social Psychology | 1 Comment »

Zimbardo on Milgram and Obedience – Part II

Posted by The Situationist Staff on April 16, 2009

Milgram's StudentSituationist Contributer Philip Zimbardo has authored the preface to a new edition of social psychologist Stanley Milgram’s seminal book Obedience to Authority. This is the second of a two-part series derived from that preface. In Part I of the post, Zimbardo describes the inculcation of obedience and Milgram’s role as a research pioneer. In this part, Zimbardo answers challenges to Milgram’s work and locates its legacy.

* * *

Unfortunately, many psychologists, students, and lay people who believe that they know the “Milgram Shock” study, know only one version of it, most likely from seeing his influential movie Obedience or reading a textbook summary.

He has been challenged for using only male participants, which was true initially, but later he replicated his findings with females. He has been challenged for relying only on Yale students, because the first studies were conducted at Yale University. However, the Milgram obedience research covers nineteen separate experimental versions, involving about a thousand participants, ages twenty to fifty, of whom none are college or high school students! His research has been heavily criticized for being unethical by creating a situation that generated much distress for the person playing the role of the teacher believing his shocks were causing suffering to the person in the role of the learner. I believe that it was seeing his movie, in which he includes scenes of distress and indecision among his participants, that fostered the initial impetus for concern about the ethics of his research. Reading his research articles or his book does not convey as vividly the stress of participants who continued to obey authority despite the apparent suffering they were causing their innocent victims. I raise this issue not to argue for or against the ethicality of this research, but rather to raise the issue that it is still critical to read the original presentations of his ideas, methods, results, and discussions to understand fully what he did. That is another virtue of this collection of Milgram’s obedience research.

A few words about how I view this body of research. First, it is the most representative and generalizable research in social psychology or social sciences due to his large sample size, systematic variations, use of a diverse body of ordinary people from two small towns—New Haven and Bridgeport, Connecticut—and detailed presentation of methodological features. Further, its replications across many cultures and time periods reveal its robust effectiveness.

As the most significant demonstration of the power of social situations to influence human behavior, Milgram’s experiments are at the core of the situationist view of behavioral determinants. It is a study of the failure of most people to resist unjust authority when commands no longer make sense given the seemingly reasonable stated intentions of the just authority who began the study. It makes sense that psychological researchers would care about the judicious use of punishment as a means to improve learning and memory. However, it makes no sense to continue to administer increasingly painful shocks to one’s learner after he insists on quitting, complains of a heart condition, and then, after 330 volts, stops responding at all. How could you be helping improve his memory when he was unconscious or worse? The most minimal exercise of critical thinking at that stage in the series should have resulted in virtually everyone refusing to go on, disobeying this now heartlessly unjust authority. To the contrary, most who had gone that far were trapped in what Milgram calls the “agentic state.”

These ordinary adults were reduced to mindless obedient school children who do not know how to exit from a most unpleasant situation until teacher gives them permission to do so. At that critical juncture when their shocks might have caused a serious medical problem, did any of them simply get out of their chairs and go into the next room to check on the victim? Before answering, consider the next question, which I posed directly to Stanley Milgram: “After the final 450 volt switch was thrown, how many of the participant-teachers spontaneously got out of their seats and went to inquire about the condition of their learner?” Milgram’s answer: “Not one, not ever!” So there is a continuity into adulthood of that grade-school mentality of obedience to those primitive rules of doing nothing until the teacher-authority allows it, permits it, and orders it.

My research on situational power (the Stanford Prison Experiment) complements that of Milgram in several ways. They are the bookends of situationism: his representing direct power of authority on individuals, mine representing institutional indirect power over all those within its power domain. Mine has come to represent the power of systems to create and maintain situations of dominance and control over individual behavior. In addition, both are dramatic demonstrations of powerful external influences on human action, with lessons that are readily apparent to the reader, and to the viewer. (I too have a movie, Quiet Rage, that has proven to be quite impactful on audiences around the world.) Both raise basic issues about the ethics of any research that engenders some degree of suffering and guilt from participants. I discuss at considerable length my views on the ethics of such research in my recent book The Lucifer Effect: Understanding Why Good People Turn Evil (2008). When I first presented a brief overview of the Stanford Prison Experiment at the annual convention of the American Psychological Association in 1971, Milgram greeted me joyfully, saying that now I would take some of the ethics heat off his shoulders by doing an even more unethical study!

Finally, it may be of some passing interest to readers of this book, that Stanley Milgram and I were classmates at James Monroe High School in the Bronx (class of 1950), where we enjoyed a good time together. He was the smartest kid in the class, getting all the academic awards at graduation, while I was the most popular kid, being elected by senior class vote to be “Jimmie Monroe.” Little Stanley later told me, when we met ten years later at Yale University, that he wished he had been the most popular, and I confided that I wished I had been the smartest. We each did what we could with the cards dealt us. I had many interesting discussions with Stanley over the decades that followed, and we almost wrote a social psychology text together. Sadly, in 1984 he died prematurely from a heart attack at the age of fifty-one.

[Milgram] left us with a vital legacy of brilliant ideas that began with those centered on obedience to authority and extended into many new realms—urban psychology, the small-world problem, six degrees of separation, and the Cyrano effect, among others—always using a creative mix of methods. Stanley Milgram was a keen observer of the human landscape, with an eye ever open for a new paradigm that might expose old truths or raise new awareness of hidden operating principles. I often wonder what new phenomena Stanley would be studying now were he still alive.

* * *

To read Part I of this post, click here.  To read three related Situationist posts by Phil Zimbardo, see “The Situation of Evil,” Part I, Part II, and Part III.

Posted in Book, Classic Experiments, Social Psychology | 4 Comments »

Zimbardo on Milgram and Obedience – Part I

Posted by The Situationist Staff on April 14, 2009

Situationist contributor Philip Zimbardo has authored the preface to a new edition of social psychologist Stanley Milgram’s pathbreaking and now-classic book Obedience to Authority.  This is the first of a two-part series derived from that preface.  In this post, Zimbardo describes the inculcation of obedience and Milgram’s role as a research pioneer.  In Part II, Zimbardo answers challenges to Milgram’s work and locates its legacy.

* * *

What is common about two of the most profound narratives in Western culture—Lucifer’s descent into Hell and Adam and Eve’s loss of Paradise—is the lesson of the dreadful consequences of one’s failure to obey authority. . . [T]hey are designed, as all parables are, to send a powerful message to all those who hear and read them: Obey authority at all costs! The consequences of disobedience to authority are formidable and damnable. Once created, these myths and parables get passed along by subsequent authorities, now parents, teachers, bosses, politicians, and dictators, among others, who want their word to be followed without dissent or challenge.

Thus, as school children, in virtually all traditional educational settings, the rules of law that we learned and lived were: Stay in your seat until permission is granted by the teacher to stand and leave it; do not talk unless given permission by the teacher to do so after having raised your hand to seek that recognition, and do not challenge the word of the teacher or complain. So deeply ingrained are these rules of conduct that even as we age and mature they generalize across many settings as permanent placards of our respect for authority. However, not all authority is just, fair, moral, and legal, and we are never given any explicit training in recognizing that critical difference between just and unjust authority.  The just one deserves respect and some obedience, maybe even without much questioning, while the unjust variety should arouse suspicion and distress, ultimately triggering acts of challenge, defiance, and revolution.

Stanley Milgram’s series of experiments on obedience to authority, so clearly and fully presented in this new edition of his work, represents some of the most significant investigations in all the social sciences of the central dynamics of this aspect of human nature. His work was the first to bring into the controlled setting of an experimental laboratory an investigation into the nature of obedience to authority. In a sense, he is following in the tradition of Kurt Lewin, although he is not generally considered to be in the Lewinian tradition, as Leon Festinger, Stanley Schachter, Lee Ross, and Richard Nisbett are, for example. Yet to study phenomena that have significance in their real world existence within the constraints and controls of a laboratory setting is at the essence of one of Lewin’s dictums of the way social psychology should proceed.

This exploration of obedience was initially motivated by Milgram’s reflections on the ease with which the German people obeyed Nazi authority in discriminating against Jews and, eventually, in allowing Hitler’s Final Solution to be enacted during the Holocaust. As a young Jewish man, he wondered if the Holocaust could be recreated in his own country, despite the many differences in those cultures and historical epochs. Though many said it could never happen in the United States, Milgram doubted whether we should be so sure. Believing in the goodness of people does not diminish the fact that ordinary, even once good people, just following orders, have committed much evil in the world.

British author C. P. Snow reminds us that more crimes against humanity have been committed in the name of obedience than disobedience. Milgram’s mentor, Solomon Asch, had earlier demonstrated the power of groups to sway the judgments of intelligent college students regarding false conceptions of visual reality. But that influence was indirect, creating a discrepancy between the group norm and the individual’s perception of the same stimulus event.

Conformity to the group’s false norm was the resolution to that discrepancy, with participants behaving in ways that would lead to group acceptance rather than rejection. Milgram wanted to discover the direct and immediate impact of one powerful individual’s commands to another person to behave in ways that challenged his or her conscience and morality. He designed his research paradigm to pit our general beliefs about what people would do in such a situation against what they actually did when immersed in that crucible of human nature.

* * *

We’ll post Part II of this series later this week.  You can review a sizeable collection of Situationist posts discussing the work of Stanley Milgram here.

Posted in Book, Classic Experiments, Conflict, Social Psychology | 4 Comments »

The Positive Situation of Crowds

Posted by The Situationist Staff on March 8, 2009

The Economist has an interesting piece on the psychology of crowds.  We excerpt the piece below.

* * *

One researcher who is interested in this approach is Mark Levine, a social psychologist at Lancaster University in Britain who studies crowds. Crowds have a bad press. They have been blamed for antisocial behaviour through mechanisms that include peer pressure, mass hysteria and the diffusion of responsibility—the idea that “someone else will do something, so I don’t have to”. But Dr Levine thinks that crowds can also diffuse potentially violent situations and that crime would be much higher if it were not for crowds. As he told a symposium called “Understanding Violence,” which was organised by the Ecole Polytechnique Fédérale de Lausanne in Switzerland earlier this month, he has been using CCTV data to examine the bystander effect, an alleged phenomenon whereby people who would help a stranger in distress if they were alone, fail to do so in the presence of others. His conclusion is that it ain’t so. In fact, he thinks, having a crowd around often makes things better.

* * *

Dr Levine talks of a “collective choreography” of violence, in which the crowd determines the outcome as much as the protagonist and the target do, and he is now taking his ideas into the laboratory. In collaboration with Mel Slater, a computer scientist at University College, London, he is looking at the responses of bystanders to violence recreated in virtual reality.

Dr Slater has pioneered this approach, since people seem to react to virtual reality as they do to real life, but no one gets hurt and conditions can be controlled precisely. Because the participants know it is not real, many of the ethical obstacles to placing them in such situations are removed. But Dr Slater proved the tool’s usefulness in 2006, when he used it to recreate a famous experiment conducted in the 1960s by Stanley Milgram, an American psychologist. Milgram showed that ordinary people would obey orders to the point of delivering potentially lethal electric shocks to strangers—an experiment that, even though nobody really received any shocks, would be ruled out today, on ethical grounds. Dr Slater’s volunteers behaved similarly to Milgram’s.

* * *

For the rest of the piece, click here.  For related Situationist pieces on Stanley Milgram, click here.

Posted in Life | 2 Comments »

Disobedience at 150 volts

Posted by The Situationist Staff on December 26, 2008

For our many readers interested in the Milgram obedience experiments, Dominic J. Packer published a valuable paper, “Identifying Systematic Disobedience in Milgram’s Obedience Experiments: A Meta-Analytic Review” (3 Perspectives on Psychol. Sci. 3-1 (2008)).  Here’s the abstract.

* * *

A meta-analysis of data from eight of Milgram’s obedience experiments reveals previously undocumented systematicity in the behavior of disobedient participants. In all studies, disobedience was most likely at 150 v, the point at which the shocked “learner” first requested to be released. Further illustrating the importance of the 150-v point, obedience rates across studies covaried with rates of disobedience at 150 v, but not at any other point; as obedience decreased, disobedience at 150 v increased. In contrast, disobedience was not associated with the learner’s escalating expressions of pain. This analysis identifies a critical decision point in the obedience paradigm and suggests that disobedient participants perceived the learner’s right to terminate the experiment as overriding the experimenter’s orders, a finding with potential implications for the treatment of prisoners.

* * *
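To make the shape of Packer’s covariation analysis concrete, here is a minimal sketch in Python. The per-condition rates below are made-up illustrative numbers, not Packer’s or Milgram’s actual data; only the structure of the computation follows the abstract.

    from statistics import correlation  # Python 3.10+

    # Hypothetical rates, one pair per experimental variation (illustrative only).
    overall_obedience = [0.65, 0.50, 0.40, 0.30, 0.625, 0.475, 0.20, 0.10]
    refusals_at_150v = [0.25, 0.35, 0.45, 0.55, 0.30, 0.40, 0.60, 0.70]

    # Packer's finding in miniature: across variations, lower overall obedience
    # goes with a larger share of refusals clustered at the 150-volt point,
    # the moment the learner first asks to be released.
    r = correlation(overall_obedience, refusals_at_150v)
    print(f"r = {r:.2f}")  # strongly negative for these illustrative numbers

The point is only that the correlation is computed across studies, pairing each study’s overall obedience rate with its rate of disobedience at a single voltage point.

* * *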

To download a pdf draft version of the article, click here. For a collection of Situationist posts discussing Stanley Milgram’s obedience experiments, click here.

Posted in Classic Experiments, Social Psychology | 3 Comments »

A Shocking Situation

Posted by The Situationist Staff on December 22, 2008

Lisa M. Krieger recently published a nice summary of Jerry Burger’s replications of Milgram’s obedience experiment.  Her article in the San Jose Mercury News is titled “Shocking Revelation: Santa Clara University Professor Mirrors Famous Torture Study.”  Here are some excerpts.

* * *

Replicating one of the most controversial behavioral experiments in history, a Santa Clara University psychologist has found that people will follow orders from an authority figure to administer what they believe are painful electric shocks.

More than two-thirds of volunteers in the research study had to be stopped from administering 150 volt shocks of electricity, despite hearing a person’s cries of pain, professor Jerry M. Burger concluded in a study published in the January issue of the journal American Psychologist.

“In a dramatic way, it illustrates that under certain circumstances people will act in very surprising and disturbing ways,” said Burger.

The study, using paid volunteers from the South Bay, is similar to the famous 1974 “obedience study” by the late Yale University psychologist Stanley Milgram. In the wake of Nazi war criminal Adolf Eichmann’s trial, Milgram was troubled by the willingness of people to obey authorities — even if it conflicted with their own conscience.

* * *

The subjects — recruited in ads in the Mercury News, Craigslist and fliers distributed in libraries and community centers in Santa Clara, Cupertino and Sunnyvale — thought they were testing the effect of punishment on learning. “They were average citizens, a typical cross-section of people that you’d see around every day,” said Burger.

* * *

Burger found that 70 percent of the participants had to be stopped from escalating shocks over 150 volts, despite hearing cries of protest and pain. Decades earlier, Milgram found that 82.5 percent of participants continued administering shocks. Of those, 79 percent continued to the shock generator’s end, at 450 volts.

Burger’s experiment did not go that far.

“The conclusion is not: ‘Gosh, isn’t this a horrible commentary on human nature,’ or ‘these people were so sadistic,’” said Burger.

“It shows the opposite — that there are situational forces that have a much greater impact on our behavior than most people recognize,” he said.

The experiment shows that people are more likely to comply with instructions if the task starts small, then escalates, according to Burger.

“For instance, the suicides at Jonestown were just the last step of many,” he said. “Jim Jones started small, asking people to donate time and money, then looked for more and more commitment.”

Additionally, the volunteers confronted a novel situation — having never before been in such a setting, they had no idea of how they were supposed to act, he said.

Finally, they had been told that they should not feel responsible for inflicting pain; rather, the “instructor” was accountable. “Lack of feeling responsible can lead people to act in ways that they might otherwise not,” said Burger.

“When we see people acting out of character, the first thing we should ask is: ‘What’s going on in this situation?’”

* * *

To read the entire article, click here.

* * *
To read a previous Situationist post about Professor Burger’s research, see “The Milgram Experiment Today?”  For Situationist posts about the Jonestown massacre, see “Jonestown (The Situation of Evil) Revisited.”  For a collection of Situationist posts discussing Stanley Milgram’s obedience experiments, click here.

Posted in Classic Experiments, Social Psychology, Video | 1 Comment »

Jonestown (The Situation of Evil) Revisited

Posted by Philip Zimbardo on November 17, 2008

With the 30th Anniversary of the Jonestown Mass Suicide upon us, now is a good time to republish the three-part Situationist series from 2007 on the “Situational Sources of Evil” — published also in the January/February 2007 edition of the Yale Alumni Magazine and based on my book, The Lucifer Effect: Understanding How Good People Turn Evil (Random House, March 2007).

* * *

Imagine that you have responded to an advertisement in the New Haven newspaper seeking subjects for a study of memory. A researcher whose serious demeanor and laboratory coat convey scientific importance greets you and another applicant at your arrival at a Yale laboratory in Linsly-Chittenden Hall. You are here to help science find ways to improve people’s learning and memory through the use of punishment. The researcher tells you why this work may have important consequences. The task is straightforward: one of you will be the “teacher” who gives the “learner” a set of word pairings to memorize. During the test, the teacher will give each key word, and the learner must respond with the correct association. When the learner is right, the teacher gives a verbal reward, such as “Good” or “That’s right.” When the learner is wrong, the teacher is to press a lever on an impressive-looking apparatus that delivers an immediate shock to punish the error.

The shock generator has 30 switches, starting from a low level of 15 volts and increasing by 15 volts to each higher level. The experimenter tells you that every time the learner makes a mistake, you have to press the next switch. The control panel shows both the voltage of each switch and a description. The tenth level (150 volts) is “Strong Shock”; the 17th level (255 volts) is “Intense Shock”; the 25th level (375 volts) is “Danger, Severe Shock.” At the 29th and 30th levels (435 and 450 volts) the control panel is marked simply with an ominous XXX: the pornography of ultimate pain and power.
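As an aside for readers keeping track of the numbers, here is a small illustrative sketch in Python (ours, not part of Milgram’s materials) of the panel’s arithmetic: thirty switches in 15-volt steps, with the labels quoted above attached to the levels the text mentions.

    # Level-to-voltage mapping: level 1 -> 15 V, ..., level 30 -> 450 V.
    labels = {
        10: "Strong Shock",
        17: "Intense Shock",
        25: "Danger, Severe Shock",
        29: "XXX",
        30: "XXX",
    }

    for level in range(1, 31):
        volts = 15 * level
        print(f"level {level:2d}: {volts:3d} V  {labels.get(level, '')}")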

You and another volunteer draw straws to see who will play each role; you are to be the teacher, and the other volunteer will be the learner. He is a mild-mannered, middle-aged man whom you help escort to the next chamber. “Okay, now we are going to set up the learner so he can get some punishment,” the experimenter tells you both. The learner’s arms are strapped down and an electrode is attached to his right wrist. The generator in the next room will deliver the shocks. The two of you communicate over an intercom, with the experimenter standing next to you. You get a sample shock of 45 volts — the third level, a slight tingly pain — so you have a sense of what the shock levels mean. The researcher then signals you to start.

Initially, your pupil does well, but soon he begins making errors, and you start pressing the shock switches. He complains that the shocks are starting to hurt. You look at the experimenter, who nods to continue. As the shock levels increase in intensity, so do the learner’s screams, saying he does not think he wants to continue. You hesitate and question whether you should go on. But the experimenter insists that you have no choice.

In 1949, seated next to me in senior class at James Monroe High School in the Bronx, New York, was my classmate, Stanley Milgram. We were both skinny kids, full of ambition and a desire to make something of ourselves, so that we might escape life in the confines of our ghetto experience. Stanley was the little smart one who we went to for authoritative answers. I was the tall popular one, the smiling guy other kids would go to for social advice.

I had just returned to Monroe High from a horrible year at North Hollywood High School, where I had been shunned and friendless (because, as I later learned, there was a rumor circulating that I was from a New York Sicilian Mafia family). Back at Monroe, I would be chosen “Jimmy Monroe” — most popular boy in Monroe High School’s senior class. Stanley and I once discussed how that transformation could happen. We agreed that I had not changed; the situation was what mattered.

Situational psychology is the study of the human response to features of our social environment, the external behavioral context, above all to the other people around us. Stanley Milgram and I, budding situationists in 1949, both went on to become academic social psychologists. We met again at Yale in 1960 as beginning assistant professors — him starting out at Yale, me at NYU. Some of Milgram’s new research was conducted in a modified laboratory that I had fabricated a few years earlier as a graduate student — in the basement of Linsly-Chittenden, the building where we taught Introductory Psychology courses. That is where Milgram was to conduct his classic and controversial experiments on blind obedience to authority.

Milgram’s interest in the problem of obedience came from deep personal concerns about how readily the Nazis had obediently killed Jews during the Holocaust. His laboratory paradigm, he wrote years later, “gave scientific expression to a more general concern about authority, a concern forced upon members of my generation, in particular upon Jews such as myself, by the atrocities of World War II.”

As Milgram described it, he hit upon the concept for his experiment while musing about a study in which one of his professors, Solomon Asch, had tested how far subjects would conform to the judgment of a group. Asch had put each subject in a group of coached confederates and asked every member, one by one, to compare a set of lines in order of length. When the confederates all started giving the same obviously false answers, 70 percent of the subjects agreed with them at least some of the time.

Milgram wondered whether there was a way to craft a conformity experiment that would be “more humanly significant” than judgments about line length. He wrote later: “I wondered whether groups could pressure a person into performing an act whose human import was more readily apparent; perhaps behaving aggressively toward another person, say by administering increasingly severe shocks to him. But to study the group effect . . . you’d have to know how the subject performed without any group pressure. At that instant, my thought shifted, zeroing in on this experimental control. Just how far would a person go under the experimenter’s orders?”

How far up the scale do you predict that you would go under those orders? Put yourself back in the basement with the fake shock apparatus and the other “volunteer” — actually the experimenter’s confederate, who always plays the learner because the “drawing” is rigged — strapped down in the next room. As the shocks proceed, the learner begins complaining about his heart condition. You dissent, but the experimenter still insists that you continue. The learner makes errors galore. You plead with your pupil to concentrate; you don’t want to hurt him. But your concerns and motivational messages are to no avail. He gets the answers wrong again and again. As the shocks intensify, he shouts out, “I can’t stand the pain, let me out of here!” Then he says to the experimenter, “You have no right to keep me here!” Another level up, he screams, “I absolutely refuse to answer any more! You can’t hold me here! My heart’s bothering me!”

Obviously you want nothing more to do with this experiment. You tell the experimenter that you refuse to continue. You are not the kind of person who harms other people in this way. You want out. But the experimenter continues to insist that you go on. He reminds you of the contract, of your agreement to participate fully. Moreover, he claims responsibility for the consequences of your shocking actions. After you press the 300-volt switch, you read the next keyword, but the learner doesn’t answer. “He’s not responding,” you tell the experimenter. You want him to go into the other room and check on the learner to see if he is all right. The experimenter is impassive; he is not going to check on the learner. Instead he tells you, “If the learner doesn’t answer in a reasonable time, about five seconds, consider it wrong,” since errors of omission must be punished in the same way as errors of commission — that is a rule.

As you continue up to even more dangerous shock levels, there is no sound coming from your pupil’s shock chamber. He may be unconscious or worse. You are truly disturbed and want to quit, but nothing you say works to get your exit from this unexpectedly distressing situation. You are told to follow the rules and keep posing the test items and shocking the errors.

Now try to imagine fully what your participation as the teacher would be. If you actually go all the way to the last of the shock levels, the experimenter will insist that you repeat that XXX switch two more times. I am sure you are saying, “No way would I ever go all the way!” Obviously, you would have dissented, then disobeyed and just walked out. You would never sell out your morality. Right?

Milgram once described his shock experiment to a group of 40 psychiatrists and asked them to estimate the percentage of American citizens who would go to each of the 30 levels in the experiment. On average, they predicted that less than 1 percent would go all the way to the end, that only sadists would engage in such sadistic behavior, and that most people would drop out at the tenth level of 150 volts. They could not have been more wrong.

In Milgram’s experiment, two of every three (65 percent) of the volunteers went all the way up to the maximum shock level of 450 volts. The vast majority of people shocked the victim over and over again despite his increasingly desperate pleas to stop. Most participants dissented from time to time and said they did not want to go on, but the researcher would prod them to continue.

Over the course of a year, Milgram carried out 19 different experiments, each one a different variation of the basic paradigm. In each of these studies he varied one social psychological variable and observed its impact. In one study, he added women; in others he varied the physical proximity or remoteness of either the experimenter-teacher link or the teacher-learner link; had peers rebel or obey before the teacher had the chance to begin; and more.

In one set of experiments, Milgram wanted to show that his results were not due to the authority power of Yale University. So he transplanted his laboratory to a run-down office building in downtown Bridgeport, Connecticut, and repeated the experiment as a project ostensibly of a private research firm with no connection to Yale. It made hardly any difference; the participants fell under the same spell of this situational power.

The data clearly revealed the extreme pliability of human nature: depending on the situation, almost everyone could be totally obedient or almost everyone could resist authority pressures. Milgram was able to demonstrate that compliance rates could soar to over 90 percent of people continuing to the 450-volt maximum or be reduced to less than 10 percent — by introducing just one crucial variable into the compliance recipe.

Want maximum obedience? Make the subject a member of a “teaching team,” in which the job of pulling the shock lever to punish the victim is given to another person (a confederate), while the subject assists with other parts of the procedure. Want resistance to authority pressures? Provide social models — peers who rebel. Participants also refused to deliver the shocks if the learner said he wanted to be shocked; that’s masochistic, and they are not sadists. They were also reluctant to give high levels of shock when the experimenter filled in as the learner. They were more likely to shock when the learner was remote than in proximity.

In each of the other variations on this diverse range of ordinary American citizens, of widely varying ages and occupations and of both genders, it was possible to elicit low, medium, or high levels of compliant obedience with a flick of the situational switch. Milgram’s large sample — a thousand ordinary citizens from varied backgrounds — makes the results of his obedience studies among the most generalizable in all the social sciences. His classic study has been replicated and extended by many other researchers in many countries.

Recently, Thomas Blass of the University of Maryland-Baltimore County [author of The Man Who Shocked The World and creator of the terrific website StanleyMilgram.Com] analyzed the rates of obedience in eight studies conducted in the United States and nine replications in European, African, and Asian countries. He found comparably high levels of compliance in all. The 61 percent mean obedience rate found in the U.S. was matched by the 66 percent rate found across all the other national samples. The degree of obedience was not affected by the timing of the studies, which ranged from 1963 to 1985.

Other studies based on Milgram’s have shown how powerful the obedience effect can be when legitimate authorities exercise their power within their power domains. In one study, most college students administered shocks to whimpering puppies when required to do so by a professor. In another, all but one of 22 nurses flouted their hospital’s procedure by obeying a phone order from an unknown doctor to administer an excessive amount of a drug (actually a placebo); that solitary disobedient nurse should have been given a raise and a hero’s medal. In still another, a group of 20 high school students joined a history teacher’s supposed authoritarian political movement, and within a week had expelled their fellows from class and recruited nearly 200 others from around the school to the cause.

Now we ask the question that must be posed of all such research: what is its external validity, what are real-world parallels to the laboratory demonstration of authority power?

In 1963, the social philosopher Hannah Arendt published what was to become a classic of our times, Eichmann in Jerusalem: A Report on the Banality of Evil. She provides a detailed analysis of the war crimes trial of Adolf Eichmann, the Nazi figure who personally arranged for the murder of millions of Jews. Eichmann’s defense of his actions was similar to the testimony of other Nazi leaders: “I was only following orders.” What is most striking in Arendt’s account of Eichmann is all the ways in which he seemed absolutely ordinary: half a dozen psychiatrists had certified him as “normal.” Arendt’s famous conclusion: “The trouble with Eichmann was precisely that so many were like him, and that the many were neither perverted nor sadistic, that they were, and still are, terribly and terrifyingly normal.”

Arendt’s phrase “the banality of evil” continues to resonate because genocide has been unleashed around the world and torture and terrorism continue to be common features of our global landscape. A few years ago, the sociologist and Brazil expert Martha Huggins, the Greek psychologist and torture expert Mika Haritos-Fatouros, and I interviewed several dozen torturers. These men did their daily dirty deeds for years in Brazil as policemen, sanctioned by the government to get confessions by torturing “subversive” enemies of the state.

The systematic torture by men of their fellow men and women represents one of the darkest sides of human nature. Surely, my colleagues and I reasoned, here was a place where dispositional evil would be manifest. The torturers shared a common enemy: men, women, and children who, though citizens of their state, even neighbors, were declared by “the System” to be threats to the country’s national security — as socialists and Communists. Some had to be eliminated efficiently, while others, who might hold secret information, had to be made to yield it up by torture, confess and then be killed.

Torture always involves a personal relationship; it is essential for the torturer to understand what kind of torture to employ, what intensity of torture to use on a certain person at a certain time. Wrong kind or too little — no confession. Too much — the victim dies before confessing. In either case, the torturer fails to deliver the goods and incurs the wrath of the senior officers. Learning to determine the right kind and degree of torture that yields up the desired information elicits abounding rewards and flowing praise from one’s superiors. It took time and emerging insights into human weaknesses for these torturers to become adept at their craft.

What kind of men could do such deeds? Did they need to rely on sadistic impulses and a history of sociopathic life experiences to rip and tear the flesh of fellow beings day in and day out for years on end?

We found that sadists are selected out of the training process by trainers because they are not controllable. They get off on the pleasure of inflicting pain, and thus do not sustain the focus on the goal of extracting confessions. From all the evidence we could muster, torturers were not unusual or deviant in any way prior to practicing their new roles, nor were there any persisting deviant tendencies or pathologies among any of them in the years following their work as torturers and executioners. Their transformation was entirely explainable as being the consequence of a number of situational and systemic factors, such as the training they were given to play this new role; their group camaraderie; acceptance of the national security ideology; and their learned belief in socialists and Communists as enemies of their state.

Amazingly, the transformation of these men into violence workers is comparable to the transformation of young Palestinians into suicide bombers intent on killing innocent Israeli civilians. In a recent study, the forensic psychiatrist Marc Sageman [at the Solomon Asch Center] found evidence of the normalcy of 400 al-Qaeda members. Three-quarters came from the upper or middle class. Ninety percent came from caring, intact families. Two-thirds had gone to college; two-thirds were married; and most had children and jobs in science and engineering. In many ways, Sageman concludes, “these are the best and brightest of their society.”

Israeli psychologist Ariel Merari, who has studied this phenomenon extensively for many years, outlines the common steps on the path to these explosive deaths. First, senior members of an extremist group identify young people who, based on their declarations at a public rally against Israel or their support of some Islamic cause or Palestinian action, appear to have an intense patriotic fervor. Next, they are invited to discuss how seriously they love their country and hate Israel. They are asked to commit to being trained. Those who do then become part of a small secret cell of three to five youths. From their elders, they learn bomb making, disguise, and selecting and timing targets. Finally, they make public their private commitment by making a videotape, declaring themselves to be “the living martyr” for Islam. The recruits are also told the Big Lie: their relatives will be entitled to a high place in Heaven, and they themselves will earn a place beside Allah. Of course, the rhetoric of dehumanization serves to deny the humanity and innocence of their victims.

The die is cast; their minds have been carefully prepared to do what is ordinarily unthinkable. In these systematic ways a host of normal, angry young men and women become transformed into true believers. The suicide, the murder, of any young person is a gash in the fabric of the human family that we elders from every nation must unite to prevent. To encourage the sacrifice of youth for the sake of advancing the ideologies of the old must be considered a form of evil that transcends local politics and expedient strategies.

Our final extension of the social psychology of evil from artificial laboratory experiments to real-world contexts comes to us from the jungles of Guyana. There, on November 18, 1978, an American religious leader persuaded more than 900 of his followers to commit mass suicide. In the ultimate test of blind obedience to authority, many of them killed their children on his command.

Jim Jones, the pastor of Peoples Temple congregations in San Francisco and Los Angeles, had set out to create a socialist utopia in Guyana. But over time Jones was transformed from the caring, spiritual “father” of a large Protestant congregation into an Angel of Death. He instituted extended forced labor, armed guards, semistarvation diets, and daily punishments amounting to torture for the slightest breach of any of his many rules. Concerned relatives convinced a congressman and media crew to inspect the compound. But Jones arranged for them to be murdered as they left. He then gathered almost all the members at the compound and gave a long speech in which he exhorted them to take their lives by drinking cyanide-laced Kool-Aid.

Jones was surely an egomaniac; he had all of his speeches and proclamations, even his torture sessions, tape-recorded — including his final suicide harangue. In it Jones distorts, lies, pleads, makes false analogies, appeals to ideology and to transcendent future life, and outright insists that his orders be followed, all while his staff is efficiently distributing deadly poison to the hundreds gathered around him. Some excerpts from that last hour convey a sense of the death-dealing tactics he used to induce total obedience to an authority gone mad:

Please get us some medication. It’s simple. It’s simple. There’s no convulsions with it. [Of course there are, especially for the children.] . . . Don’t be afraid to die. You’ll see, there’ll be a few people land[ing] out here. They’ll torture some of our children here. They’ll torture our people. They’ll torture our seniors. We cannot have this. . . . Please, can we hasten? Can we hasten with that medication? . . . We’ve lived — we’ve lived as no other people lived and loved. We’ve had as much of this world as you’re gonna get. Let’s just be done with it. (Applause.) . . . Who wants to go with their child has a right to go with their child. I think it’s humane. . . . Lay down your life with dignity. Don’t lay down with tears and agony. There’s nothing to death. . . . It’s just stepping over to another plane. Don’t be this way. Stop this hysterics. . . . Look, children, it’s just something to put you to rest. Oh, God. (Children crying.) . . . Mother, Mother, Mother, Mother, Mother, please. Mother, please, please, please. Don’t — don’t do this. Don’t do this. Lay down your life with your child.

And they did, and they died for “Dad.”

A fitting conclusion comes from psychologist Mahzarin Banaji: “What social psychology has given to an understanding of human nature is the discovery that forces larger than ourselves determine our mental life and our actions — chief among these forces [is] the power of the social situation.”

The most dramatic instances of directed behavior change and “mind control” are not the consequence of exotic forms of influence such as hypnosis, psychotropic drugs, or “brainwashing.” They are, rather, the systematic manipulation of the most mundane aspects of human nature over time in confining settings. Motives and needs that ordinarily serve us well can lead us astray when they are aroused, amplified, or manipulated by situational forces that we fail to recognize as potent. This is why evil is so pervasive. Its temptation is just a small turn away, a slight detour on the path of life, a blur in our sideview mirror, leading to disaster.

Milgram crafted his research paradigm to find out what strategies can seduce ordinary citizens to engage in apparently harmful behavior. Many of these methods have parallels to compliance strategies used by “influence professionals” in real-world settings, such as salespeople, cult and military recruiters, media advertisers, and others. Below are ten of the most effective.

1. Prearranging some form of contractual obligation, verbal or written, to control the individual’s behavior in pseudo-legal fashion. In Milgram’s obedience study, subjects publicly agreed to accept the tasks and the procedures.

2. Giving participants meaningful roles to play (“teacher,” “learner”) that carry with them previously learned positive values and automatically activate response scripts.

3. Presenting basic rules to be followed that seem to make sense before their actual use but can then be used arbitrarily and impersonally to justify mindless compliance. The authorities will change the rules as necessary but will insist that rules are rules and must be followed (as the researcher in the lab coat did in Milgram’s experiment).

4. Altering the semantics of the act, the actor, and the action: replacing unpleasant reality with desirable rhetoric, gilding the frame so that the real picture is disguised, turning “hurting victims” into “helping the experimenter.” We can see the same semantic framing at work in advertising, where, for example, bad-tasting mouthwash is framed as good for you because it kills germs and tastes like medicine.

5. Creating opportunities for the diffusion or abdication of responsibility for negative outcomes, such that the one who acts won’t be held liable. In Milgram’s experiment, the authority figure, when questioned by a teacher, said he would take responsibility for anything that happened to the learner.

6. Starting the path toward the ultimate evil act with a small, seemingly insignificant first step, the easy “foot in the door” that opens onto subsequent, greater compliance pressures. In the obedience study, the initial shock was only a mild 15 volts. This is also the operative principle in turning good kids into drug addicts with that first little hit or sniff.

7. Making each successive step on the pathway gradual, so that it is hardly noticeable as different from one’s most recent prior action. “Just a little bit more.”

8. Gradually changing the nature of the authority figure from initially “just” and reasonable to “unjust” and demanding, even irrational. This tactic elicits initial compliance and later confusion, since we expect consistency from authorities and friends. Not acknowledging that this transformation has occurred leads to mindless obedience. And it is part of many date rape scenarios and a reason why abused women stay with their abusing spouses.

9. Making the exit costs high and the process of exiting difficult; allowing verbal dissent, which makes people feel better about themselves, while insisting on behavioral compliance.

10. Offering a “big lie” to justify the use of any means to achieve the seemingly desirable, essential goal. (In Milgram’s research the justification was that science will help people improve their memory by judicious use of reward and punishment.) In social psychology experiments, this is known as the “cover story”; it is a cover-up for the procedures that follow, which do not make sense on their own. The real-world equivalent is an ideology, typically “threats to national security,” which most nations invoke before going to war or suppressing political opposition. When citizens fear that their national security is being threatened, they become willing to surrender their basic freedoms in exchange. Erich Fromm’s classic analysis in Escape from Freedom made us aware of this trade-off, which Hitler and other dictators have long used to gain and maintain power.

Procedures like these are used when those in authority know that few would engage in the endgame without being prepared psychologically to do the unthinkable. But people who understand their own impulses to join with a group and to obey an authority may be able also to withstand those impulses at times when the mandate from outside comes into conflict with their own values and conscience. In the future, when you are in a compromising position where your compliance is at issue, thinking back to these ten stepping-stones to mindless obedience may enable you to step back and not go all the way down the path — their path. A good way to avoid crimes of obedience is to assert one’s personal authority and to always take full responsibility for one’s actions. Resist going on automatic pilot, be mindful of situational demands on you, engage your critical thinking skills, and be ready to admit an error in your initial compliance and to say, “Hell, no, I won’t go your way.”

* * *

Below you can find several excellent videos of the Jonestown massacre and the circumstances leading up to it.

From PBS, here is a (fuzzy) 84-minute video, “Jonestown: The Life And Death Of Peoples Temple.”

Here is a briefer but clearer 45-minute video, “Jonestown: The Final Report.”

Posted in Book, Classic Experiments, History, Morality, Social Psychology, Video | Tagged: , , , , , | Leave a Comment »

The Situation of John Yoo and the Torture Memos

Posted by The Situationist Staff on May 13, 2008

Situationist friend Andrew Perlman recently published a terrific editorial in The National Law Journal on the situation of John Yoo, “The ‘Torture Memos’: Lessons for all of us.” Here are a few excerpts.

* * *

It is easy to believe that John Yoo wrote his widely discredited “torture memos” because he holds radical views of presidential authority or because he has some unusual moral failing. The reality, however, may be far more ordinary and disturbing: He willfully followed the lead of White House officials who were eager to find a legal justification for torture. The banality of Yoo’s compliance shouldn’t excuse him in any way, but his mistakes can help us understand why attorneys might offer equally troubling legal advice in much less public settings.

We can draw some valuable insights in this regard from one of the most stunning social psychology experiments ever conducted. More than 40 years ago, Stanley Milgram found that, under the right conditions, an experimenter could successfully order more than 60% of adults to administer what they believed to be painful and dangerous electric shocks to an innocent, bound older man with a heart condition, despite the man’s repeated pleas to be let go. In essence, Milgram found that people are surprisingly likely to obey authority figures under certain conditions.

Social psychologists have identified many of the conditions that tend to promote this type of wrongful obedience . . . . [For a related list, see Zimbardo’s Situationist post here.]

Notably, the conditions that produced obedience in Milgram’s experiments probably also existed at the Office of Legal Counsel (OLC) when Yoo wrote his infamous memos.

* * *

Of course, none of this justifies Yoo’s conduct or excuses him in any way. Indeed, Jack Goldsmith, the subsequent head of the OLC, rescinded some of Yoo’s memos and successfully resisted the same pressures that Yoo faced. Nevertheless, by trying to understand why Yoo would have offered fundamentally wrong legal advice, we can gain insights into why other lawyers in more commonplace professional settings might offer similarly bad advice to powerful figures, whether they are White House officials, important law firm partners or corporate executives. Our obedience might take the form of following a client’s request to bury a discoverable “smoking gun” document or offering evidence that we know is false, but the forces at work are ultimately the same.

* * *

To read all of Professor Perlman’s article, click here. In addition, we recommend his law review article, “Unethical Obedience by Subordinate Attorneys: Lessons from Social Psychology” (forthcoming, 36 Hofstra Law Review, 2007; available on SSRN), the abstract of which we have pasted below.

* * *

This Article explores the lessons that we can learn from social psychology regarding a lawyer’s willingness to comply with authority figures, such as senior partners or deep-pocketed clients, when they make unlawful or unethical demands. The Article reviews some of the basic literature in social psychology regarding conformity and obedience, much of which emphasizes the importance of context as a primary factor in predicting people’s behavior. The Article then contends that lawyers frequently find themselves in the kinds of contexts that produce high levels of conformity and obedience and low levels of resistance to illegal or unethical instructions. The result is that subordinate lawyers will find it difficult to resist a superior’s commands in circumstances that should produce forceful dissent. Finally, the Article proposes several changes to existing law in light of these insights, including giving lawyers the benefit of whistleblower protection, strengthening a lawyer’s duty to report the misconduct of other lawyers, and enhancing a subordinate lawyer’s responsibilities upon receiving arguably unethical instructions from a superior.

* * *

To watch a fascinating 4-minute interview by Bill Moyers of Jack Goldsmith about the situation of the DOJ decision making, click on the video below.

To view a 45-minute video in which Jack Goldsmith discusses the surprising role of lawyers in the war on terror, click on the following video.

For a sample of related posts, see the series by Situationist contributor Sung Hui Kim, “Why Do Lawyers Acquiesce In Their Clients’ Misconduct?,” Part I, Part II, and Part III, and the post by Situationist contributor David Yosifon, “On the Ethical Obligations of Lawyers.”

Finally, to listen to a fascinating set of This American Life stories of “the Bush Administration, its unique style of asserting presidential authority, and its quest to redefine the limits of presidential power” generally, click here.

Posted in Abstracts, History, Law, Morality, Politics, Social Psychology, Uncategorized, Video | Tagged: , , , , , , , , , | 1 Comment »