The Situationist

Posts Tagged ‘Phil Zimbardo’

Phil Zimbardo at HLS “We Need Heroes”

Posted by The Situationist Staff on March 6, 2012

Posted in Altruism, Classic Experiments, Events, Life, Morality, Positive Psychology, Situationist Contributors, Social Psychology, Video | Leave a Comment »

Zimbardo’s Stanford Prison and Kingsfield’s Harvard Law

Posted by The Situationist Staff on October 31, 2011

Last week, Phil Zimbardo delivered another remarkable lecture at Harvard Law School — this time tracing his journey from studying evil to inspiring heroism.  We hope to post that video in several weeks.  For his introduction, Situationist Editor Jon Hanson assembled this short video comparing Professor Zimbardo’s Prison Experiment and Professor Kingsfield’s Harvard Law School (The Paper Chase), both of which reached their 40th anniversary this year.

Posted in Classic Experiments, Education, Events, Situationist Contributors, Social Psychology, Video | Leave a Comment »

Zimbo at HLS Today!

Posted by The Situationist Staff on October 26, 2011

Posted in Events, Situationist Contributors, Social Psychology | Leave a Comment »

Evil No! Heroes Yes!! (Zimbardo returns to Harvard Law)

Posted by The Situationist Staff on October 23, 2011

Open to the public.

Posted in Events, Situationist Contributors, Social Psychology | 1 Comment »

The Power of the Situation

Posted by The Situationist Staff on March 28, 2011

From Discovering Psychology:

This program explores psychologists’ attempts to understand human behavior within its broader social context. It also examines how beliefs and behavior can be influenced and manipulated by other people and subtle situational forces.

Posted in Classic Experiments, Morality, Social Psychology, Video | 3 Comments »

Dr. Z. on Dr. Phil

Posted by The Situationist Staff on December 7, 2010

Heroic Imagination in Action, December 9, 2010.

Situationist Contributor Phil Zimbardo will co-host the Dr. Phil TV show on Thursday, Dec. 9, 2010 (for local airing times, see www.drphil.com).

This program continues an earlier show (Oct. 25, 2010) that focused on The Lucifer Effect, understanding how good people can turn evil, and centered on the issue of obedience to authority.

The new show builds on that theme with demonstrations of bullying by girls in groups and of the power of group dynamics and social trust as revealed in the recent “Bling Ring” Hollywood thefts, in which millions of dollars’ worth of celebrity jewelry and clothing were stolen by a group of young girls, as described by one guest.

The final component shifts focus to how “bad kids” can turn good and even act heroically. A former member of a criminal gang in Los Angeles describes his motives for joining the gang, its illegal activities, being shot at and arrested, and his final transformation. He describes going beyond just quitting the gang to working to prevent others from joining destructive gangs and to helping them escape gang life. In a dramatic highlight, this young man says, “I am putting my life in danger just being here (on this public show).” His actions, like those of others like him, are heroic because of the high personal costs and risks entailed in such socially focused service.

Dr. Phil ends the show with laudatory comments about how such actions are part of what the Heroic Imagination Project (HIP) is investigating and encouraging, and he urges his audience to visit the project’s web site (here).

* * *

The HIP research team is now investigating the nature of such transformations of the psychology of enmity and violence into the psychology of compassion and heroic action through detailed interviews with dozens of “heroic, former gang members.”

For a sample of related Situationist posts, see “Situationist Phil Zimbardo Takes Over the Dr. Phil Show,” “The Devil You Know . . .,” and “From Heavens to Hells to Heroes – Part II.”

Posted in Events, Situationist Contributors, Social Psychology | Leave a Comment »

Situationist Phil Zimbardo Takes Over the Dr. Phil Show

Posted by The Situationist Staff on October 25, 2010

Here is a brief promotional piece to highlight the Heroic Imagination Project and Situationist Contributor Phil Zimbardo’s upcoming appearances on Dr. Phil.

Visit www.heroicimagination.org to learn more and www.drphil.com for show times.

You can watch video clips from today’s show here.

Posted in Classic Experiments, Entertainment, Situationist Contributors, Social Psychology, Video | Leave a Comment »

Zimbardo Interview at The Believer

Posted by Thomas Nadelhoffer on September 6, 2009

Philosopher Tamler Sommers was kind enough to post a link over at the Garden of Forking Paths to an interview he did with Situationist contributor Philip Zimbardo that appears in the latest edition of The Believer.  Here is the first question and answer from the interview:

***

THE BELIEVER: I take it that one of the goals of the Stanford Prison Experiment was to build on Milgram’s results that demonstrated the power of situational elements. Is that right?

PHILIP ZIMBARDO: It was really to broaden his message and put it to a higher-level test. In Milgram’s study, we don’t know about those thousand people who answered the ad. His subjects were not Yale students, although he did it at Yale. They were a thousand ordinary citizens from New Haven and Bridgeport, Connecticut, ages twenty to fifty, and in his advertisement in the newspaper he said: college students and high-school students cannot be used. It could have been a selection of people who were more psychopathic. For our study, we picked only two dozen of seventy-five who applied, who on seven different personality tests were normal or average. So we knew there were no psychopaths, no deviants. Nobody had been in therapy, and even though it was a drug era, nobody (at least in the reports) had taken anything more than marijuana, and they were physically healthy at the time. So the question was: Suppose you had only kids who were normally healthy, psychologically and physically, and they knew they would be going into a prison-like environment and that some of their civil rights would be sacrificed. Would those good people, put in that bad, evil place—would their goodness triumph?

***

That situationist snippet should convince you to check out the rest of the interview!  Also, it is worth pointing out that Sommers has a forthcoming collection entitled A Very Bad Wizard: Morality Behind the Curtain, which includes past interviews with philosophers and psychologists such as Galen Strawson, Michael Ruse, Jon Haidt, Frans de Waal, Steve Stich, Josh Greene, Liane Young, Joe Henrich, William Ian Miller, and Zimbardo.  So, make sure to check it out as well once it comes out.

For a sample of related Situationist posts, see, “Milgram Remake,” “Zimbardo on Milgram and Obedience Part I,” “Zimbardo on Milgram and Obedience Part II,” and “Zimbardo Lecture on How Good People Turn Evil.”

Posted in Classic Experiments, Philosophy, Public Policy, Social Psychology, Uncategorized | 1 Comment »

Bush, Cheney, Rumsfeld, and Tenet: “Guilty”

Posted by Philip Zimbardo on May 8, 2009

More than 10,000 people cast their votes during the last year and a half in a virtual voting booth at www.LuciferEffect.com. Their judgments accord with the recent Senate Armed Services bipartisan report that blames Bush officials for detainee abuse. It also finds that the prison guards and interrogators were not the “true culprits.”

The vast majority of these voters found all four Bush officials guilty of having created the legal frameworks, laws, and motivational conditions that provided the foundation for the abuses and torture of detainees at Abu Ghraib and Guantanamo Bay prisons. The guilty verdicts (for George W. Bush, Dick Cheney, Donald Rumsfeld, and George Tenet) held regardless of political preference, across all age groups, and whether or not voters had read The Lucifer Effect before voting.

Democrats were more likely to vote guilty than were those identified as Republicans, but even so, the majority of Republicans found each of the four officials guilty:

  • Bush: 95% (Democrat) to 57% (Republican);
  • Cheney: 88% to 72%;
  • Rumsfeld: 89% to 72%;
  • Tenet: 83% to 70%.

Those identified as “Other” political preference overwhelmingly gave guilty verdicts to all four:

  • 93% Bush;
  • 96% Cheney;
  • 95% Rumsfeld; and
  • 89% Tenet.

The percentage of guilty votes increased systematically with the age of the voters for all four officials: 86% of those under age 21 found George W. Bush guilty, as did 89% of those 21-40, 93% of those 41-60, and a high of 97% of voters over the age of 60.

For Dick Cheney, the guilty verdicts were even higher at each age level, from 88% under 21, to 93% for 21-40, to 97% for 41-60, and a maximum of 99% for senior voters. Similar patterns can be seen for former Sec. of Defense Rumsfeld and former head of the CIA, Tenet.

My involvement with trying to understand the causes of the abuses and torture of Iraqi prisoners at Abu Ghraib began when I agreed to be part of the defense team organized by Gary Myers, legal counsel for one of the Army Reserve Military Police, Staff Sergeant Chip Frederick. In that role, I read all of the many investigative reports by various generals and one headed by James Schlesinger, former Sec. of Defense. I also read all of the relevant Human Rights Watch reports, International Red Cross reports, and more. I spoke with interrogators, military criminal investigators, and senior military officers who were on that scene. After in-depth interviews with Chip Frederick and reviewing his psychological evaluation by a military specialist, and his prior service record, I felt competent in rendering the judgment that he was a “good apple.” And further, that the conditions he and the other MPs were forced to work in and live in constituted the “Bad Barrel” that corrupted him and the other prison guards on the Tier 1A night shift (where all the abuses occurred).

These findings were summarized in two chapters of a book I wrote subsequently, Chapters 14 and 15 of The Lucifer Effect (Random House, 2007). While military justice put Frederick and many of the other MPs on trial for the abuses they had perpetrated on individuals they were supposed to protect while in their custody, none of the officers who should have been in charge were ever tried. Those abuses took place over more than three months in the fall of 2003 before being exposed. Command complicity involves responsibility for illegal or immoral behavior of one’s subordinates that officers should have known about – had they cared enough to be watching the store or the torture dungeon.

My summation to the military prosecutor in Frederick’s trial (2004) stated that although the soldier on trial was guilty of the abuses for which he was charged (for which he got an 8-year prison sentence), it was the Situation and the System that were also responsible. The Situation is the complex set of environmental circumstances in operation on the night shift in the interrogation center of Tier 1A—that created horrendous conditions for our soldiers as well as the detainees. The System includes those in charge of creating and maintaining those situations by means of resource allocation, legal rules, and top-down pressures for “actionable intelligence” by all means necessary.

I ended my conceptual analysis with a call for readers of my Lucifer Effect book to play the role of jurors in deciding on the guilt and accountability of some of the military command in charge at Abu Ghraib, along with Bush officials who were the ultimate Systems Managers. However, the World-Wide Web allows us to go beyond a rhetorical message of how one might vote in this case to creating a virtual voting booth where many people could openly register their vote on the guilt of the civilian officials whom they considered to be responsible for some of these abuses and tortures.

The summary of these votes by more than 10,000 people attests to the widespread public understanding that the abuses of human rights and integrity that have been perpetrated under the banner of protecting Homeland Security are traceable up to the highest levels of our government, and not just down to the foot soldiers doing their dirty work in the trenches of war. It is encouraging that the Senate Armed Services Committee also supports this viewpoint in blaming our leaders and not just the followers.

* * *

For related Situationist posts, see “Lessons Learned from the Abu Ghraib Horrors,” “The Devil You Know . . . ,” “Common Cause: Combating the Epidemics of Obesity and Evil,” “Person X Situation X System Dynamics,” “The Lucifer Effect Lecture at Harvard Law School,” “From Heavens to Hells to Heroes – Part I,” “From Heavens to Hells to Heroes – Part II,” and “Jonestown (The Situation of Evil) Revisited.”

Posted in History, Ideology, Law, Politics, Situationist Contributors, Social Psychology | 1 Comment »

The Situational Effect of Groups

Posted by The Situationist Staff on April 17, 2009

In his Guardian article, “Hands up if you’re an individual,” Stuart Jeffries offers a brief summary of some social psychology classics.  Below, we have included excerpts.  After reviewing Milgram’s famous experiments on obedience, Jeffries writes:

* * *

This was one of the classic experiments of group psychology, though not all have involved duping volunteers into believing they had electrocuted victims. Group psychology has often involved experiments to explain how individuals’ behaviours, thoughts and feelings are changed by group pressures.

It is generally thought to have originated in 1898 when Indiana University psychologist Norman Triplett asked children to spin a fishing reel as fast as they could. He found that when the children were doing the task together they did so much faster than when alone. Triplett found a similar result when studying cyclists – they tended to record faster times when riding in groups rather than alone, a fact that he explained because the “bodily presence of another contestant participating simultaneously in the race serves to liberate latent energy not ordinarily available”.

More than a century later, social psychology explores how other people make us what we are; how unconscious, sometimes ugly, impulses make us compliant and irrational. Why, for example, do I smoke even though I know it could be fatal? How can there be such a gap between my self-image and my behaviour (this is known as cognitive dissonance)?

Why do high-level committees of supposed experts make disastrous decisions (for example, when a Nasa committee dismissed technical staff warnings that the space shuttle Challenger should not be launched, arguing that technical staff were just the kind of people to make such warnings – this is seen as a classic case of so-called “groupthink”)?

Why do we unconsciously obey others even when this undermines our self-images (this is known as social influence)? What makes us into apathetic bystanders when we see someone attacked in the street – and what makes us have-a-go heroes? What makes peaceful crowds turn into rioting mobs?

Group psychological studies can have disturbing ramifications. Recently, Harvard psychologist [and Situationist contributor] Mahzarin Banaji used the so-called implicit association test to demonstrate how unconscious beliefs inform our behaviour. [Sh]e concluded from [her] research that the vast majority of white, and many black respondents recognised negative words such as “angry”, “criminal” or “poor” more quickly after briefly seeing a black face than a white one. . . .

* * *

The nature of conformism has obsessed social psychologists for decades. In 1951, psychologist Solomon Asch did an experiment in which volunteers were asked to judge the correct length of a line by comparing it with three sample lines. The experiment was set up so that there was an obviously correct answer. But Asch had riddled a group with a majority of stooges who deliberately chose the wrong answer. The pressure of the majority told on Asch’s volunteers. He found that 74% conformed with the wrong answer at least once, and 32% did so all the time.

What impulses were behind such conformism? Social psychologists have long considered that we construct our identities on the basis of others’ attitudes towards us. Erving Goffman, in The Presentation of Self in Everyday Life (1959), analysed social encounters as if each person was engaged in a dramatic performance, and suggested that each such actor was a creation of its audience.

Through such performances of self we internalise role expectations and gain positive self-esteem. We cast other individuals and groups in certain roles. Such behaviour may make some of us unconscious racists, but it also lubricates the wheels of social life.

French psychologist Serge Moscovici developed what is called social representation theory, arguing that shared beliefs and explanations held by a group or society help people to communicate effectively with one another. He explored the notion of anchoring, whereby new ideas or events in social life are given comforting redescriptions (or social representations). For example, a group of protesters against a motorway might be described demeaningly by the road lobby as a “rent-a-mob,” while the protesters themselves might anchor themselves more flatteringly as “eco-warriors”.

* * *

Social psychologists have also been long-obsessed by the psychology of crowds. In 1895, French social psychologist Gustave le Bon described crowds as mobs in which individuals lost their personal consciences. His book, The Crowd: A Study of the Popular Mind, influenced Hitler and led many later psychologists to take a dim view of crowds.

After the war, German critical theorist Theodor Adorno wrote of the destructive nature of “group psychology.” Even as late as 1969, Stanford psychologist [and Situationist contributor] Philip Zimbardo argued that a process of deindividuation makes participants in crowds less rational.

Most recent crowd psychology has not been content to brand crowds necessarily irrational. Instead, it has divided into contagion theory (whereby crowds cause people to act in a certain way), convergence theory (where crowds amount to a convergence of already like-minded individuals) and emergent norm theory (where crowd behaviour reflects the desires of participants, but it is also guided by norms that emerge as the situation unfolds). . . .

In the age of MySpace, Facebook and online dating, group psychologists are now trying to find out what goes on when we present ourselves to the world online, how we are judged for doing so and how groups are formed online. Other social psychology touches on such voguish areas of research as social physics (which contends that physical laws might explain group behaviour) and neuroeconomics (which looks at the role of the brain when we evaluate decisions and interact with each other), but the age-old concerns remain part of our zeitgeist.

* * *

You can read the entire article here.  For a sample of Situationist posts examining the interaction of individuals and groups, see “The Situational Benefits of Outsiders,” “Racism Meets Groupism and Teamism,” “‘Us’ and ‘Them,’” “The Maverickiness Paradox,” “Four Failures of Deliberating Groups – Abstract,” “Team-Interested Decision Making,” “History of Groupthink,” “Some (Interior) Situational Sources of War – Part I,” and “March Madness.”  To read some of the previous Situationist posts describing or discussing classic experiments from social psychology and related fields, click here.

Posted in Choice Myth, Classic Experiments, Conflict, Implicit Associations, Situationist Contributors, Social Psychology | 1 Comment »

Zimbardo on Milgram and Obedience – Part I

Posted by The Situationist Staff on April 14, 2009

Situationist contributor Philip Zimbardo has authored the preface to a new edition of social psychologist Stanley Milgram’s pathbreaking and now-classic book Obedience to Authority.  This is the first of a two-part series derived from that preface.  In this post, Zimbardo describes the inculcation of obedience and Milgram’s role as a research pioneer.  In Part II, Zimbardo answers challenges to Milgram’s work and locates its legacy.

* * *

What is common about two of the most profound narratives in Western culture—Lucifer’s descent into Hell and Adam and Eve’s loss of Paradise—is the lesson of the dreadful consequences of one’s failure to obey authority. . . [T]hey are designed, as all parables are, to send a powerful message to all those who hear and read them: Obey authority at all costs! The consequences of disobedience to authority are formidable and damnable. Once created, these myths and parables get passed along by subsequent authorities, now parents, teachers, bosses, politicians, and dictators, among others, who want their word to be followed without dissent or challenge.

Thus, as school children, in virtually all traditional educational settings, the rules of law that we learned and lived were: Stay in your seat until permission is granted by the teacher to stand and leave it; do not talk unless given permission by the teacher to do so after having raised your hand to seek that recognition, and do not challenge the word of the teacher or complain. So deeply ingrained are these rules of conduct that even as we age and mature they generalize across many settings as permanent placards of our respect for authority. However, not all authority is just, fair, moral, and legal, and we are never given any explicit training in recognizing that critical difference between just and unjust authority.  The just one deserves respect and some obedience, maybe even without much questioning, while the unjust variety should arouse suspicion and distress, ultimately triggering acts of challenge, defiance, and revolution.

Stanley Milgram’s series of experiments on obedience to authority, so clearly and fully presented in this new edition of his work, represents some of the most significant investigations in all the social sciences of the central dynamics of this aspect of human nature. His work was the first to bring into the controlled setting of an experimental laboratory an investigation into the nature of obedience to authority. In a sense, he is following in the tradition of Kurt Lewin, although he is not generally considered to be in the Lewinian tradition, as Leon Festinger, Stanley Schachter, Lee Ross, and Richard Nisbett are, for example. Yet to study phenomena that have significance in their real world existence within the constraints and controls of a laboratory setting is at the essence of one of Lewin’s dictums of the way social psychology should proceed.

This exploration of obedience was initially motivated by Milgram’s reflections on the ease with which the German people obeyed Nazi authority in discriminating against Jews and, eventually, in allowing Hitler’s Final Solution to be enacted during the Holocaust. As a young Jewish man, he wondered if the Holocaust could be recreated in his own country, despite the many differences in those cultures and historical epochs. Though many said it could never happen in the United States, Milgram doubted whether we should be so sure. Believing in the goodness of people does not diminish the fact that ordinary, even once good people, just following orders, have committed much evil in the world.

British author C. P. Snow reminds us that more crimes against humanity have been committed in the name of obedience than disobedience. Milgram’s mentor, Solomon Asch, had earlier demonstrated the power of groups to sway the judgments of intelligent college students regarding false conceptions of visual reality. But that influence was indirect, creating a discrepancy between the group norm and the individual’s perception of the same stimulus event.

Conformity to the group’s false norm was the resolution to that discrepancy, with participants behaving in ways that would lead to group acceptance rather than rejection. Milgram wanted to discover the direct and immediate impact of one powerful individual’s commands to another person to behave in ways that challenged his or her conscience and morality. He designed his research paradigm to pit our general beliefs about what people would do in such a situation against what they actually did when immersed in that crucible of human nature.

* * *

We’ll post Part II of this series later this week.  You can review a sizeable collection of Situationist posts discussing the work of Stanley Milgram here.

Posted in Book, Classic Experiments, Conflict, Social Psychology | 4 Comments »

The Situational Power of Anonymity

Posted by The Situationist Staff on December 2, 2008

Sam Sommers has another first-rate situationist post, titled “Aggressive Drivers Anonymous” over on the Psychology Today Blog.  Here are some excerpts.

* * *

Last week I was driving my daughters to a birthday party when I pulled over at an intersection to let a fire engine through. Naturally, one driver, in a green Nissan, decided to use the speeding truck as his personal blocking back, tailing close behind and passing those of us who had pulled to the side. He made just enough progress before getting to the stoplight that I found myself totally cut off once the truck passed, forced to sit there and wait through yet another cycle of the light. I could have just let the transgression go, of course, but I felt an uncontrollable urge to honk my horn at Green Nissan as we waited at the red light.

In fact, I didn’t just give a quick honk of irritation. No, I gave him two distinct honks—the first a brief one to announce my presence with authority, the second a longer, drawn-out one to remind Green Nissan, as he waited at the light, that 1) he did something wrong, 2) I know he did something wrong, and 3) I know that he knows he did something wrong.

* * *

. . . [D]istracted, it took me a few minutes to realize that, two miles later, I was still right behind Green Nissan. And I had made a few turns since the fire truck went by. I started to get the sinking sensation that we might be headed to the same destination, a hypothesis further supported by the sight of a car seat in the back of his car as well.

Why this gave me an uneasy feeling, I cannot pinpoint precisely, as there were multiple factors at play. But of one thing I’m quite sure—the freedom I felt to honk my horn aggressively at Green Nissan from my safe, anonymous seat behind the windshield quickly dissipated at the mere thought of having a face-to-face encounter with him in an open parking lot.

Was I afraid of an actual physical confrontation? Not really. . . .

More likely, I was thinking about the fact that I really don’t enjoy confrontations of any type, my zealous horn-honking notwithstanding. Moreover, I don’t relish being thought of as a jerk, and it was beginning to dawn on me that this was probably precisely what Green Nissan thought of me. Maybe I hadn’t been a jerk per se, but had I overreacted at least a tad? Sure. And while all these thoughts were running through my head, I found myself following Green Nissan through yet another pair of turns, one left, one right. I slumped a bit further down in my seat as I drove on.

* * *

The experience just served to crystallize for me how powerful it is to feel anonymous in a situation, particularly when it comes to the manifestation of aggression. As [Situationist contributor] Phil Zimbardo has written, feeling anonymous and deindividuated leads college students to administer greater levels of shocks to fellow students in laboratory studies. Along similar, albeit graver lines, perpetrators of violence, whether vigilante or state-sponsored in origin, often disguise themselves in hoods, masks, or make-up. And it’s no coincidence that the harshest, most aggressive verbal swipes taken in cyberspace usually come from anonymous sources as well.

Research even speaks directly to my very experiences on the road last week, illustrating that feelings of anonymity lead to increases in aggressive driving. In retrospect, that’s exactly how I felt behind the wheel when Green Nissan cut me off: anonymous. I knew he could catch a glimpse of me if he turned to look, but I assumed we were heading in different directions and I was never going to see him again. That liberated me to act in ways I’d never dream of in face-to-face interaction.

* * *

. . . . Just yet another demonstration of the power of subtle situational factors: It’s amazing how something as simple as sitting behind the wheel of a car can be enough to lead to such transformations in identity and behavior.

* * *

To read his entire post (with links), click here.  For some related Situationist posts, see “Deindividuation and Seung Hui Cho,” “The Devil You Know . . . ,” “The Social Awkwardness of Online Snubbing,” “Alone Together – The Commuter’s Situation,” and “Internet Disinhibition.”

Posted in Conflict, Social Psychology, Uncategorized | 1 Comment »

Jonestown (The Situation of Evil) Revisited

Posted by Philip Zimbardo on November 17, 2008

With the 30th Anniversary of the Jonestown Mass Suicide upon us, now is a good time to republish the three-part Situationist series from 2007 on the “Situational Sources of Evil” — published also in the January/February 2007 edition of the Yale Alumni Magazine and based on my book, The Lucifer Effect: Understanding How Good People Turn Evil (Random House, March 2007).

* * *

Imagine that you have responded to an advertisement in the New Haven newspaper seeking subjects for a study of memory. A researcher whose serious demeanor and laboratory coat convey scientific importance greets you and another applicant at your arrival at a Yale laboratory in Linsly-Chittenden Hall. You are here to help science find ways to improve people’s learning and memory through the use of punishment. The researcher tells you why this work may have important consequences. The task is straightforward: one of you will be the “teacher” who gives the “learner” a set of word pairings to memorize. During the test, the teacher will give each key word, and the learner must respond with the correct association. When the learner is right, the teacher gives a verbal reward, such as “Good” or “That’s right.” When the learner is wrong, the teacher is to press a lever on an impressive-looking apparatus that delivers an immediate shock to punish the error.

The shock generator has 30 switches, starting from a low level of 15 volts and increasing by 15 volts at each higher level. The experimenter tells you that every time the learner makes a mistake, you have to press the next switch. The control panel shows both the voltage of each switch and a description. The tenth level (150 volts) is “Strong Shock”; the 17th level (255 volts) is “Intense Shock”; the 25th level (375 volts) is “Danger, Severe Shock.” At the 29th and 30th levels (435 and 450 volts) the control panel is marked simply with an ominous XXX: the pornography of ultimate pain and power.

You and another volunteer draw straws to see who will play each role; you are to be the teacher, and the other volunteer will be the learner. He is a mild-mannered, middle-aged man whom you help escort to the next chamber. “Okay, now we are going to set up the learner so he can get some punishment,” the experimenter tells you both. The learner’s arms are strapped down and an electrode is attached to his right wrist. The generator in the next room will deliver the shocks. The two of you communicate over an intercom, with the experimenter standing next to you. You get a sample shock of 45 volts — the third level, a slight tingly pain — so you have a sense of what the shock levels mean. The researcher then signals you to start.

Initially, your pupil does well, but soon he begins making errors, and you start pressing the shock switches. He complains that the shocks are starting to hurt. You look at the experimenter, who nods to continue. As the shock levels increase in intensity, so do the learner’s screams, saying he does not think he wants to continue. You hesitate and question whether you should go on. But the experimenter insists that you have no choice.

In 1949, seated next to me in senior class at James Monroe High School in the Bronx, New York, was my classmate, Stanley Milgram. We were both skinny kids, full of ambition and a desire to make something of ourselves, so that we might escape life in the confines of our ghetto experience. Stanley was the little smart one who we went to for authoritative answers. I was the tall popular one, the smiling guy other kids would go to for social advice.

I had just returned to Monroe High from a horrible year at North Hollywood High School, where I had been shunned and friendless (because, as I later learned, there was a rumor circulating that I was from a New York Sicilian Mafia family). Back at Monroe, I would be chosen “Jimmy Monroe” — most popular boy in Monroe High School’s senior class. Stanley and I once discussed how that transformation could happen. We agreed that I had not changed; the situation was what mattered.

Situational psychology is the study of the human response to features of our social environment, the external behavioral context, above all to the other people around us. Stanley Milgram and I, budding situationists in 1949, both went on to become academic social psychologists. We met again at Yale in 1960 as beginning assistant professors — him starting out at Yale, me at NYU. Some of Milgram’s new research was conducted in a modified laboratory that I had fabricated a few years earlier as a graduate student — in the basement of Linsly-Chittenden, the building where we taught Introductory Psychology courses. That is where Milgram was to conduct his classic and controversial experiments on blind obedience to authority.

Milgram’s interest in the problem of obedience came from deep personal concerns about how readily the Nazis had obediently killed Jews during the Holocaust. His laboratory paradigm, he wrote years later, “gave scientific expression to a more general concern about authority, a concern forced upon members of my generation, in particular upon Jews such as myself, by the atrocities of World War II.”

As Milgram described it, he hit upon the concept for his experiment while musing about a study in which one of his professors, Solomon Asch, had tested how far subjects would conform to the judgment of a group. Asch had put each subject in a group of coached confederates and asked every member, one by one, to compare a set of lines in order of length. When the confederates all started giving the same obviously false answers, 70 percent of the subjects agreed with them at least some of the time.

Milgram wondered whether there was a way to craft a conformity experiment that would be “more humanly significant” than judgments about line length. He wrote later: “I wondered whether groups could pressure a person into performing an act whose human import was more readily apparent; perhaps behaving aggressively toward another person, say by administering increasingly severe shocks to him. But to study the group effect . . . you’d have to know how the subject performed without any group pressure. At that instant, my thought shifted, zeroing in on this experimental control. Just how far would a person go under the experimenter’s orders?”

How far up the scale do you predict that you would go under those orders? Put yourself back in the basement with the fake shock apparatus and the other “volunteer” — actually the experimenter’s confederate, who always plays the learner because the “drawing” is rigged — strapped down in the next room. As the shocks proceed, the learner begins complaining about his heart condition. You dissent, but the experimenter still insists that you continue. The learner makes errors galore. You plead with your pupil to concentrate; you don’t want to hurt him. But your concerns and motivational messages are to no avail. He gets the answers wrong again and again. As the shocks intensify, he shouts out, “I can’t stand the pain, let me out of here!” Then he says to the experimenter, “You have no right to keep me here!” Another level up, he screams, “I absolutely refuse to answer any more! You can’t hold me here! My heart’s bothering me!”

Obviously you want nothing more to do with this experiment. You tell the experimenter that you refuse to continue. You are not the kind of person who harms other people in this way. You want out. But the experimenter continues to insist that you go on. He reminds you of the contract, of your agreement to participate fully. Moreover, he claims responsibility for the consequences of your shocking actions. After you press the 300-volt switch, you read the next keyword, but the learner doesn’t answer. “He’s not responding,” you tell the experimenter. You want him to go into the other room and check on the learner to see if he is all right. The experimenter is impassive; he is not going to check on the learner. Instead he tells you, “If the learner doesn’t answer in a reasonable time, about five seconds, consider it wrong,” since errors of omission must be punished in the same way as errors of commission — that is a rule.

As you continue up to even more dangerous shock levels, there is no sound coming from your pupil’s shock chamber. He may be unconscious or worse. You are truly disturbed and want to quit, but nothing you say works to get your exit from this unexpectedly distressing situation. You are told to follow the rules and keep posing the test items and shocking the errors.

Now try to imagine fully what your participation as the teacher would be. If you actually go all the way to the last of the shock levels, the experimenter will insist that you repeat that XXX switch two more times. I am sure you are saying, “No way would I ever go all the way!” Obviously, you would have dissented, then disobeyed and just walked out. You would never sell out your morality. Right?

Milgram once described his shock experiment to a group of 40 psychiatrists and asked them to estimate the percentage of American citizens who would go to each of the 30 levels in the experiment. On average, they predicted that less than 1 percent would go all the way to the end, that only sadists would engage in such sadistic behavior, and that most people would drop out at the tenth level of 150 volts. They could not have been more wrong.

In Milgram’s experiment, two of every three (65 percent) of the volunteers went all the way up to the maximum shock level of 450 volts. The vast majority of people shocked the victim over and over again despite his increasingly desperate pleas to stop. Most participants dissented from time to time and said they did not want to go on, but the researcher would prod them to continue.

Over the course of a year, Milgram carried out 19 different experiments, each one a different variation of the basic paradigm. In each of these studies he varied one social psychological variable and observed its impact. In one study, he added women; in others he varied the physical proximity or remoteness of either the experimenter-teacher link or the teacher-learner link; had peers rebel or obey before the teacher had the chance to begin; and more.

In one set of experiments, Milgram wanted to show that his results were not due to the authority power of Yale University. So he transplanted his laboratory to a run-down office building in downtown Bridgeport, Connecticut, and repeated the experiment as a project ostensibly of a private research firm with no connection to Yale. It made hardly any difference; the participants fell under the same spell of this situational power.

The data clearly revealed the extreme pliability of human nature: depending on the situation, almost everyone could be totally obedient or almost everyone could resist authority pressures. Milgram was able to demonstrate that compliance rates could soar to over 90 percent of people continuing to the 450-volt maximum or be reduced to less than 10 percent — by introducing just one crucial variable into the compliance recipe.

Want maximum obedience? Make the subject a member of a “teaching team,” in which the job of pulling the shock lever to punish the victim is given to another person (a confederate), while the subject assists with other parts of the procedure. Want resistance to authority pressures? Provide social models — peers who rebel. Participants also refused to deliver the shocks if the learner said he wanted to be shocked; that’s masochistic, and they are not sadists. They were also reluctant to give high levels of shock when the experimenter filled in as the learner. They were more likely to shock when the learner was remote than in proximity.

In each of the other variations on this diverse range of ordinary American citizens, of widely varying ages and occupations and of both genders, it was possible to elicit low, medium, or high levels of compliant obedience with a flick of the situational switch. Milgram’s large sample — a thousand ordinary citizens from varied backgrounds — makes the results of his obedience studies among the most generalizable in all the social sciences. His classic study has been replicated and extended by many other researchers in many countries.

Recently, Thomas Blass of the University of Maryland-Baltimore County [author of The Man Who Shocked The World and creator of the terrific website StanleyMilgram.Com] analyzed the rates of obedience in eight studies conducted in the United States and nine replications in European, African, and Asian countries. He found comparably high levels of compliance in all. The 61 percent mean obedience rate found in the U.S. was matched by the 66 percent rate found across all the other national samples. The degree of obedience was not affected by the timing of the studies, which ranged from 1963 to 1985.

Other studies based on Milgram’s have shown how powerful the obedience effect can be when legitimate authorities exercise their power within their power domains. In one study, most college students administered shocks to whimpering puppies when required to do so by a professor. In another, all but one of 22 nurses flouted their hospital’s procedure by obeying a phone order from an unknown doctor to administer an excessive amount of a drug (actually a placebo); that solitary disobedient nurse should have been given a raise and a hero’s medal. In still another, a group of 20 high school students joined a history teacher’s supposed authoritarian political movement, and within a week had expelled their fellows from class and recruited nearly 200 others from around the school to the cause.

Now we ask the question that must be posed of all such research: what is its external validity, what are real-world parallels to the laboratory demonstration of authority power?

In 1963, the social philosopher Hannah Arendt published what was to become a classic of our times, Eichmann in Jerusalem: A Report on the Banality of Evil. She provides a detailed analysis of the war crimes trial of Adolf Eichmann, the Nazi figure who personally arranged for the murder of millions of Jews. Eichmann’s defense of his actions was similar to the testimony of other Nazi leaders: “I was only following orders.” What is most striking in Arendt’s account of Eichmann is all the ways in which he seemed absolutely ordinary: half a dozen psychiatrists had certified him as “normal.” Arendt’s famous conclusion: “The trouble with Eichmann was precisely that so many were like him, and that the many were neither perverted nor sadistic, that they were, and still are, terribly and terrifyingly normal.”

Arendt’s phrase “the banality of evil” continues to resonate because genocide has been unleashed around the world and torture and terrorism continue to be common features of our global landscape. A few years ago, the sociologist and Brazil expert Martha Huggins, the Greek psychologist and torture expert Mika Haritos-Fatouros, and I interviewed several dozen torturers. These men did their daily dirty deeds for years in Brazil as policemen, sanctioned by the government to get confessions by torturing “subversive” enemies of the state.

The systematic torture by men of their fellow men and women represents one of the darkest sides of human nature. Surely, my colleagues and I reasoned, here was a place where dispositional evil would be manifest. The torturers shared a common enemy: men, women, and children who, though citizens of their state, even neighbors, were declared by “the System” to be threats to the country’s national security — as socialists and Communists. Some had to be eliminated efficiently, while others, who might hold secret information, had to be made to yield it up by torture, confess and then be killed.

Torture always involves a personal relationship; it is essential for the torturer to understand what kind of torture to employ, what intensity of torture to use on a certain person at a certain time. Wrong kind or too little — no confession. Too much — the victim dies before confessing. In either case, the torturer fails to deliver the goods and incurs the wrath of the senior officers. Learning to determine the right kind and degree of torture that yields up the desired information elicits abounding rewards and flowing praise from one’s superiors. It took time and emerging insights into human weaknesses for these torturers to become adept at their craft.

What kind of men could do such deeds? Did they need to rely on sadistic impulses and a history of sociopathic life experiences to rip and tear the flesh of fellow beings day in and day out for years on end?

We found that sadists are selected out of the training process by trainers because they are not controllable. They get off on the pleasure of inflicting pain, and thus do not sustain the focus on the goal of extracting confessions. From all the evidence we could muster, torturers were not unusual or deviant in any way prior to practicing their new roles, nor were there any persisting deviant tendencies or pathologies among any of them in the years following their work as torturers and executioners. Their transformation was entirely explainable as being the consequence of a number of situational and systemic factors, such as the training they were given to play this new role; their group camaraderie; acceptance of the national security ideology; and their learned belief in socialists and Communists as enemies of their state.

Amazingly, the transformation of these men into violence workers is comparable to the transformation of young Palestinians into suicide bombers intent on killing innocent Israeli civilians. In a recent study, the forensic psychiatrist Marc Sageman [at the Solomon Asch Center] found evidence of the normalcy of 400 al-Qaeda members. Three-quarters came from the upper or middle class. Ninety percent came from caring, intact families. Two-thirds had gone to college; two-thirds were married; and most had children and jobs in science and engineering. In many ways, Sageman concludes, “these are the best and brightest of their society.”

Israeli psychologist Ariel Merari, who has studied this phenomenon extensively for many years, outlines the common steps on the path to these explosive deaths. First, senior members of an extremist group identify young people who, based on their declarations at a public rally against Israel or their support of some Islamic cause or Palestinian action, appear to have an intense patriotic fervor. Next, they are invited to discuss how seriously they love their country and hate Israel. They are asked to commit to being trained. Those who do then become part of a small secret cell of three to five youths. From their elders, they learn bomb making, disguise, and selecting and timing targets. Finally, they make public their private commitment by making a videotape, declaring themselves to be “the living martyr” for Islam. The recruits are also told the Big Lie: their relatives will be entitled to a high place in Heaven, and they themselves will earn a place beside Allah. Of course, the rhetoric of dehumanization serves to deny the humanity and innocence of their victims.

The die is cast; their minds have been carefully prepared to do what is ordinarily unthinkable. In these systematic ways a host of normal, angry young men and women become transformed into true believers. The suicide, the murder, of any young person is a gash in the fabric of the human family that we elders from every nation must unite to prevent. To encourage the sacrifice of youth for the sake of advancing the ideologies of the old must be considered a form of evil that transcends local politics and expedient strategies.

Our final extension of the social psychology of evil from artificial laboratory experiments to real-world contexts comes to us from the jungles of Guyana. There, on November 28, 1978, an American religious leader persuaded more than 900 of his followers to commit mass suicide. In the ultimate test of blind obedience to authority, many of them killed their children on his command.

Jim Jones, the pastor of Peoples Temple congregations in San Francisco and Los Angeles, had set out to create a socialist utopia in Guyana. But over time Jones was transformed from the caring, spiritual “father” of a large Protestant congregation into an Angel of Death. He instituted extended forced labor, armed guards, semistarvation diets, and daily punishments amounting to torture for the slightest breach of any of his many rules. Concerned relatives convinced a congressman and media crew to inspect the compound. But Jones arranged for them to be murdered as they left. He then gathered almost all the members at the compound and gave a long speech in which he exhorted them to take their lives by drinking cyanide-laced Kool-Aid.

Jones was surely an egomaniac; he had all of his speeches and proclamations, even his torture sessions, tape-recorded — including his final suicide harangue. In it Jones distorts, lies, pleads, makes false analogies, appeals to ideology and to transcendent future life, and outright insists that his orders be followed, all while his staff is efficiently distributing deadly poison to the hundreds gathered around him. Some excerpts from that last hour convey a sense of the death-dealing tactics he used to induce total obedience to an authority gone mad:

Please get us some medication. It’s simple. It’s simple. There’s no convulsions with it. [Of course there are, especially for the children.] . . . Don’t be afraid to die. You’ll see, there’ll be a few people land[ing] out here. They’ll torture some of our children here. They’ll torture our people. They’ll torture our seniors. We cannot have this. . . . Please, can we hasten? Can we hasten with that medication? . . . We’ve lived — we’ve lived as no other people lived and loved. We’ve had as much of this world as you’re gonna get. Let’s just be done with it. (Applause.) . . . Who wants to go with their child has a right to go with their child. I think it’s humane. . . . Lay down your life with dignity. Don’t lay down with tears and agony. There’s nothing to death. . . . It’s just stepping over to another plane. Don’t be this way. Stop this hysterics. . . . Look, children, it’s just something to put you to rest. Oh, God. (Children crying.) . . . Mother, Mother, Mother, Mother, Mother, please. Mother, please, please, please. Don’t — don’t do this. Don’t do this. Lay down your life with your child.

And they did, and they died for “Dad.”

A fitting conclusion comes from psychologist Mahzarin Banaji: “What social psychology has given to an understanding of human nature is the discovery that forces larger than ourselves determine our mental life and our actions — chief among these forces [is] the power of the social situation.”

The most dramatic instances of directed behavior change and “mind control” are not the consequence of exotic forms of influence such as hypnosis, psychotropic drugs, or “brainwashing.” They are, rather, the systematic manipulation of the most mundane aspects of human nature over time in confining settings. Motives and needs that ordinarily serve us well can lead us astray when they are aroused, amplified, or manipulated by situational forces that we fail to recognize as potent. This is why evil is so pervasive. Its temptation is just a small turn away, a slight detour on the path of life, a blur in our sideview mirror, leading to disaster.

Milgram crafted his research paradigm to find out what strategies can seduce ordinary citizens to engage in apparently harmful behavior. Many of these methods have parallels to compliance strategies used by “influence professionals” in real-world settings, such as salespeople, cult and military recruiters, media advertisers, and others. Below are ten of the most effective.

1. Prearranging some form of contractual obligation, verbal or written, to control the individual’s behavior in pseudo-legal fashion. In Milgram’s obedience study, subjects publicly agreed to accept the tasks and the procedures.

2. Giving participants meaningful roles to play — “teacher,” “learner” — that carry with them previously learned positive values and automatically activate response scripts.

3. Presenting basic rules to be followed that seem to make sense before their actual use but can then be applied arbitrarily and impersonally to justify mindless compliance. The authorities will change the rules as necessary but will insist that rules are rules and must be followed (as the researcher in the lab coat did in Milgram’s experiment).

4. Altering the semantics of the act, the actor, and the action — replacing unpleasant reality with desirable rhetoric, gilding the frame so that the real picture is disguised: from “hurting victims” to “helping the experimenter.” The same semantic framing is at work in advertising, where, for example, bad-tasting mouthwash is framed as good for you because it kills germs and tastes like medicine.

5. Creating opportunities for the diffusion or abdication of responsibility for negative outcomes, such that the one who acts won’t be held liable. In Milgram’s experiment, the authority figure, when questioned by a teacher, said he would take responsibility for anything that happened to the learner.

6. Starting the path toward the ultimate evil act with a small, seemingly insignificant first step, the easy “foot in the door” that swings open subsequent greater compliance pressures. In the obedience study, the initial shock was only a mild 15 volts. This is also the operative principle in turning good kids into drug addicts with that first little hit or sniff.

7. Making each successive step on the pathway gradual, so that it is barely distinguishable from one’s most recent prior action. “Just a little bit more.”

8. Gradually changing the nature of the authority figure from initially “just” and reasonable to “unjust” and demanding, even irrational. This tactic elicits initial compliance and later confusion, since we expect consistency from authorities and friends. Failing to acknowledge this transformation leads to mindless obedience; it is part of many date rape scenarios and one reason why abused women stay with their abusive spouses.

9. Making the exit costs high and the process of exiting difficult; allowing verbal dissent, which makes people feel better about themselves, while insisting on behavioral compliance.

10. Offering a “big lie” to justify the use of any means to achieve the seemingly desirable, essential goal. (In Milgram’s research the justification was that science would help people improve their memory through the judicious use of reward and punishment.) In social psychology experiments, this tactic is known as the “cover story”; it is a cover-up for the procedures that follow, which would not make sense on their own. The real-world equivalent is an ideology. Most nations rely on an ideology, typically “threats to national security,” before going to war or suppressing political opposition. When citizens fear that their national security is threatened, they become willing to surrender their basic freedoms in exchange. Erich Fromm’s classic analysis in Escape from Freedom made us aware of this trade-off, which Hitler and other dictators have long used to gain and maintain power.

Procedures like these are used when those in authority know that few would engage in the endgame without being psychologically prepared to do the unthinkable. But people who understand their own impulses to join a group and to obey an authority may also be able to withstand those impulses when an outside mandate conflicts with their own values and conscience. In the future, when you are in a compromising position where your compliance is at issue, thinking back to these ten stepping-stones to mindless obedience may enable you to step back and not go all the way down the path — their path. A good way to avoid crimes of obedience is to assert your personal authority and always take full responsibility for your actions. Resist going on automatic pilot, be mindful of situational demands on you, engage your critical thinking skills, and be ready to admit an error in your initial compliance and to say, “Hell, no, I won’t go your way.”

* * *

Below you can find several excellent videos about the Jonestown massacre and the circumstances leading up to it.

From PBS, here is a (fuzzy) 84-minute video, “Jonestown: The Life and Death of Peoples Temple.”

Here is a shorter but clearer 45-minute video, “Jonestown: The Final Report.”

Posted in Book, Classic Experiments, History, Morality, Social Psychology, Video | Tagged: , , , , , | Leave a Comment »

The Split Brain and the Interior Situation of Theories of the Self

Posted by The Situationist Staff on August 26, 2008

The following (5-minute) video demonstrates the effects of split-brain surgery, in which the corpus callosum is severed. The effects are explained by Dr. Michael Gazzaniga.

From YouTube: “To reduce the severity of his seizures, Joe had the bridge between his left and right cerebral hemispheres (the corpus callosum) severed. As a result, his left and right brains no longer communicate through that pathway. Here’s what happens as a result.”

* * *

Below you can watch a (3.5-minute) clip from Situationist contributor Phil Zimbardo’s program, Discovering Psychology, in which Michael Gazzaniga discusses the essential role of the “interpreter” in creating in each of us a unique sense of self.

* * *

Below you can watch a vintage (11-minute) video in which a very young Dr. Gazzaniga goes into detail about his early split-brain research on animals and humans (including a fascinating example in which the right and left hands of a split-brain patient squabble with each other as if they belonged to two different individuals).

* * *

For a sample of related Situationist posts, see “Our Interior Situations – The Human Brain,” “Learning to Influence Our Interior Situation,” “It’s All In Your (Theory of the) Mind,” “Smart People Thinking about People Thinking about People Thinking,” “Vilayanur Ramachandran On Your Mind,” “Jonathan Haidt on the Situation of Moral Reasoning,” “Unconscious Situation of Choice,” “The Situation of Reason,” and Part I, Part II, Part III, and Part IV of “The Unconscious Situation of our Consciousness.”

Posted in Choice Myth, Classic Experiments, Neuroscience, Video | Tagged: , , , , , , , | 3 Comments »

The Situation of John Yoo and the Torture Memos

Posted by The Situationist Staff on May 13, 2008

Situationist friend Andrew Perlman recently published a terrific editorial in The National Law Journal on the situation of John Yoo, “The ‘Torture Memos’: Lessons for all of us.” Here are a few excerpts.

* * *

It is easy to believe that John Yoo wrote his widely discredited “torture memos” because he holds radical views of presidential authority or because he has some unusual moral failing. The reality, however, may be far more ordinary and disturbing: He willfully followed the lead of White House officials who were eager to find a legal justification for torture. The banality of Yoo’s compliance shouldn’t excuse him in any way, but his mistakes can help us understand why attorneys might offer equally troubling legal advice in much less public settings.

We can draw some valuable insights in this regard from one of the most stunning social psychology experiments ever conducted. More than 40 years ago, Stanley Milgram found that, under the right conditions, an experimenter could successfully order more than 60% of adults to administer what they believed to be painful and dangerous electric shocks to an innocent, bound older man with a heart condition, despite the man’s repeated pleas to be let go. In essence, Milgram found that people are surprisingly likely to obey authority figures under certain conditions.

Social psychologists have identified many of the conditions that tend to promote this type of wrongful obedience . . . . [For a related list, see Zimbardo’s Situationist post here.]

Notably, the conditions that produced obedience in Milgram’s experiments probably also existed at the Office of Legal Counsel (OLC) when Yoo wrote his infamous memos.

* * *

Of course, none of this justifies Yoo’s conduct or excuses him in any way. Indeed, Jack Goldsmith, the subsequent head of the OLC, rescinded some of Yoo’s memos and successfully resisted the same pressures that Yoo faced. Nevertheless, by trying to understand why Yoo would have offered fundamentally wrong legal advice, we can gain insights into why other lawyers in more commonplace professional settings might offer similarly bad advice to powerful figures, whether they are White House officials, important law firm partners, or corporate executives. Our obedience might take the form of following a client’s request to bury a discoverable “smoking gun” document or offering evidence that we know is false, but the forces at work are ultimately the same.

* * *

To read all of Professor Perlman’s article, click here. In addition, we recommend his law review article, “Unethical Obedience by Subordinate Attorneys: Lessons from Social Psychology” (36 Hofstra Law Review, forthcoming 2007; available on SSRN), the abstract of which we have pasted below.

* * *

This Article explores the lessons that we can learn from social psychology regarding a lawyer’s willingness to comply with authority figures, such as senior partners or deep-pocketed clients, when they make unlawful or unethical demands. The Article reviews some of the basic literature in social psychology regarding conformity and obedience, much of which emphasizes the importance of context as a primary factor in predicting people’s behavior. The Article then contends that lawyers frequently find themselves in the kinds of contexts that produce high levels of conformity and obedience and low levels of resistance to illegal or unethical instructions. The result is that subordinate lawyers will find it difficult to resist a superior’s commands in circumstances that should produce forceful dissent. Finally, the Article proposes several changes to existing law in light of these insights, including giving lawyers the benefit of whistleblower protection, strengthening a lawyer’s duty to report the misconduct of other lawyers, and enhancing a subordinate lawyer’s responsibilities upon receiving arguably unethical instructions from a superior.

* * *

To watch Bill Moyers’s fascinating 4-minute interview of Jack Goldsmith about the situation of DOJ decision making, click on the video below.

To watch Jack Goldsmith discuss the surprising role of lawyers in the war on terror, click on the following 45-minute video.

Vodpod videos no longer available.

For a sample of related posts, see the series by Situationist contributor Sung Hui Kim, “Why Do Lawyers Acquiesce In Their Clients’ Misconduct?,” Part I, Part II, and Part III, and the post by Situationist contributor David Yosifon, “On the Ethical Obligations of Lawyers.”

Finally, to listen to a fascinating set of This American Life stories of “the Bush Administration, its unique style of asserting presidential authority, and its quest to redefine the limits of presidential power” generally, click here.

Posted in Abstracts, History, Law, Morality, Politics, Social Psychology, Uncategorized, Video | Tagged: , , , , , , , , , | 1 Comment »