The Situationist

Posts Tagged ‘Evil’

Looking for the Evil Actor – Reposted

Posted by The Situationist Staff on March 10, 2012

Five years ago, Jon Hanson and Michael McCann wrote and published the following post about Joseph Kony as part of a series on the situational sources of evil. In light of the attention Kony is now getting (see the YouTube video “Kony 2012,” here or at the bottom of this post), we thought it might be worth posting again.

* * *

In Parts I, II, and III of his recent posts on the Situational Sources of Evil, Phil Zimbardo makes the case that we too readily attribute to an evil person or group what should be, at least in part, attributed to the situation. This was a key lesson of Milgram’s obedience experiments as well as Zimbardo’s Stanford Prison Experiment. And that lesson, unfortunately, seems similarly evident in far too many real-world atrocities.

There are numerous reasons, some of which those earlier posts highlighted, why the situationist lesson is an unpopular one. This post suggests another.

Think for a moment about the sort of evil that is so grotesquely apparent right now in The Sudan and Uganda, both of which are in the midst of civil wars–wars that have featured indescribably horrific acts, such as villages ravaged by soldiers who chop off limbs of children. Perhaps most harrowingly, the “evil-doers” are often children themselves, many of whom are kidnapped and then conscripted into bands of mutilating marauders.

Joseph Kony’s Lord’s Resistance Army, for example, is composed mainly of abducted children who roam northern Uganda, where “many families have lost a child through abduction, or their village . . . [has been] attacked and destroyed, families burned out and/or killed, and harvests destroyed by . . . the Lord’s Resistance Army.”

The plight of Ochola John, pictured below, exemplifies an all-too-common story: his hands, lips, nose, and ears were cut off by members of the Lord’s Resistance Army. It is a difficult image to take in (note: we opted against many other, more graphic photos).

Such atrocities have led many in Uganda to question how children could become evil incarnate:

We don’t understand how Kony could have a child soldier slash a fellow child abductee with a machete or make a group of children bite their agemate with their bare teeth till he bleeds to death.

In searching for answers, some have turned to situationist factors:

It is easy to assume that the person who commits such an atrocity is deranged or even inhuman. Sometimes it is the case. But not always. It is possible for a normal individual to commit an abnormal, sick act just because of the situation s/he finds him/herself in, and the training s/he is exposed to.

How could this happen? Zimbardo’s ten-factor list suggests some of the situationist grease that no doubt lubricates the wheels of evil. Kony’s methods and ideology are extreme, to be sure, but they are familiar: saving his country from evil by building a theocracy.

In that way, dispositionism can give way to a weak form of situationism, but only up to a point — a tendency that has elsewhere been called selective situationism or naive situationism. Kony’s evil disposition is the “situation” influencing the impressionable young boys. In the end, we place evil almost exclusively in one or a small number of actors — usually human, but sometimes supernatural. No doubt, Kony is immensely blameworthy, so much so that we, the authors, can scarcely bring ourselves even to suggest that the horrors might have multiple origins, beyond the gruesome actions of the most salient actors involved.

By locating evil ultimately in a person or group, we avoid a disconcerting possibility that there is more to the situation beyond the bad individuals. When evil comes packaged within a few human bodies, it is rendered more tractable, identifiable, and perhaps, in a way, less threatening — very “them,” and very “other.” Such a conception undermines the unsettling possibility that, because of the situation, there may be more “evil actors” behind those that we currently face. Get rid of the bad apples, we imagine, and the rest of the batch will be fine. Perhaps more important, it permits us to ignore the possibility that the barrel may be contaminating. We need not confront any apprehensions that our systems are unjust, the groups we identify with are contributing to or benefitting from that injustice, or that we individually play some causal role in it.

Joseph Kony is said to have abducted 20,000 kids in the last 20 years. But he has done so with minimal resistance from Uganda’s government, and with virtually no intervention from foreign powers.

Is there any line at which we non-salient bystanders of the world, including Americans, begin to bear some share of responsibility for suffering such as that endured by Ochola John? Maybe the answer is “no,” as most of us apparently presume. But maybe it is “yes,” and maybe that line has already been crossed.

We are not making a foreign policy recommendation here. We are simply highlighting a form of blindness that we suspect influences all policy. That is, dispositionism (and motivated attributions generally) helps us push that line of responsibility toward, if not all the way to, the vanishing point — even if it does little to reduce the atrocities themselves. Dispositionism helps us to see the apple, or perhaps the tree, and to miss the orchard and the forest and, perhaps, ourselves.

There are other examples of this tendency to allow our attributions toward salient (and often despicable) individuals to eclipse any possibility of a more complex, far-reaching causal story. Our criminal justice system is partially built upon it. Consider, also, the widespread response to Susan Sontag’s infamous New Yorker essay, in which she described the 9/11 attacks not as

a “cowardly” attack on “civilization” or “liberty” or “humanity” or “the free world” but an attack on the world’s self-proclaimed super-power, undertaken as a consequence of specific American alliances and actions. . . . And if the word “cowardly” is to be used, it might be more aptly applied to those who kill from beyond the range of retaliation, high in the sky, than to those willing to die themselves in order to kill others. In the matter of courage (a morally neutral virtue): whatever may be said of the perpetrators of Tuesday’s slaughter, they were not cowards.

Regardless of the veracity of Sontag’s claims, many Americans did not want to hear such a non-affirming interpretation in the wake of the terror. She not only implicated American policies but suggested that perhaps the attackers were not as “beneath us” as many had portrayed.

As one of us summarized in another article (with Situationist contributors Adam Benforado and David Yosifon), many conservative commentators responded to Sontag and her claims with predictable rage and disgust (while most moderates and liberals took cover in the safety of silence).

Charles Krauthammer called Sontag “morally obtuse,” while Andrew Sullivan labeled her “deranged.” John Podhoretz claimed that she exemplified the “hate-America crowd,” that out-group of Americans who are “dripping with contempt for the nation’s politics, its leaders, its economic system and for their foolish fellow citizens.” And Rod Dreher really drove home the point, saying that he wanted “to walk barefoot on broken glass across the Brooklyn Bridge, up to that despicable woman’s apartment, grab her by the neck, drag her down to ground zero and force her to say that to the firefighters.”

We see ourselves as “just,” and don’t like being “implicated” by clear injustice, a discomfort that is often assuaged by looking for the Evil Actor. But when evil continues, even after the evil individuals have been stopped, perhaps we glimpse one reason why, as George Santayana famously put it, “those who cannot remember the past are condemned to repeat it.”

* * *

* * *

Related Situationist posts:

One series of posts was devoted to the situational sources of war.

To review a larger sample of posts on the causes and consequences of human conflict, click here.

Posted in Conflict, History, Ideology, Morality, Politics, Social Psychology, System Legitimacy, Video

RADIOLAB on the Situation of Badness

Posted by The Situationist Staff on January 19, 2012

From RADIOLAB:

Cruelty, violence, badness . . . In this episode of Radiolab, we wrestle with the dark side of human nature and ask whether it’s something we can ever really understand, or fully escape.

We begin with a chilling statistic: 91% of men, and 84% of women, have fantasized about killing someone. We take a look at one particular fantasy lurking behind these numbers, and wonder what this shadow world might tell us about ourselves and our neighbors. Then, we reconsider what Stanley Milgram’s famous experiment really revealed about human nature (it’s both better and worse than we thought). Next, we meet a man who scrambles our notions of good and evil: chemist Fritz Haber, who won a Nobel Prize in 1918 — around the same time officials in the US were calling him a war criminal. And we end with the story of a man who chased one of the most prolific serial killers in US history, then got a chance to ask him the question that had haunted him for years: why?

Go to the RADIOLAB website to listen to the podcast.

Related Situationist posts:

Posted in Classic Experiments, Conflict, History, Morality, Podcasts, Social Psychology

Zimbardo’s Stanford Prison and Kingsfield’s Harvard Law

Posted by The Situationist Staff on October 31, 2011

Last week, Phil Zimbardo delivered another remarkable lecture at Harvard Law School — this time tracing his journey from studying evil to inspiring heroism.  We hope to post that video in several weeks.  For his introduction, Situationist Editor Jon Hanson assembled this short video comparing Professor Zimbardo’s Prison Experiment and Professor Kingsfield’s Harvard Law School (The Paper Chase), both of which reached their 40th anniversary this year.

Related Situationist posts:

 

Posted in Classic Experiments, Education, Events, Situationist Contributors, Social Psychology, Video

The Milgram Experiment Yet Again (Again!)

Posted by The Situationist Staff on October 29, 2011

The Discovery Channel’s CURIOSITY asks “How Evil Are You?” and replicates the Milgram experiment on Sunday, October 30, 2011, at 9 PM e/p.

Posted in Classic Experiments, Social Psychology, Video

Evil No! Heroes Yes!! (Zimbardo returns to Harvard Law)

Posted by The Situationist Staff on October 23, 2011



Open to the public.

Related Situationist posts:

Posted in Events, Situationist Contributors, Social Psychology

The Science of Evil

Posted by Adam Benforado on August 27, 2011

Following up on my review of Jon Ronson’s The Psychopath Test, I just finished reading the other new offering in the world of “psychopath studies”: Simon Baron-Cohen’s The Science of Evil: On Empathy and the Origins of Cruelty.

Baron-Cohen’s central theory is that evil is critically tied to a lack of empathy.  It’s a thought-provoking notion, and I was very intrigued by the connections that he made among various “empathy deficient” conditions, from psychopaths to narcissists to borderlines to those on the autism spectrum.

At points, I think he gets so carried away considering the particular dispositions of his “zero negatives” (those, like psychopaths, whose lack of empathy brings about “unequivocally bad” results) and “zero positives” (those, like Asperger’s sufferers, whose lack of empathy is not inherently harmful) that he misses the power of our situations to inform “evil” behavior.  Indeed, at these moments Baron-Cohen would have done well to pan out and emphasize that many of us (even those of us testing high on the Empathy Quotient questionnaire in the book’s Appendix) can be influenced to be less empathetic, with disastrous results.

These criticisms aside, and though I wasn’t totally convinced by his argument, it’s an interesting book and worth a read.  I found myself continuing to ponder Baron-Cohen’s insights long after I’d set the book back on the shelf.

One of these musings, I’ll share in my next post . . .

Posted in Book, Morality

Jonestown (The Situation of Evil) Revisited

Posted by Philip Zimbardo on November 17, 2008

With the 30th anniversary of the Jonestown mass suicide upon us, now is a good time to republish the three-part Situationist series from 2007 on the “Situational Sources of Evil” — published also in the January/February 2007 edition of the Yale Alumni Magazine and based on my book, The Lucifer Effect: Understanding How Good People Turn Evil (Random House, March 2007).

* * *

Imagine that you have responded to an advertisement in the New Haven newspaper seeking subjects for a study of memory. A researcher whose serious demeanor and laboratory coat convey scientific importance greets you and another applicant on your arrival at a Yale laboratory in Linsly-Chittenden Hall. You are here to help science find ways to improve people’s learning and memory through the use of punishment. The researcher tells you why this work may have important consequences. The task is straightforward: one of you will be the “teacher” who gives the “learner” a set of word pairings to memorize. During the test, the teacher will give each key word, and the learner must respond with the correct association. When the learner is right, the teacher gives a verbal reward, such as “Good” or “That’s right.” When the learner is wrong, the teacher is to press a lever on an impressive-looking apparatus that delivers an immediate shock to punish the error.

The shock generator has 30 switches, starting from a low level of 15 volts and increasing by 15 volts to each higher level. The experimenter tells you that every time the learner makes a mistake, you have to press the next switch. The control panel shows both the voltage of each switch and a description. The tenth level (150 volts) is “Strong Shock”; the 17th level (255 volts) is “Intense Shock”; the 25th level (375 volts) is “Danger, Severe Shock.” At the 29th and 30th levels (435 and 450 volts) the control panel is marked simply with an ominous XXX: the pornography of ultimate pain and power.

You and another volunteer draw straws to see who will play each role; you are to be the teacher, and the other volunteer will be the learner. He is a mild-mannered, middle-aged man whom you help escort to the next chamber. “Okay, now we are going to set up the learner so he can get some punishment,” the experimenter tells you both. The learner’s arms are strapped down and an electrode is attached to his right wrist. The generator in the next room will deliver the shocks. The two of you communicate over an intercom, with the experimenter standing next to you. You get a sample shock of 45 volts — the third level, a slight tingly pain — so you have a sense of what the shock levels mean. The researcher then signals you to start.

Initially, your pupil does well, but soon he begins making errors, and you start pressing the shock switches. He complains that the shocks are starting to hurt. You look at the experimenter, who nods to continue. As the shock levels increase in intensity, so do the learner’s protests; he says he does not think he wants to continue. You hesitate and question whether you should go on. But the experimenter insists that you have no choice.

In 1949, seated next to me in senior class at James Monroe High School in the Bronx, New York, was my classmate Stanley Milgram. We were both skinny kids, full of ambition and a desire to make something of ourselves, so that we might escape life in the confines of our ghetto experience. Stanley was the little smart one whom we went to for authoritative answers. I was the tall popular one, the smiling guy other kids would go to for social advice.

I had just returned to Monroe High from a horrible year at North Hollywood High School, where I had been shunned and friendless (because, as I later learned, there was a rumor circulating that I was from a New York Sicilian Mafia family). Back at Monroe, I would be chosen “Jimmy Monroe” — most popular boy in Monroe High School’s senior class. Stanley and I once discussed how that transformation could happen. We agreed that I had not changed; the situation was what mattered.

Situational psychology is the study of the human response to features of our social environment, the external behavioral context, above all to the other people around us. Stanley Milgram and I, budding situationists in 1949, both went on to become academic social psychologists. We met again at Yale in 1960 as beginning assistant professors — he starting out at Yale, I at NYU. Some of Milgram’s new research was conducted in a modified laboratory that I had fabricated a few years earlier as a graduate student — in the basement of Linsly-Chittenden, the building where we taught Introductory Psychology courses. That is where Milgram was to conduct his classic and controversial experiments on blind obedience to authority.

Milgram’s interest in the problem of obedience came from deep personal concerns about how readily the Nazis had obediently killed Jews during the Holocaust. His laboratory paradigm, he wrote years later, “gave scientific expression to a more general concern about authority, a concern forced upon members of my generation, in particular upon Jews such as myself, by the atrocities of World War II.”

As Milgram described it, he hit upon the concept for his experiment while musing about a study in which one of his professors, Solomon Asch, had tested how far subjects would conform to the judgment of a group. Asch had put each subject in a group of coached confederates and asked every member, one by one, to compare a set of lines in order of length. When the confederates all started giving the same obviously false answers, 70 percent of the subjects agreed with them at least some of the time.

Milgram wondered whether there was a way to craft a conformity experiment that would be “more humanly significant” than judgments about line length. He wrote later: “I wondered whether groups could pressure a person into performing an act whose human import was more readily apparent; perhaps behaving aggressively toward another person, say by administering increasingly severe shocks to him. But to study the group effect . . . you’d have to know how the subject performed without any group pressure. At that instant, my thought shifted, zeroing in on this experimental control. Just how far would a person go under the experimenter’s orders?”

How far up the scale do you predict that you would go under those orders? Put yourself back in the basement with the fake shock apparatus and the other “volunteer” — actually the experimenter’s confederate, who always plays the learner because the “drawing” is rigged — strapped down in the next room. As the shocks proceed, the learner begins complaining about his heart condition. You dissent, but the experimenter still insists that you continue. The learner makes errors galore. You plead with your pupil to concentrate; you don’t want to hurt him. But your concerns and motivational messages are to no avail. He gets the answers wrong again and again. As the shocks intensify, he shouts out, “I can’t stand the pain, let me out of here!” Then he says to the experimenter, “You have no right to keep me here!” Another level up, he screams, “I absolutely refuse to answer any more! You can’t hold me here! My heart’s bothering me!”

Obviously you want nothing more to do with this experiment. You tell the experimenter that you refuse to continue. You are not the kind of person who harms other people in this way. You want out. But the experimenter continues to insist that you go on. He reminds you of the contract, of your agreement to participate fully. Moreover, he claims responsibility for the consequences of your shocking actions. After you press the 300-volt switch, you read the next keyword, but the learner doesn’t answer. “He’s not responding,” you tell the experimenter. You want him to go into the other room and check on the learner to see if he is all right. The experimenter is impassive; he is not going to check on the learner. Instead he tells you, “If the learner doesn’t answer in a reasonable time, about five seconds, consider it wrong,” since errors of omission must be punished in the same way as errors of commission — that is a rule.

As you continue up to even more dangerous shock levels, there is no sound coming from your pupil’s shock chamber. He may be unconscious or worse. You are truly disturbed and want to quit, but nothing you say works to get your exit from this unexpectedly distressing situation. You are told to follow the rules and keep posing the test items and shocking the errors.

Now try to imagine fully what your participation as the teacher would be. If you actually go all the way to the last of the shock levels, the experimenter will insist that you repeat that XXX switch two more times. I am sure you are saying, “No way would I ever go all the way!” Obviously, you would have dissented, then disobeyed and just walked out. You would never sell out your morality. Right?

Milgram once described his shock experiment to a group of 40 psychiatrists and asked them to estimate the percentage of American citizens who would go to each of the 30 levels in the experiment. On average, they predicted that less than 1 percent would go all the way to the end, that only sadists would engage in such sadistic behavior, and that most people would drop out at the tenth level of 150 volts. They could not have been more wrong.

In Milgram’s experiment, two of every three (65 percent) of the volunteers went all the way up to the maximum shock level of 450 volts. The vast majority of people shocked the victim over and over again despite his increasingly desperate pleas to stop. Most participants dissented from time to time and said they did not want to go on, but the researcher would prod them to continue.

Over the course of a year, Milgram carried out 19 different experiments, each one a different variation of the basic paradigm. In each of these studies he varied one social psychological variable and observed its impact. In one study, he added women; in others he varied the physical proximity or remoteness of either the experimenter-teacher link or the teacher-learner link; had peers rebel or obey before the teacher had the chance to begin; and more.

In one set of experiments, Milgram wanted to show that his results were not due to the authority power of Yale University. So he transplanted his laboratory to a run-down office building in downtown Bridgeport, Connecticut, and repeated the experiment as a project ostensibly of a private research firm with no connection to Yale. It made hardly any difference; the participants fell under the same spell of this situational power.

The data clearly revealed the extreme pliability of human nature: depending on the situation, almost everyone could be totally obedient or almost everyone could resist authority pressures. Milgram was able to demonstrate that compliance rates could soar to over 90 percent of people continuing to the 450-volt maximum or be reduced to less than 10 percent — by introducing just one crucial variable into the compliance recipe.

Want maximum obedience? Make the subject a member of a “teaching team,” in which the job of pulling the shock lever to punish the victim is given to another person (a confederate), while the subject assists with other parts of the procedure. Want resistance to authority pressures? Provide social models — peers who rebel. Participants also refused to deliver the shocks if the learner said he wanted to be shocked; that’s masochistic, and they are not sadists. They were also reluctant to give high levels of shock when the experimenter filled in as the learner. They were more likely to shock when the learner was remote than in proximity.

In each of the other variations on this diverse range of ordinary American citizens, of widely varying ages and occupations and of both genders, it was possible to elicit low, medium, or high levels of compliant obedience with a flick of the situational switch. Milgram’s large sample — a thousand ordinary citizens from varied backgrounds — makes the results of his obedience studies among the most generalizable in all the social sciences. His classic study has been replicated and extended by many other researchers in many countries.

Recently, Thomas Blass of the University of Maryland-Baltimore County [author of The Man Who Shocked The World and creator of the terrific website StanleyMilgram.Com] analyzed the rates of obedience in eight studies conducted in the United States and nine replications in European, African, and Asian countries. He found comparably high levels of compliance in all. The 61 percent mean obedience rate found in the U.S. was matched by the 66 percent rate found across all the other national samples. The degree of obedience was not affected by the timing of the studies, which ranged from 1963 to 1985.

Other studies based on Milgram’s have shown how powerful the obedience effect can be when legitimate authorities exercise their power within their power domains. In one study, most college students administered shocks to whimpering puppies when required to do so by a professor. In another, all but one of 22 nurses flouted their hospital’s procedure by obeying a phone order from an unknown doctor to administer an excessive amount of a drug (actually a placebo); that solitary disobedient nurse should have been given a raise and a hero’s medal. In still another, a group of 20 high school students joined a history teacher’s supposed authoritarian political movement, and within a week had expelled their fellows from class and recruited nearly 200 others from around the school to the cause.

Now we ask the question that must be posed of all such research: what is its external validity, what are real-world parallels to the laboratory demonstration of authority power?

In 1963, the social philosopher Hannah Arendt published what was to become a classic of our times, Eichmann in Jerusalem: A Report on the Banality of Evil. She provides a detailed analysis of the war crimes trial of Adolf Eichmann, the Nazi figure who personally arranged for the murder of millions of Jews. Eichmann’s defense of his actions was similar to the testimony of other Nazi leaders: “I was only following orders.” What is most striking in Arendt’s account of Eichmann is all the ways in which he seemed absolutely ordinary: half a dozen psychiatrists had certified him as “normal.” Arendt’s famous conclusion: “The trouble with Eichmann was precisely that so many were like him, and that the many were neither perverted nor sadistic, that they were, and still are, terribly and terrifyingly normal.”

Arendt’s phrase “the banality of evil” continues to resonate because genocide has been unleashed around the world and torture and terrorism continue to be common features of our global landscape. A few years ago, the sociologist and Brazil expert Martha Huggins, the Greek psychologist and torture expert Mika Haritos-Fatouros, and I interviewed several dozen torturers. These men did their daily dirty deeds for years in Brazil as policemen, sanctioned by the government to get confessions by torturing “subversive” enemies of the state.

The systematic torture by men of their fellow men and women represents one of the darkest sides of human nature. Surely, my colleagues and I reasoned, here was a place where dispositional evil would be manifest. The torturers shared a common enemy: men, women, and children who, though citizens of their state, even neighbors, were declared by “the System” to be threats to the country’s national security — as socialists and Communists. Some had to be eliminated efficiently, while others, who might hold secret information, had to be made to yield it up by torture, confess and then be killed.

Torture always involves a personal relationship; it is essential for the torturer to understand what kind of torture to employ, what intensity of torture to use on a certain person at a certain time. Wrong kind or too little — no confession. Too much — the victim dies before confessing. In either case, the torturer fails to deliver the goods and incurs the wrath of the senior officers. Learning to determine the right kind and degree of torture that yields up the desired information elicits abounding rewards and flowing praise from one’s superiors. It took time and emerging insights into human weaknesses for these torturers to become adept at their craft.

What kind of men could do such deeds? Did they need to rely on sadistic impulses and a history of sociopathic life experiences to rip and tear the flesh of fellow beings day in and day out for years on end?


We found that sadists are selected out of the training process by trainers because they are not controllable. They get off on the pleasure of inflicting pain, and thus do not sustain the focus on the goal of extracting confessions. From all the evidence we could muster, torturers were not unusual or deviant in any way prior to practicing their new roles, nor were there any persisting deviant tendencies or pathologies among any of them in the years following their work as torturers and executioners. Their transformation was entirely explainable as being the consequence of a number of situational and systemic factors, such as the training they were given to play this new role; their group camaraderie; acceptance of the national security ideology; and their learned belief in socialists and Communists as enemies of their state.


Amazingly, the transformation of these men into violence workers is comparable to the transformation of young Palestinians into suicide bombers intent on killing innocent Israeli civilians. In a recent study, the forensic psychiatrist Marc Sageman [at the Solomon Asch Center] found evidence of the normalcy of 400 al-Qaeda members. Three-quarters came from the upper or middle class. Ninety percent came from caring, intact families. Two-thirds had gone to college; two-thirds were married; and most had children and jobs in science and engineering. In many ways, Sageman concludes, “these are the best and brightest of their society.”

Israeli psychologist Ariel Merari, who has studied this phenomenon extensively for many years, outlines the common steps on the path to these explosive deaths. First, senior members of an extremist group identify young people who, based on their declarations at a public rally against Israel or their support of some Islamic cause or Palestinian action, appear to have an intense patriotic fervor. Next, they are invited to discuss how seriously they love their country and hate Israel. They are asked to commit to being trained. Those who do then become part of a small secret cell of three to five youths. From their elders, they learn bomb making, disguise, and selecting and timing targets. Finally, they make public their private commitment by making a videotape, declaring themselves to be “the living martyr” for Islam. The recruits are also told the Big Lie: their relatives will be entitled to a high place in Heaven, and they themselves will earn a place beside Allah. Of course, the rhetoric of dehumanization serves to deny the humanity and innocence of their victims.

The die is cast; their minds have been carefully prepared to do what is ordinarily unthinkable. In these systematic ways a host of normal, angry young men and women become transformed into true believers. The suicide, the murder, of any young person is a gash in the fabric of the human family that we elders from every nation must unite to prevent. To encourage the sacrifice of youth for the sake of advancing the ideologies of the old must be considered a form of evil that transcends local politics and expedient strategies.

Our final extension of the social psychology of evil from artificial laboratory experiments to real-world contexts comes to us from the jungles of Guyana. There, on November 28, 1978, an American religious leader persuaded more than 900 of his followers to commit mass suicide. In the ultimate test of blind obedience to authority, many of them killed their children on his command.

Jim Jones, the pastor of Peoples Temple congregations in San Francisco and Los Angeles, had set out to create a socialist utopia in Guyana. But over time Jones was transformed from the caring, spiritual “father” of a large Protestant congregation into an Angel of Death. He instituted extended forced labor, armed guards, semistarvation diets, and daily punishments amounting to torture for the slightest breach of any of his many rules. Concerned relatives convinced a congressman and media crew to inspect the compound. But Jones arranged for them to be murdered as they left. He then gathered almost all the members at the compound and gave a long speech in which he exhorted them to take their lives by drinking cyanide-laced Kool-Aid.

Jones was surely an egomaniac; he had all of his speeches and proclamations, even his torture sessions, tape-recorded — including his final suicide harangue. In it Jones distorts, lies, pleads, makes false analogies, appeals to ideology and to transcendent future life, and outright insists that his orders be followed, all while his staff is efficiently distributing deadly poison to the hundreds gathered around him. Some excerpts from that last hour convey a sense of the death-dealing tactics he used to induce total obedience to an authority gone mad:

Please get us some medication. It’s simple. It’s simple. There’s no convulsions with it. [Of course there are, especially for the children.] . . . Don’t be afraid to die. You’ll see, there’ll be a few people land[ing] out here. They’ll torture some of our children here. They’ll torture our people. They’ll torture our seniors. We cannot have this. . . . Please, can we hasten? Can we hasten with that medication? . . . We’ve lived — we’ve lived as no other people lived and loved. We’ve had as much of this world as you’re gonna get. Let’s just be done with it. (Applause.). . . . Who wants to go with their child has a right to go with their child. I think it’s humane. . . . Lay down your life with dignity. Don’t lay down with tears and agony. There’s nothing to death. . . . It’s just stepping over to another plane. Don’t be this way. Stop this hysterics. . . . Look, children, it’s just something to put you to rest. Oh, God. (Children crying.). . . . Mother, Mother, Mother, Mother, Mother, please. Mother, please, please, please. Don’t — don’t do this. Don’t do this. Lay down your life with your child.

And they did, and they died for “Dad.”

A fitting conclusion comes from psychologist Mahzarin Banaji: “What social psychology has given to an understanding of human nature is the discovery that forces larger than ourselves determine our mental life and our actions — chief among these forces [is] the power of the social situation.”

The most dramatic instances of directed behavior change and “mind control” are not the consequence of exotic forms of influence such as hypnosis, psychotropic drugs, or “brainwashing.” They are, rather, the systematic manipulation of the most mundane aspects of human nature over time in confining settings. Motives and needs that ordinarily serve us well can lead us astray when they are aroused, amplified, or manipulated by situational forces that we fail to recognize as potent. This is why evil is so pervasive. Its temptation is just a small turn away, a slight detour on the path of life, a blur in our sideview mirror, leading to disaster.

Milgram crafted his research paradigm to find out what strategies can seduce ordinary citizens to engage in apparently harmful behavior. Many of these methods have parallels to compliance strategies used by “influence professionals” in real-world settings, such as salespeople, cult and military recruiters, media advertisers, and others. Below are ten of the most effective.

1. Prearranging some form of contractual obligation, verbal or written, to control the individual’s behavior in pseudo-legal fashion. In Milgram’s obedience study, subjects publicly agreed to accept the tasks and the procedures.

2. Giving participants meaningful roles to play — “teacher,” “learner” — that carry with them previously learned positive values and automatically activate response scripts.

3. Presenting basic rules to be followed that seem to make sense before their actual use but can then be used arbitrarily and impersonally to justify mindless compliance. The authorities will change the rules as necessary but will insist that rules are rules and must be followed (as the researcher in the lab coat did in Milgram’s experiment).

4. Altering the semantics of the act, the actor, and the action — replacing unpleasant reality with desirable rhetoric, gilding the frame so that the real picture is disguised: from “hurting victims” to “helping the experimenter.” We can see the same semantic framing at work in advertising, where, for example, bad-tasting mouthwash is framed as good for you because it kills germs and tastes like medicine.

5. Creating opportunities for the diffusion of responsibility or abdication of responsibility for negative outcomes, such that the one who acts won’t be held liable. In Milgram’s experiment, the authority figure, when questioned by a teacher, said he would take responsibility for anything that happened to the learner.

6. Starting the path toward the ultimate evil act with a small, seemingly insignificant first step, the easy “foot in the door” that swings open subsequent greater compliance pressures. In the obedience study, the initial shock was only a mild 15 volts. This is also the operative principle in turning good kids into drug addicts with that first little hit or sniff.

7. Having successively increasing steps on the pathway that are gradual, so that each is hardly noticeably different from one’s most recent prior action. “Just a little bit more.”

8. Gradually changing the nature of the authority figure from initially “just” and reasonable to “unjust” and demanding, even irrational. This tactic elicits initial compliance and later confusion, since we expect consistency from authorities and friends. Not acknowledging that this transformation has occurred leads to mindless obedience. It is part of many date rape scenarios and a reason why abused women stay with their abusing spouses.

9. Making the exit costs high and making the process of exiting difficult; allowing verbal dissent, which makes people feel better about themselves, while insisting on behavioral compliance.

10. Offering a “big lie” to justify the use of any means to achieve the seemingly desirable, essential goal. (In Milgram’s research the justification was that science will help people improve their memory by judicious use of reward and punishment.) In social psychology experiments, this is known as the “cover story”; it is a cover-up for the procedures that follow, which do not make sense on their own. The real-world equivalent is an ideology. Most nations rely on an ideology, typically “threats to national security,” before going to war or suppressing political opposition. When citizens fear that their national security is being threatened, they become willing to surrender their basic freedoms in exchange. Erich Fromm’s classic analysis in Escape from Freedom made us aware of this trade-off, which Hitler and other dictators have long used to gain and maintain power.

Procedures like these are used when those in authority know that few would engage in the endgame without being prepared psychologically to do the unthinkable. But people who understand their own impulses to join with a group and to obey an authority may be able also to withstand those impulses at times when the mandate from outside comes into conflict with their own values and conscience. In the future, when you are in a compromising position where your compliance is at issue, thinking back to these ten stepping-stones to mindless obedience may enable you to step back and not go all the way down the path — their path. A good way to avoid crimes of obedience is to assert one’s personal authority and to always take full responsibility for one’s actions. Resist going on automatic pilot, be mindful of situational demands on you, engage your critical thinking skills, and be ready to admit an error in your initial compliance and to say, “Hell, no, I won’t go your way.”

* * *

Below you can find several excellent videos of the Jonestown massacre and the circumstances leading up to it.

From PBS, here is a (fuzzy) 84-minute video “Jonestown: The Life And Death Of Peoples Temple.”

Here is a briefer but clearer 45-minute, video “Jonestown: The Final Report.”
