The Situationist

Archive for March 3rd, 2007

Investing in Vice

Posted by The Situationist Staff on March 3, 2007

Today, Phil Zimbardo posted here on ten effective methods for leading a person to “engage in apparently harmful behavior” — strategies that he said “have parallels to compliance strategies used by ‘influence professionals’ in real-world settings, such as salespeople, cult and military recruiters, media advertisers, and others.” Also today, NPR’s All Things Considered ran a story in which Debbie Elliott interviewed the Vice Fund’s portfolio manager, Charles Norton.

The coincidence of those events led The Situationist Staff to wonder: Could Zimbardo’s list of strategies help explain how Mr. Norton and others justify investment strategies that would otherwise seem immoral? We leave that to our readers to decide.

According to Norton, although the Vice Fund concentrates on “the alcoholic beverages sector, tobacco, gaming, and aerospace and defense,” “[w]e’re not advocating these activities.” In fact, as Mr. Norton explained, “I don’t smoke, I drink only on occasion, [and] I rarely gamble.”

So why invest in just those products and not others?

There are . . . five common threads that tie investments in these sectors together. One is there’s unvarying demand for their goods and services regardless of economic activity. They are global in nature – you know, people smoke and drink and gamble all over the world. They’re extremely profitable; there are high barriers to entry in these businesses. In the tobacco industry, advertising for cigarettes is illegal in most markets, so there is a powerful advantage to brands that have been around . . . . And one of the most important things that we like about these is that the government is a large beneficiary, particularly in gaming and tobacco. . . . The government has a financial incentive to make sure that these industries flourish.

Ms. Elliott returned several times to the question of why Mr. Norton felt no compunction regarding the harm caused by the products he was helping to finance. His responses included the following:

“The fact of the matter is that whether a company is selling a sneaker, a hamburger, or any other good, if it’s legally manufactured and sold, my job is just to analyze it – just to wear my analyst hat and look at the fundamentals.”

“[A]ll we try to do is what is in the best interests of . . . [our] shareholders.”

“When you’re a serious investor, you have to check your emotions at the door. Emotions are the enemy when it comes to making sound investment decisions. So, we don’t come at this with any personal biases. We come at this as a purely objective analyst. And, in our perspective, those types of judgments have no place in the investment process.”

Asked if there was anything that he would not invest in, Norton found no place for scruples in his work. “I wouldn’t invest in companies that don’t have strong fundamentals. . . . It’ll be based on the financials and not my moral judgment . . . .”

Asked one more time how he managed to set moral concerns so completely aside, Norton responded: “That’s what I’m trained to do. I’m a professional money manager. I have to remain objective at all times, because emotions will interfere with making a smart investment decision.”

For the last three years, the Vice Fund has earned an average return of roughly 19%, compared with the 10% return of the S&P 500. Although the Vice Fund does not advocate “vices,” the following images can be found on its website’s home page.

[Images from the Vice Fund’s home page]

Posted in Entertainment, Social Psychology | 1 Comment »

Situational Sources of Evil – Part III

Posted by Philip Zimbardo on March 3, 2007

This is the third of a multi-part series of blog postings on the situational sources of evil. Parts of the series, including this post, are taken from an article in the most recent issue of the Yale Alumni Magazine, which was adapted from my forthcoming book, The Lucifer Effect: Understanding How Good People Turn Evil (Random House, March 2007).

My first post summarized Stanley Milgram’s famous obedience experiments and some of the related, more recent studies they inspired. The second post summarized real-world parallels to Milgram’s findings. This post describes ten lessons from the Milgram studies.

* * *

Milgram crafted his research paradigm to find out what strategies can seduce ordinary citizens to engage in apparently harmful behavior. Many of these methods have parallels to compliance strategies used by “influence professionals” in real-world settings, such as salespeople, cult and military recruiters, media advertisers, and others. Below are ten of the most effective.

1. Prearranging some form of contractual obligation, verbal or written, to control the individual’s behavior in pseudo-legal fashion. In Milgram’s obedience study, subjects publicly agreed to accept the tasks and the procedures.

2. Giving participants meaningful roles to play — “teacher,” “learner” — that carry with them previously learned positive values and automatically activate response scripts.

3. Presenting basic rules to be followed that seem to make sense before their actual use but can then be used arbitrarily and impersonally to justify mindless compliance. The authorities will change the rules as necessary but will insist that rules are rules and must be followed (as the researcher in the lab coat did in Milgram’s experiment).

4. Altering the semantics of the act, the actor, and the action — replacing unpleasant reality with desirable rhetoric, gilding the frame so that the real picture is disguised: from “hurting victims” to “helping the experimenter.” We can see the same semantic framing at work in advertising, where, for example, bad-tasting mouthwash is framed as good for you because it kills germs and tastes like medicine.

5. Creating opportunities for the diffusion of responsibility or abdication of responsibility for negative outcomes, such that the one who acts won’t be held liable. In Milgram’s experiment, the authority figure, when questioned by a teacher, said he would take responsibility for anything that happened to the learner.

6. Starting the path toward the ultimate evil act with a small, seemingly insignificant first step, the easy “foot in the door” that swings open subsequent greater compliance pressures. In the obedience study, the initial shock was only a mild 15 volts. This is also the operative principle in turning good kids into drug addicts with that first little hit or sniff.

7. Having successively increasing steps on the pathway that are gradual, so that they are hardly noticeably different from one’s most recent prior action. “Just a little bit more.”

8. Gradually changing the nature of the authority figure from initially “just” and reasonable to “unjust” and demanding, even irrational. This tactic elicits initial compliance and later confusion, since we expect consistency from authorities and friends. Not acknowledging that this transformation has occurred leads to mindless obedience. And it is part of many date rape scenarios and a reason why abused women stay with their abusing spouses.

9. Making the exit costs high and making the process of exiting difficult; allowing verbal dissent, which makes people feel better about themselves, while insisting on behavioral compliance.

10. Offering a “big lie” to justify the use of any means to achieve the seemingly desirable, essential goal. (In Milgram’s research the justification was that science will help people improve their memory by judicious use of reward and punishment.) In social psychology experiments, this is known as the “cover story”; it is a cover-up for the procedures that follow, which do not make sense on their own. The real-world equivalent is an ideology. Most nations rely on an ideology, typically “threats to national security,” before going to war or suppressing political opposition. When citizens fear that their national security is being threatened, they become willing to surrender their basic freedoms in exchange. Erich Fromm’s classic analysis in Escape from Freedom made us aware of this trade-off, which Hitler and other dictators have long used to gain and maintain power.

Procedures like these are used when those in authority know that few would engage in the endgame without being prepared psychologically to do the unthinkable. But people who understand their own impulses to join with a group and to obey an authority may also be able to withstand those impulses at times when the mandate from outside comes into conflict with their own values and conscience. In the future, when you are in a compromising position where your compliance is at issue, thinking back to these ten stepping-stones to mindless obedience may enable you to step back and not go all the way down the path — their path. A good way to avoid crimes of obedience is to assert one’s personal authority and always take full responsibility for one’s actions. Resist going on automatic pilot, be mindful of situational demands on you, engage your critical thinking skills, and be ready to admit an error in your initial compliance and to say, “Hell, no, I won’t go your way.”

*****See also Part I and Part II of this Series.*****

Posted in History, Social Psychology | 11 Comments »

 