Some (Interior) Situational Sources War – Part IV
Posted by The Situationist Staff on June 8, 2007
This series is devoted to highlighting some of the psychological tendencies that encourage individuals and groups to enter conflicts and wars that they later regret. Part I and Part II of the series included portions of an article co-authored by Daniel Kahneman and Jonathan Renshon, titled “Why Hawks Win.” Part III reproduced an op-ed written by Situationist friend Dan Gilbert on July 24, 2006. This Part includes the first half of an essay written by Situationist Contributor Jon Hanson within the week following 9/11. The essay has not previously been published — and should be read with its context in mind.
* * *
Count to Twelve: Some Lessons from Social Cognition Theory – Part A
by Jon Hanson (written several days after 9/11/01)
Now we will count to twelve
and we will all keep still.
For once on the face of the earth,
let’s not speak in any language;
let’s stop for a second,
and not move our arms so much.
~ Pablo Neruda
At times, we all seem to recognize that violent acts borne of retaliatory fervor lead to significant regret. And those times, unfortunately, are when we least need the wisdom of the truism. When we are not ourselves feeling the pull of outrage or the pressure of hate’s grip, we dispassionately call for peace and restraint in other countries, whose violence has ebbed and flowed for generations, and we shake our heads in disbelief that those involved don’t see how fruitless each side’s insatiable hunger is to settle the score. We remember with shame the lynch-mob “justice” that stains this country’s history. We are moved by even the dramatized cycles that killed Shakespeare’s Romeo and Juliet and, more recently, Sondheim’s Shark and Jet. We look with pride upon our legal system that is, albeit imperfectly, designed to discourage retaliatory self-help and to offer those who might otherwise be victims of vicious scapegoating some protective shelter. We nod with knowing affirmation when we read the wisdom of our wisest leaders, such as Gandhi’s, “An eye for an eye makes the whole world blind.” And we purport to embrace religious teachings that seem to call us toward peace and peacemaking — even in the face of aggression.
There are several reasons why that basic message is so central and so commonly repeated and reinforced by our leaders and institutions (both cultural and political). First, people who don’t abide by it usually regret that they don’t. Thus, it is an extremely valuable message, one that will save people much pain, if only they can follow it. Second, because of the cyclical and social nature of vengeful violence, the regret often extends beyond the one individual who initially retaliates, giving social institutions a stake in our personal decisions. And, third, the message is far easier to deliver than it is to heed — easier said than done, easier preached than practiced. Institutions, narratives, and authorities repeatedly call on us to do the hard thing in an effort to help bolster whatever flimsy resolve individuals can muster on their own. On why those things are true, “social cognition theory,” the burgeoning field of psychology that examines how we understand ourselves and others, has much to teach us.
Think of the physical response associated with, say, being stung by a bee. There is an urgent desire to end the suddenly inflicted pain, and an unexamined and unconscious assumption that if the little bastard who caused it can be discovered and killed, then the pain can be significantly lessened. The greater the pain is, the greater is the urgency of the search-and-destroy mission and the greater the willingness to strike out wildly with flailing arms in an effort to stop the stinging. That response is automatic, not reasoned. And very few of us can will ourselves to be still and maintain our concentration when being stung by a bee. Our natural response is adaptive; it generally is good to jump and swat and thereby attempt to end intense pain quickly. Nonetheless, its reflexive nature means that it can often cause more pain than it eases. So it is, for example, that being stung by a bee while driving a car is extremely dangerous — not just for the bee. Steering and flailing don’t mix.
Such a topical sting is in some ways analogous to the deeper emotional (and often physical) sting that an individual feels following a tragedy in which the culprit has acted maliciously. And, again, our reflexive reaction to strike out against the harmdoer can often hinder our ability to make good decisions for ourselves. The problem with the analogy is that, with the emotional sting, our ability to see the harmful effects of our flailing arms is seriously compromised. It is as if we lose sight of the hazards on the road and perceive violent retaliation as the most logical and appropriate course. How does that happen?
First, the sting leaves us with what social cognition theorists call a “directional goal” — which, loosely, means that we access beliefs, rules, and evidence that permit us to “reason” to our preferred conclusion. Such “motivated reasoning,” as it is sometimes called, is rampant in human decision making and leads to a host of well-documented cognitive distortions, including optimism bias, confirmatory bias, and perseverance bias. Psychologists studying the confirmatory bias have shown that people often interpret evidence as providing support for their preferred conclusion, even if the evidence provided is ambiguous or, indeed, even if it objectively supports the opposite conclusion.
In a classic experiment, students who supported, and students who opposed, the death penalty were asked to read the same stack of articles — containing arguments and evidence both for and against capital punishment. Strikingly, whatever a student’s initial position (for or against), that position was strengthened by the articles. In other words, the students saw what they wanted to see in the evidence. Although most of us tend to be quick to identify such biases in other people’s thinking, we rarely identify them in our own. Probably the clearest recent example of the phenomenon on a national scale occurred during the Bush-Gore debates of last year. Polls following those debates showed, time and again, that individuals who favored Gore (Bush) going into the debate were very likely to believe that Gore (Bush) was more effective during the debate. And, of course, legal scholars and commentators have suggested that the same sort of phenomenon explained the Supreme Court’s notorious 5-4 split in the decision that ultimately decided that election.
Although the distorting effect of directional goals can be found virtually anywhere there is human reasoning, it appears to be especially acute when that reasoning takes place within a certain mindset. Loosely speaking, the “deliberative” mindset, which people adopt when assessing their various options, strategies, or approaches, leads people to obtain and process information in a relatively thorough and neutral manner. The “implemental” mindset, which people adopt once they have decided on a general strategy, leads people to pick and choose among the information in a way that suits their goal and to gain a false sense of optimism that their goal can be achieved. In other words, the choice to implement a strategy or take a particular approach to a problem creates a strong motive in the individual and has a significant effect on that person’s reasoning. And the evidence reveals that individuals adopting an implemental mindset more seriously exaggerate the extent to which they can control uncontrollable events and more seriously underestimate their own vulnerability to risks.
Of course, there are limits to how far a directional goal can influence an individual’s reasoning process — we are, in the words of one social cognition theorist, “unreasonable within reason.” But it is also true that the greater a person’s motivation (or the more intense the sting) and the more clearly that people are acting within an implemental mindset, the more willing they are to engage in self-serving judgments or to act with nearly blind disregard for evidence that contradicts their preferred view. Indeed, at times motivational factors are so strong that we act in conscious disregard of our own preferences — against our own will.
The basic lesson of that sort of evidence is simple: we are not as rational, logical, and reasoned as we think we are. And that problem, and our self-deceit about it, are particularly acute when we have a powerful, visceral, affective urge to reach a certain conclusion and to act on it.
Those insights have powerful implications for understanding how we are thinking about and responding to last week’s terrorist acts. Our rage and our desire to retaliate — to “smoke out” the perpetrators, capture them, and bring them back, “dead or alive” — all contribute to our inability to make good decisions. We should be very worried that instead of acting according to reason, we are simply generating reasons to justify our preferred actions.
The sting of vengeful anger also weakens what social cognition theorists call “accuracy goals” of reasoning — that is, the goal of coming to the most accurate assessment and justified conclusion possible. The accuracy goal comes into play when the decision maker feels accountable to others and leads decision makers to search more extensively for better reasoning strategies and to invest more in the judgment-making process. Such efforts have been shown generally to improve judgments — meaning, in part, that the judgments are based less on first impressions, stereotypes, and cognitive biases.
A problem with the shared rage that characterizes this country right now is that any concern about accountability seems to counsel in favor of swift and dramatic retaliation. Emotion and accountability thus point in the same direction, reducing the need for accuracy. In addition, the felt urge for punishing the wrongdoers, like the desire to squash a stinging bee, is immediate — greatly curtailing the ability to engage in deliberations. Evidence reveals that such urgency, and the pressure it creates for coming to a decision (“the closure goal”), tends to stifle our thinking processes as soon as we arrive at the first seemingly palatable conclusion.
What is more, all of these factors lead us to rely more heavily on false stereotypes and exhibit more clearly a general phenomenon that social psychologists refer to as “actor-observer bias.” Broadly speaking, when something bad happens to a person, that person is more likely to attribute it to external or situational factors than she otherwise would. And when something good happens to the person, she is more likely than otherwise to assume that internal or dispositional factors were the cause. In a sense, people tend to take credit for the good, but not for the bad, in their own lives. That tendency is reversed when we consider the good and bad experienced or exhibited by other people. We tend to attribute bad things that happen to them to dispositional factors and the good things to situational factors. This bias is, in part, simply a function of motivated reasoning — a desire to see ourselves in a relatively good light. But the bias is also the consequence of the fact that it typically takes a significant cognitive investment to try to understand the situational factors that might partially account for another person’s behavior. When a person’s scarce cognitive resources are already heavily taxed, as they tend to be when the person is feeling outrage, the chances of understanding the underlying causes — and, hence, of generating effective solutions — are greatly diminished.
Those are a few of the many reasons to be worried about how we are, as individuals, “thinking.”
* * *
The second part of this essay will look at the way group thinking can enhance biases and contribute further to bellicosity.