Some (Interior) Situational Sources of War – Part I
Posted by The Situationist Staff on June 1, 2007
In previous posts, Situationist contributors have examined some of the more common situational sources and mechanisms of inter-group conflict (see, for example, “Looking for the Evil Actor,” “The Situation of a Volunteer Army,” “March Madness,” and “Hoyas, Hos, & Gangstas”). In this series of posts, which is built around several particularly accessible articles, we will continue that theme with more specific focus on what social psychology and related fields might teach us about why nations or groups go to war.
We start the series with portions of an article in the January/February 2007 issue of Foreign Policy, co-authored by Nobel Laureate Daniel Kahneman and Jonathan Renshon, titled “Why Hawks Win.” Kahneman and Renshon describe how cognitive biases generally favor hawkish behavior in situations of international tension.
* * *
National leaders get all sorts of advice in times of tension and conflict. But often the competing counsel can be broken down into two basic categories. On one side are the hawks: They tend to favor coercive action, are more willing to use military force, and are more likely to doubt the value of offering concessions. When they look at adversaries overseas, they often see unremittingly hostile regimes that understand only the language of force. On the other side are the doves, skeptical about the usefulness of force and more inclined to contemplate political solutions. Where hawks see little in their adversaries but hostility, doves often point to subtle openings for dialogue.
As the hawks and doves thrust and parry, one hopes that the decision makers will hear their arguments on the merits and weigh them judiciously before choosing a course of action. Don’t count on it. Modern psychology suggests that policymakers come to the debate predisposed to believe their hawkish advisors more than the doves. There are numerous reasons for the burden of persuasion that doves carry, and some of them have nothing to do with politics or strategy. In fact, a bias in favor of hawkish beliefs and preferences is built into the fabric of the human mind.
Social and cognitive psychologists have identified a number of predictable errors (psychologists call them biases) in the ways that humans judge situations and evaluate risks. Biases have been documented both in the laboratory and in the real world, mostly in situations that have no connection to international politics. . . .
In fact, when we constructed a list of the biases uncovered in 40 years of psychological research, we were startled by what we found: All the biases in our list favor hawks. These psychological impulses—only a few of which we discuss here—incline national leaders to exaggerate the evil intentions of adversaries, to misjudge how adversaries perceive them, to be overly sanguine when hostilities start, and overly reluctant to make necessary concessions in negotiations. In short, these biases have the effect of making wars more likely to begin and more difficult to end.
. . . . Our conclusion is not that hawkish advisors are necessarily wrong, only that they are likely to be more persuasive than they deserve to be.
Several well-known laboratory demonstrations have examined the way people assess their adversary’s intelligence, willingness to negotiate, and hostility, as well as the way they view their own position. The results are sobering. Even when people are aware of the context and possible constraints on another party’s behavior, they often fail to factor them in when assessing the other side’s motives. Yet people still assume that outside observers grasp the constraints on their own behavior. With armies on high alert, it’s an instinct that leaders can ill afford to indulge.
Imagine, for example, that you have been placed in a room and asked to watch a series of student speeches on the policies of Venezuelan leader Hugo Chávez. You’ve been told in advance that the students were assigned the task of either attacking or supporting Chávez and had no choice in the matter. Now, suppose that you are then asked to assess the political leanings of these students. Shrewd observers, of course, would factor in the context and adjust their assessments accordingly: a student who gave an enthusiastic pro-Chávez speech was merely doing what she was told, not revealing anything about her true attitudes. In fact, though, observers are rarely so shrewd. Many experiments suggest that people would overwhelmingly rate the pro-Chávez speakers as more leftist. Even when alerted to context that should affect their judgment, people tend to ignore it. Instead, they attribute the behavior they see to the person’s nature, character, or persistent motives. This bias is so robust and common that social psychologists have given it a lofty title: They call it the fundamental attribution error.
The effect of this failure in conflict situations can be pernicious. A policymaker or diplomat involved in a tense exchange with a foreign government is likely to observe a great deal of hostile behavior by that country’s representatives. Some of that behavior may indeed be the result of deep hostility. But some of it is simply a response to the current situation as it is perceived by the other side. What is ironic is that individuals who attribute others’ behavior to deep hostility are quite likely to explain away their own behavior as a result of being “pushed into a corner” by an adversary. The tendency of both sides of a dispute to view themselves as reacting to the other’s provocative behavior is a familiar feature of marital quarrels, and it is found as well in international conflicts. During the run-up to World War I, the leaders of every one of the nations that would soon be at war perceived themselves as significantly less hostile than their adversaries.
If people are often poorly equipped to explain the behavior of their adversaries, they are also bad at understanding how they appear to others. This bias can manifest itself at critical stages in international crises, when signals are rarely as clear as diplomats and generals believe them to be. Consider the Korean War, just one example of how misperception and a failure to appreciate an adversary’s assessment of intentions can lead to hawkish outcomes. In October 1950, as coalition forces were moving rapidly up the Korean Peninsula, policymakers in Washington were debating how far to advance and attempting to predict China’s response. U.S. Secretary of State Dean Acheson was convinced that “no possible shred of evidence could have existed in the minds of the Chinese Communists about the non-threatening intentions of the forces of the United Nations.” Because U.S. leaders knew that their intentions toward China were not hostile, they assumed that the Chinese knew this as well. Washington was, therefore, incapable of interpreting the Chinese intervention as a reaction to a threat. Instead, the Americans interpreted the Chinese reaction as an expression of fundamental hostility toward the United States. Some historians now believe that Chinese leaders may in fact have seen advancing Allied forces as a threat to their regime.
* * *
To read the entire article by Kahneman and Renshon, as well as a small debate their article helped to initiate, click here. Part II of this series will look at two other cognitive biases that Kahneman and Renshon argue help explain “why hawks win.”
This entry was posted on June 1, 2007 at 10:07 am and is filed under History, Politics, Public Policy, Social Psychology.