The Situationist

Mistakes Were Made (but not by me)

Posted by The Situationist Staff on October 19, 2007

In a previous post we recommended, and excerpted the introduction of, an excellent new book, “Mistakes Were Made (but not by me),” written by social psychologists Carol Tavris and Elliot Aronson. In this post, we provide an overview of the entire book.

Mistakes Were Made is about the human propensity for “tenacious[ly] clinging to a discredited belief.” The authors explore a number of arenas in which we tend to make and stick with our mistakes, including politics, psychology, criminal law, and personal relationships. The authors use the image of the “pyramid of choice” to describe the process by which we fall into the trap of self-justification: we start at the top of the pyramid, with many choices and a view of all of them; as we slide down (after having made our mistake or initially accepted a mistaken belief) we lose that bird’s-eye view, and the other sides of the pyramid become obscured. In the end we’re left facing just our mistake and, without perspective, will defend our mistake with certainty.

The book starts with a litany of brief descriptions of self-justifications made by well-known politicians, from the current president to Henry Kissinger. In describing dissonance theory and cognitive dissonance in more detail, the book describes an experiment in which students were invited to join a discussion group. They had to be interviewed before attending; some were subjected to a difficult, embarrassing interview and others to an easy one. In the end the discussion groups were all equally boring, but the students who had undergone the difficult interview were more likely to rate the group positively. The authors observe that these and other experiments show that if someone goes through a difficult experience in order to attain a goal, the goal becomes more attractive.

To avoid cognitive dissonance, people perform mental gymnastics to maintain or strengthen their beliefs—unconsciously employing “confirmation bias” in order to dismiss or criticize any disconfirming evidence. This may be, in part, neurological: one experiment found that the “reasoning areas of the brain” essentially shut down when subjects were given dissonant political information. The need to preserve our self-concept is powerful, and dissonance is filtered through our beliefs about who we are.

Chapter two opens with a discussion of “naïve realism,” particularly the work of famed social psychologist Lee Ross. The authors review his work on the Palestinian/Israeli conflict and also look at Geoffrey Cohen’s experiment on our own Democrat/Republican blind spots (where policies written by party A but labeled as coming from party B will still appeal to members of party B).

The authors argue that dissonance theory tells us that we are conditioned to justify our mistakes one small step at a time. The book looks at a few examples: Tom DeLay accepting a trip to St. Andrews, scientific blind spots in industry-funded research, and Big Pharma’s influence on doctors. These “ego-preserving blind spots” divide the world into “us” vs. “them” while allowing people to believe that they aren’t prejudiced or biased. Our attachment to groups (“us”) is crucial to our identity, but if we feel threatened our blind spots are activated (“they” are not smart or reasonable).

Chapter three focuses on the “dissonance-reducing distortions of memory.” Memories “spin the stories of our lives,” and we tend to distort memories in “a self-enhancing direction in all sorts of ways.” One example that struck me as being somewhat related to situationism (in spirit, even if it doesn’t quite fit any particular arguments) was the story of a man who claimed to have been abducted by aliens. On reflection, he realized that his memory of abduction was actually a result of sleep deprivation and physical exhaustion after a very long bike ride. His immediate reaction, though, had been to overlook the situation and prefer an unreasonable explanation for his experiences.

Chapter four deals with the phenomenon of recovered-memory therapy in the 1980s and 90s and the rash of false accusations of sexual abuse that resulted. The examples focus on adults and children who, through psychotherapy and memory recovery, came to believe that they had been abused by teachers or relatives. In the end, many of these memories proved to be mistaken. Most of the therapists practicing this recovered-memory therapy were disconnected from the world of psychological research. They had insisted that their clients had repressed memories of abuse, even when the clients initially denied any possibility of abuse. After families had been torn apart and therapists had testified in court to help convict accused sex offenders, scientific evidence began to show that horrible memories are typically not repressed and that it can be particularly difficult to tell whether or not a child has been abused. The psychotherapists and psychiatrists who had practiced this therapy, though, were resolute, and their backlash against the scientific evidence now seems ridiculous. Those therapists who refused to admit any mistakes even went so far as to blame their clients. In the face of malpractice suits, one therapist even called for “an open season on academicians and researchers,” the source of disconfirming evidence. Finally, in an edition of a book that helped make recovered-memory therapy popular, the writers dismissed the scientific evidence by claiming that it is “part of a backlash against child victims and incest survivors”—it’s them, not us.

Chapter five deals with mistakes made in the criminal law. This chapter discusses “external incentives” for denying mistakes (belief in the system, not wanting to be “soft on crime”) versus the internal ones that have been discussed elsewhere (“I’m a good, competent person”). The authors explore cases in which DNA evidence has later exonerated a convicted person, and the reactions of those who worked for the original conviction. They also discuss interrogation techniques and how these techniques can create a “closed loop” of reasoning in the interrogator. Here the authors also take an interesting detour to explore how a suspect might eventually confess to a crime she didn’t commit because the interrogator creates dissonance by lying or making statements that conflict with what the suspect knows. Finally, the authors argue that the law and police training have failed to incorporate new research on cognition, perception, and memory.

In Chapter six the authors move to personal relationships, and marriage in particular. Here they note that a partner will often fail to recognize the role of situation when praising or criticizing the other, even though recognizing situation can be key to a successful relationship.

Chapter seven continues to explore personal relationships but also broadens the scope a bit. The authors argue that self-justification is more of a factor in cases where blameworthiness isn’t clear, and here situation is also more likely to be a key factor (using the Terri Schiavo case as an example). They also describe an interesting experiment in which subjects were asked to tell both a “victim story” and a “perpetrator story.” The experiment showed that self-justification turned more on situation (the role of victim or perpetrator) than on personality. The authors also discuss Abu Ghraib in terms of how we reduce dissonance about torture by denying that we do it and by justifying our reasons for it. They show how both individuals making decisions and an entire nation can come to accept a policy of torture. Once someone accepts torture in the context of a “ticking time bomb,” it takes only a few more steps to accept what happened at Abu Ghraib.

In the end, the authors claim that we humans will do almost anything to reduce dissonance, including hurting others and supporting torture. They call for more transparency in organizations and institutions to reduce “blind spots” and they claim that individually we can fight the temptation to self-justify.
