The Situationist

I’m Objective, You’re Biased

Posted by Emily Pronin on October 24, 2007

Image by Marc Scheff - http://sketchbook.dangermarc.com/?page_id=359

Take a look around and you’ll witness bias virtually everywhere you turn. Even professionals and those charged with making objective decisions are surprisingly biased. Take, for example, the judgment-distorting influence of self-interest: Recent news reports have described the biasing effect on doctors of receiving gifts from pharmaceutical representatives, on “independent” financial auditors of taking paychecks from the companies they audit, on “representative” politicians of receiving support from their most “generous” constituents, and on research scientists of accepting funding from sponsors with an agenda. Depending on your perspective, one part or another of the media is obviously biased. In the sports world, referees and umpires often seem less than fully neutral. And on the bench, even robed judges sometimes appear to have one finger on the scale.

[Image: media bias]

At the same time that the world around us seems overflowing with biased actors, we are often surprised, even offended, to have to defend our own judgments and decisions against the charge of bias – especially when we are certain that we have acted with the utmost integrity.

To each of us, it seems as clear that others are biased as it is plain that we are not.

My colleagues Daniel Lin, Lee Ross, and I referred to that asymmetry in perceptions of bias, whereby people often readily detect the presence of bias in others while at the same time denying it in themselves, as a “bias blind spot”.

In a recent series of experiments, graduate student Matthew Kugler and I tracked down a source of this phenomenon. It lies in the unconscious nature of bias, coupled with people’s unwarranted faith in the value of their conscious introspections (that is, their thoughts, motives, intentions, etc.), which they consult to assess their own bias.

Because bias tends to occur non-consciously, searching for it in one’s explicit thoughts is a little like looking for one’s car in the refrigerator. In assessing other people’s bias, however, we tend to look at their behavior. Although people’s actions are certainly an imperfect indicator of non-conscious bias, it’s about as close as one is generally going to get to peering into the garage.

Consider the following example. One of your colleagues just hired his old college buddy for the new associate position. In judging whether his decision was biased, your colleague might recall his conscious efforts to be objective in reviewing the pool of applicants, or he might simply recall not feeling any signs of bias clouding his assessments. From his perspective, he hired someone he liked and felt was best qualified for the position. While his introspections thus leave him confident that his selection was objectively made, his behavior would likely leave you confident that it wasn’t.

One reason for those contrasting conclusions is that people have access to their own thoughts, feelings, and intentions, but are in the dark when it comes to those of others. Our own thoughts come to us as a bright beacon while those of others are a black box. A more subtle distinction, though, motivated our most recent research. We wondered whether people’s focus on others’ actions, rather than on their inner thoughts and intentions, reflected more than just a lack of introspective illumination. More specifically, we wanted to examine whether it also reflected an asymmetry in people’s beliefs about the relative credibility of their own thoughts and intentions versus those of others.

To return to the college-buddy example, we predicted that knowing your colleague’s intention to be fair and his faith in the talents of an old pal (and knowing that he harbored no willful bias) would not radically change your (or other onlookers’) perceptions of his bias. And that is what we found: people’s lack of reliance on others’ introspections is due in part to a diminished valuation of those introspections.

[Image: scales of perception]

In one experiment, we had subjects take a purported social intelligence test, gave them negative feedback about their score, and then asked them to evaluate the quality of the test after first thinking aloud about it. Later, when those subjects were asked whether they had been biased in their evaluation of the test, they ignored that evaluation and instead focused on their thoughts and motives. Those thoughts and motives yielded no signs of bias, and therefore the subjects assumed that their evaluations were objective. A separate group of subjects in that experiment did not take the test but instead observed a peer take it and evaluate its validity. Those observer subjects tended to take a different approach to assessing the presence of bias. Although they heard the test-takers’ thoughts, they ignored them. Instead, they looked to the test-takers’ behavior and, in particular, to whether the test-takers claimed that the test was invalid right after performing poorly on it. Thus, observers attended to a peer’s actions when assessing that peer’s bias, while the test-takers themselves relied on internal information to assess their own.

In another experiment, we asked participants to report their susceptibility to various classic biases that distort human judgment. These included the self-serving bias (taking too much credit for successes and too little blame for failures), the fundamental attribution error (mistakenly viewing people’s outcomes as a function of their personality rather than their circumstances), and the halo effect (viewing people as positive on numerous dimensions when they are known to be positive on one). Those subjects showed the usual tendency to view themselves as less biased than their peers. When asked how they made their judgment, they reported considering their own intentions, but their peers’ behavior. In a follow-up experiment, we asked them how valuable they felt introspective information versus behavioral information would be in making judgments about bias. They reported that introspective information would be valuable in assessing their own bias, but less valuable for other people assessing theirs; others, they suggested, should look to behavior instead.

Inspired by those findings, we conducted another experiment to test a possible strategy for reducing people’s bias blind spot. That strategy involved teaching people that their judgments can be affected by processes (such as biases) that operate unconsciously. Such education, we reasoned, could help people to recognize their susceptibility to bias by preventing them from relying on introspective evidence of it. In our experiment, some subjects read a fabricated article informing them about the role of nonconscious processes in judgment, and about people’s lack of awareness when they are influenced by unconscious processes. Other subjects were in a control condition and did not read that article. Both groups read a filler article masking our true interests. Then, in an ostensibly separate experiment, participants were asked to indicate their personal susceptibility relative to their student peers to a variety of different judgmental biases. The result was that participants who had been educated about nonconscious processes (and the perils of relying on introspection) saw themselves as no more objective than their peers, unlike those in the control condition. The two conditions differed significantly from each other, indicating that the intervention reduced the bias blind spot.

People’s willingness to recognize their own biases is, of course, an important first step in prompting them to correct for and overcome those influences. Once people recognize that they can be biased without knowing it, perhaps they can stop relying on their good intentions and introspectively clean consciences as evidence of their freedom from biases, whether those biases produce corrupt, discriminatory, or unfairly conflictual behavior. From that more humble starting point, they may be more open to engaging in efforts to rid themselves of their own biases and to understanding how others can be biased without knowing it. Such efforts are not just scientifically sensible; they are socially wise.

* * *

For some previous Situationist posts on related topics, see “Mistakes Were Made (but not by me),” “Lima Beans–Yuch! (Why Wanting Not To Be Prejudiced May Not Be Enough),” “Self-Serving Biases,” “Captured Science,” “Unlevel Playing Fields: From Baseball Diamonds to Emergency Rooms,” “The Situation of Judging,” “Industry-Funded Research – Part I,” and “Industry-Funded Research – Part II.”

6 Responses to “I’m Objective, You’re Biased”

  1. […] Situationist (one of my new favorite blogs) posted about our ability to judge our own bias relative to others’. Because bias tends to occur non-consciously, searching for it in one’s explicit thoughts is a […]

  2. Electric Angel said

    One of my new favourite blogs too. This is a very interesting study and, I have to admit, it makes sense. I now realise that I’m not as objective or unbiased as I’d first like to think, and I have clearly been influenced toward a (somewhat) biased decision in the past through non-conscious processes.

  3. […] I’m Objective, You’re Biased […]

  4. […] look at Bias and how it affects people… I’m Objective, You’re Biased « The Situationist […]

  5. […] The Situationist: “I’m Objective, You’re Biased“, which looks at “bias blind spots”–the extent to which many of us readily […]

  6. […] I’m Objective, You’re Biased (by Emily Pronin) […]