I’m Objective, You’re Biased
Posted by Emily Pronin on March 6, 2008
[This post was first published in October. It is being re-published this week because of its relevance to this Saturday’s conference at Harvard Law School, hosted by the Project on Law & Mind Sciences (for details, go to the conference webpage here).]
Take a look around and you’ll witness bias virtually everywhere you turn. Even professionals and those charged with making objective decisions are surprisingly biased. Take, for example, the judgment-distorting influence of self-interest: Recent news reports have described the biasing effect on doctors of receiving gifts from pharmaceutical representatives, on “independent” financial auditors of taking paychecks from the companies they audit, on “representative” politicians of receiving support from their most “generous” constituents, and on research scientists of accepting funding from sponsors with an agenda. Depending on your perspective, one part or another of the media is obviously biased. In the sports world, referees and umpires often seem less than fully neutral. And on the bench, even robed judges sometimes appear to have one finger on the scale.
At the same time that the world around us seems overflowing with biased actors, we are often surprised, even offended, to have to defend our own judgments and decisions against the charge of bias – especially when we are certain that we have acted with the utmost integrity.
To each of us, it seems as clear that others are biased as it is plain that we are not.
My colleagues Daniel Lin, Lee Ross, and I referred to that asymmetry in perceptions of bias, whereby people often readily detect the presence of bias in others while at the same time denying it in themselves, as a “bias blind spot”.
In a recent series of experiments, graduate student Matthew Kugler and I tracked down a source of this phenomenon. It lies in the unconscious nature of bias, coupled with people’s unwarranted faith in the value of their conscious introspections (that is, their thoughts, motives, intentions, etc.), which people consult to assess their own bias.
Because bias tends to occur non-consciously, searching for it in one’s explicit thoughts is a little like looking for one’s car in the refrigerator. In assessing other people’s bias, however, we tend to look at their behavior. Although people’s actions are certainly an imperfect indicator of non-conscious bias, it’s about as close as one is generally going to get to peering into the garage.
Consider the following example. One of your colleagues just hired his old college buddy for the new associate position. In judging whether his decision was biased, your colleague might recall his conscious efforts to be objective in reviewing the pool of applicants, or he might simply recall not feeling any signs of bias clouding his assessments. From his perspective, he hired someone he liked and felt was best qualified for the position. While his introspections thus leave him confident that his selection was objectively made, his behavior would likely leave you confident that it wasn’t.
One reason for those contrasting conclusions is that people have access to their own thoughts, feelings, and intentions, but are in the dark when it comes to those of others. Our own thoughts come to us as a bright beacon while those of others are a black box. A more subtle distinction, though, motivated our most recent research. We wondered if people’s focus on others’ actions, rather than their inner thoughts and intentions, reflected more than just a lack of introspective illumination. More specifically, we wanted to examine if it also reflected an asymmetry in people’s beliefs about the relative credibility of their own thoughts and intentions versus those of others.
In terms of the college-buddy example, we predicted that knowing your colleague’s intention to be fair and his faith in the talents of an old pal (and knowing his absence of any willful bias) would not radically change your (or other onlookers’) perceptions of his bias. And that is what we found: people’s lack of reliance on others’ introspections is due in part to a diminished valuation of those introspections.
In one experiment, we had subjects take a purported social intelligence test, gave them negative feedback about their score, and then asked them to evaluate the quality of the test after first thinking aloud about it. Later, when those subjects were asked whether they had been biased in their evaluation of the test, they ignored their evaluation of the test and instead focused on their thoughts and motives. Those thoughts and motives yielded no signs of bias, and therefore the subjects assumed that their evaluations were objective. A separate group of subjects in that experiment did not take the test but instead observed a peer take it and evaluate its validity. Those observer subjects tended to take a different approach to assessing the presence of bias. Although they heard the test-takers’ thoughts, they ignored them. Instead, they looked to the test-takers’ behavior and, in particular, to whether the test-takers claimed that the test was invalid right after performing poorly on it. Thus, they attended to a peer’s actions for assessing that peer’s bias, while the subjects themselves relied on internal information to assess their own bias.
In another experiment, we asked participants to report their susceptibility to various classic biases that distort human judgment. These included the self-serving bias (taking too much credit for successes and too little responsibility for failures), the fundamental attribution error (mistakenly viewing people’s outcomes as a function of their personality rather than their circumstances), and the halo effect (viewing people as positive on numerous dimensions when they are known to be positive on one). Those subjects showed the usual tendency to view themselves as less biased than their peers. When asked how they made their judgment, they reported considering their own intentions but their peers’ behavior. In a follow-up experiment, we asked them how valuable introspective information versus behavioral information would be in making judgments about bias. They reported that introspective information would be valuable in assessing their own bias, but that it would not be as valuable for other people to use in assessing theirs; rather, they suggested, others should look to behavior.
Inspired by those findings, we conducted another experiment to test a possible strategy for reducing people’s bias blind spot. That strategy involved teaching people that their judgments can be affected by processes (such as biases) that operate unconsciously. Such education, we reasoned, could help people to recognize their susceptibility to bias by preventing them from relying on introspective evidence of it. In our experiment, some subjects read a fabricated article informing them about the role of nonconscious processes in judgment, and about people’s lack of awareness when they are influenced by unconscious processes. Other subjects were in a control condition and did not read that article. Both groups read a filler article masking our true interests. Then, in an ostensibly separate experiment, participants were asked to indicate their personal susceptibility relative to their student peers to a variety of different judgmental biases. The result was that participants who had been educated about nonconscious processes (and the perils of relying on introspection) saw themselves as no more objective than their peers, unlike those in the control condition. The two conditions differed significantly from each other, indicating that the intervention reduced the bias blind spot.
People’s willingness to recognize their own biases is, of course, an important first step in prompting them to correct for and overcome those influences. Once people are able to recognize that they can be biased without knowing it, perhaps they can stop relying on their good intentions and introspectively clean consciences as evidence of their freedom from biases whose consequences range from corrupt, to discriminatory, to unfairly conflictual behavior. From that more humble starting point, they may be more open to engaging in efforts to rid themselves of their own biases and to understanding how others can be biased without knowing it. Such efforts are not just scientifically sensible; they are socially wise.
* * *
For a sampling of scholarship related to topics discussed in this post, see the following:
- Dana, J., & Loewenstein, G. (2003). A social science perspective on gifts to physicians from industry. Journal of the American Medical Association, 290, 252-255.
- Moore, D. A., Tetlock, P. E., Tanlu, L., & Bazerman, M. H. (2006). Conflicts of interest and the case of auditor independence. Academy of Management Review, 31, 10-29.
- Pronin, E. (2007). Perception and misperception of bias in human judgment. Trends in Cognitive Sciences, 11, 37-43.
- Uhlmann, E. L., & Cohen, G. L. (2005). Constructed criteria: Redefining merit to justify discrimination. Psychological Science, 16, 474-480.
For some previous Situationist posts on related topics, see “Mistakes Were Made (but not by me),” “Lima Beans–Yuch! (Why Wanting Not To Be Prejudiced May Not Be Enough),” “Self-Serving Biases,” “Captured Science,” “Unlevel Playing Fields: From Baseball Diamonds to Emergency Rooms,” “The Situation of Judging,” “Industry-Funded Research – Part I,” and “Industry-Funded Research – Part II.”