From the APS Observer (excerpts from a terrific primer on “The Mechanics of Choice”):
* * *
Because predicting social behavior depends in large part on how people make decisions about resources and wealth, the science of decision making was historically the province of economists. And the basic assumption of economists was always that, when it comes to money, people are essentially rational. It was largely inconceivable that people would make decisions that go against their own interests. Although successive refinements of expected-utility theory made room for individual differences in how probabilities were estimated, the seemingly irrational economic behavior of groups and individuals could always be forced to fit some rigid, rational calculation.

The problem is — and everything from fluctuations in the stock market to decisions between saving for retirement or purchasing a lottery ticket or a shirt on the sale rack shows it — that people just aren’t rational. They systematically make choices that go against what an economist would predict or advocate.

Enter a pair of psychological scientists — Daniel Kahneman (currently a professor emeritus at Princeton) and Amos Tversky — who in the 1970s turned the economists’ rational theories on their heads. Kahneman and Tversky’s research on heuristics and biases, and their Nobel Prize–winning contribution, prospect theory, poured real, irrational, only-human behavior into the calculations, enabling much more powerful predictions of how individuals actually choose between risky options.
* * *
Univ. of Toronto psychologist Keith E. Stanovich and James Madison Univ. psychologist Richard F. West refer to these experiential and analytical modes as “System 1” and “System 2,” respectively. Both systems may be involved in making any particular choice — the second system may monitor the quality of the snap System-1 judgment and adjust the decision accordingly.7 But System 1 will win out when the decider is under time pressure or when his or her System-2 processes are already taxed.
This is not to entirely disparage System-1 thinking, however. Rules of thumb are handy, after all, and for experts in high-stakes domains, it may be the quicker form of risk processing that leads to better real-world choices. In a study by Cornell University psychologist Valerie Reyna and Mayo Clinic physician Farrell J. Lloyd, expert cardiologists took less relevant information into account than younger doctors and medical students did when deciding whether to admit patients with chest pain to the hospital. Experts also tended to process that information in an all-or-none fashion (a patient was either at risk of a heart attack or not) rather than expending time and effort dealing with shades of gray. In other words, the more expertise a doctor has, the more his or her intuitive sense of the gist of a situation serves as a guide.8
In Reyna’s variant of the dual-system account, fuzzy-trace theory, the quick-decision system focuses on the gist or overall meaning of a problem instead of rationally deliberating on facts and odds of alternative outcomes.9 Because it relies on the late-developing ventromedial and dorsolateral parts of the frontal lobe, this intuitive (but informed) system is the more mature of the two systems used to make decisions involving risks.
A 2004 study by Vassar biopsychologist Abigail A. Baird and Univ. of Waterloo cognitive psychologist Jonathan A. Fugelsang showed that this gist-based system matures later than do other systems. People of different ages were asked to respond quickly to easy, risk-related questions such as “Is it a good idea to set your hair on fire?”, “Is it a good idea to drink Drano?”, and “Is it a good idea to swim with sharks?” They found that young people took about a sixth of a second longer than adults to arrive at the obvious answers (it’s “no” in all three cases, in case you were having trouble deciding).10 The fact that our gist-processing centers don’t fully mature until the 20s in most people may help explain the poor, risky choices younger, less experienced decision makers commonly make.
Adolescents decide to drive fast, have unprotected sex, use drugs, drink, or smoke not simply on impulse but also because their young brains get bogged down in calculating odds. Youth are bombarded by warning statistics intended to set them straight, yet risks of undesirable outcomes from risky activities remain objectively small — smaller than teens may have initially estimated, even — and this may actually encourage young people to take those risks rather than avoid them. Adults, in contrast, make their choices more like expert doctors: going with their guts and making an immediate black/white judgment. They just say no to risky activities because, however objectively unlikely the risks are, there’s too much at stake to warrant even considering them.11
Making Better Choices
The gist of the matter is, though, that none of us, no matter how grown up our frontal lobes, make optimal decisions; if we did, the world would be a better place. So the future of decision science is to take what we’ve learned about heuristics, biases, and System-1 versus System-2 thinking and apply it to the problem of actually improving people’s real-world choices.
One obvious approach is to get people to increase their use of System 2 to temper their emotional, snap judgments. Giving people more time to make decisions and reducing taxing demands on deliberative processing are obvious ways of bringing System 2 more into the act. Katherine L. Milkman (U. Penn.), Dolly Chugh (NYU), and Max H. Bazerman (Harvard) identify several other ways of facilitating System-2 thinking.12 One example is encouraging decision makers to replace their intuitions with formal analysis — taking into account data on all known variables, providing weights to variables, and quantifying the different choices. This method has been shown to significantly improve decisions in contexts like school admissions and hiring.
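The formal-analysis method described above — identifying the relevant variables, weighting them, and quantifying each choice — amounts to a simple weighted scoring model. A minimal sketch in Python, using invented variables, weights, and candidates purely for illustration (none are drawn from the studies cited):

```python
# Hypothetical weighted scoring model for a hiring decision.
# Each candidate is rated 0-10 on the variables the decision maker
# deems relevant; weights encode each variable's relative importance
# and sum to 1. All names and numbers here are made up.

weights = {"experience": 0.4, "test_score": 0.4, "interview": 0.2}

candidates = {
    "A": {"experience": 8, "test_score": 6, "interview": 9},
    "B": {"experience": 6, "test_score": 9, "interview": 6},
}

def score(ratings, weights):
    """Weighted sum of ratings: a formal stand-in for a gut-feel ranking."""
    return sum(weights[var] * ratings[var] for var in weights)

scores = {name: score(ratings, weights) for name, ratings in candidates.items()}
best = max(scores, key=scores.get)
print(scores, best)  # A scores 7.4, B scores 7.2, so A is chosen
```

The point of such a model is not the arithmetic but the discipline: forcing every variable and its weight into the open keeps a single vivid impression (say, a charming interview) from silently dominating the judgment.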
Having decision makers take an outsider’s perspective on a decision can reduce overconfidence in their knowledge, in their odds of success, and in their time to complete tasks. Encouraging decision makers to consider the opposite of their preferred choice can reduce judgment errors and biases, as can training them in statistical reasoning. Considering multiple options simultaneously rather than separately can optimize outcomes and increase an individual’s willpower in carrying out a choice. Analogical reasoning can reduce System-1 errors by highlighting how a particular task shares underlying principles with another unrelated one, thereby helping people to see past distracting surface details to more fully understand a problem. And decision making by committee rather than individually can improve decisions in group contexts, as can making individuals more accountable for their decisions.13
In some domains, however, a better approach may be to work with, rather than against, our tendency to make decisions based on visceral reactions. In the health arena, this may involve appealing to people’s gist-based thinking. Doctors and the media bombard health consumers with numerical facts and data, yet according to Reyna, patients — like teenagers — tend initially to overestimate their risks; when they learn their risk for a particular disease is actually objectively lower than they thought, they become more complacent — for instance by forgoing screening. Instead, communicating the gist, “You’re at (some) risk, you should get screened because it detects disease early” may be a more powerful motivator to make the right decision than the raw numbers. And when statistics are presented, doing so in easy-to-grasp graphic formats rather than numerically can help patients (as well as physicians, who can be as statistically challenged as most laypeople) extract their own gists from the facts.14
Complacency is a problem when decisions involve issues that feel more remote from our daily lives — problems like global warming. The biggest obstacle to changing people’s individual behavior and collectively changing environmental policy, according to Columbia University decision scientist Elke Weber, is that people just aren’t scared of climate change. Being bombarded by facts and data about perils to come is not the same as having it affect us directly and immediately; in the absence of direct personal experience, our visceral decision system does not kick in to spur us to make better environmental choices such as buying more fuel-efficient vehicles.15
How should scientists and policymakers make climate change more immediate to people? Partly, it involves shifting from facts and data to experiential button-pressing. Powerful images of global warming and its effects can help. Unfortunately, according to research conducted by Yale environmental scientist Anthony A. Leiserowitz, the dominant images of global warming in Americans’ current consciousness are of melting ice and effects on nonhuman nature, not consequences that hit closer to home; as a result, people still think of global warming as only a moderate concern.16
Reframing options in terms that connect tangibly with people’s more immediate priorities, such as the social rules and norms they want to follow, is a way to encourage environmentally sound choices even in the absence of fear.17 For example, a study by Noah J. Goldstein (Univ. of Chicago), Robert B. Cialdini (Arizona State), and Vladas Griskevicius (Univ. of Minnesota) compared the effectiveness of different types of messages in getting hotel guests to reuse their towels rather than send them to the laundry. Messages framed in terms of social norms — “the majority of guests in this room reuse their towels” — were more effective than messages simply emphasizing the environmental benefits of reuse.18
Yet another approach to getting us to make the most beneficial decisions is to appeal to our natural laziness. If there is a default option, most people will accept it because it is easiest to do so — and because they may assume that the default is the best. University of Chicago economist Richard H. Thaler suggests using policy changes to shift default choices in areas like retirement planning. Because claiming at the earliest opportunity is framed as the norm, most people begin claiming their Social Security benefits as soon as they are eligible, in their early to mid 60s — a symbolic retirement age but not the age at which most people these days actually retire. Moving the “normal” retirement age up to 70 — a higher anchor — would encourage people to let their money grow longer untouched.19
Making Decisions About the Environment
APS Fellow Elke Weber recently had the opportunity to discuss her research with others who share her concern about climate change, including scientists, activists, and the Dalai Lama. Weber . . . shared her research on why people fail to act on environmental problems. According to her, both cognitive and emotional barriers prevent us from acting on environmental problems. Cognitively, for example, a person’s attention is naturally focused on the present, an orientation that once served immediate survival in dangerous surroundings. This present-focused attitude can discourage someone from taking action on long-term challenges such as climate change. Similarly, emotions such as fear can motivate people to act, but fear is more effective for responding to immediate threats. In spite of these challenges, Weber said that there are ways to encourage people to change their behavior. Because people often fail to act when they feel powerless, it’s important to share good as well as bad environmental news and to set measurable goals for the public to pursue. Also, said Weber, simply portraying reduced consumption as a gain rather than a loss in pleasure could inspire people to act.
References and Further Reading:
- 7. Stanovich, K.E., & West, R.F. (2000). Individual differences in reasoning: Implications for the rationality debate. Behavioral & Brain Sciences, 23, 645–665.
- 8. Reyna, V.F., & Lloyd, F.J. (2006). Physician decision making and cardiac risk: Effects of knowledge, risk perception, risk tolerance, and fuzzy processing. Journal of Experimental Psychology: Applied, 12, 179–195.
- 9. Reyna, V.F. (2004). How people make decisions that involve risk: A dual-processes approach. Current Directions in Psychological Science, 13, 60–66.
- 10. Baird, A.A., & Fugelsang, J.A. (2004). The emergence of consequential thought: Evidence from neuroscience. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 359, 1797–1804.
- 11. Reyna, V.F., & Farley, F. (2006). Risk and rationality in adolescent decision making. Psychological Science in the Public Interest, 7, 1–44.
- 12. Milkman, K.L., Chugh, D., & Bazerman, M.H. (2009). How can decision making be improved? Perspectives on Psychological Science, 4, 379–383.
- 13. Ibid.
- 14. See Wargo, E. (2007). More than just the facts: Helping patients make informed choices. Cornell University Department of Human Development: Outreach & Extension. Downloaded from http://www.human.cornell.edu/hd/outreach-extension/loader.cfm?csModule=security/getfile&PageID=43508
- 15. Weber, E.U. (2006). Experience-based and description-based perceptions of long-term risk: Why global warming does not scare us (yet). Climatic Change, 77, 103–120.
- 16. Leiserowitz, A. (2006). Climate change risk perception and policy preferences: The role of affect, imagery, and values. Climatic Change, 77, 45–72.
- 17. Weber, E.U. (2010). What shapes perceptions of climate change? Wiley Interdisciplinary Reviews: Climate Change, 1,
- 18. Goldstein, N.J., Cialdini, R.B., & Griskevicius, V. (2008). A room with a viewpoint: Using social norms to motivate environmental conservation in hotels. Journal of Consumer Research, 35. Downloaded from http://www.csom.umn.edu/assets/118359.pdf
- 19. Thaler, R.H. (2011, July 16). Getting the Most Out of Social Security. The New York Times. Downloaded from
Related Situationist posts:
- The Situation of “Opting Out”
- Dan Kahneman on Fast and Slow Thinking
- Dan Kahneman on the Situation of Experience and Memory
- Dan Kahneman on the Situation of Well-Being
- Dan Kahneman on the Situation of Intuition
- Dan Kahneman’s Situation
- The Situation of Financial Risk-Taking
- Some (Interior) Situational Sources of War – Part I
- Some (Interior) Situational Sources of War – Part II
- Nicole Stephens on “Choice, Social Class, and Agency”
- Sheena Iyengar on the Art of Choosing
- Just Choose It!
- Sheena Iyengar on the Situation of Choice
- The Blame Frame – Abstract
- Sheena Iyengar’s Situation and the Situation of Choosing
- Sheena Iyengar on ‘The Multiple Choice Problem’
- ‘Situation’ Trumps ‘Disposition’ – Part II
- The Cause of Rioting? That’s Easy: Rioters!
You can review hundreds of Situationist posts related to the topic of “choice myth” here.