The Situationist

Posts Tagged ‘cognitive dissonance’

The Situation of Poor Choices

Posted by The Situationist Staff on December 5, 2012

Social psychologist and Situationist friend Dave Nussbaum has another outstanding situationist post over on Random Assignments.  Here’s how it starts.

One of the obstacles that keeps the poor from rising out of poverty is the tendency to make costly financial decisions – like buying lottery tickets, taking out high-interest loans (PDF), and failing to enroll in assistance programs – that only make their situation worse. In the past, these poor decisions have been attributed either to low-income individuals’ personalities or to issues in their environment, such as poor education or substandard living conditions. New research published this month in Science by Booth Assistant Professor of Behavioral Science Anuj Shah points to a new answer: living with scarcity changes people’s psychology.

The basic idea is that when resources are scarce – when people are short on time, or money, or food – each decision about how best to use those resources takes on greater urgency than when resources are abundant. This focus can have positive effects in the short term, but it comes at the expense of neglecting other, less urgent demands. For example, when people are under the pressure of urgent expenses like rent and groceries, they may neglect routine maintenance on their car and end up with costly (and avoidable) repairs down the road.

Shah, along with colleagues Sendhil Mullainathan of Harvard and Eldar Shafir of Princeton, published five studies examining the effects of scarcity on decision making in various games in which people were paid according to their performance. In each of the studies some people received ample resources with which to play, while others received very few. Moreover, in some studies the players had the opportunity to borrow additional resources with interest. The researchers then observed how scarcity affected the players’ borrowing behavior, their performance, and the psychological processes at play.

Across the studies, Shah found that for people who had very few resources, the games took on more urgency. They became more focused on the task at hand in order to make the best use of their scarce resources, but this added focus came at a price, including mental fatigue, costly borrowing decisions, and poor overall performance.

For example, in an Angry Birds-type game, in which the object was to knock down as many targets as possible, players who could take only three shots per round spent more time aiming each shot than players who had fifteen shots. This added focus improved performance, but it had downsides. When given the opportunity to “borrow” a shot by giving up two shots in a later round of the game, players who had fewer shots made counterproductive borrowing decisions that hurt their overall performance.
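To make the arithmetic of that borrowing trade-off concrete, here is a minimal sketch in Python. It is not Shah and colleagues’ actual paradigm – the round count, shot budgets, and hit rate below are hypothetical numbers chosen only for illustration – but it shows why a loan at 100% interest (one shot now, two forfeited later) shrinks the total shot budget and costs a “poor” player proportionally more.

```python
# Back-of-the-envelope sketch of the borrowing trade-off described above.
# NOT Shah et al.'s actual paradigm: the rounds, budgets, and hit rate
# are hypothetical, chosen only to illustrate the arithmetic.

def expected_hits(shots_per_round, rounds, hit_rate, loans=0):
    """Expected targets hit over the whole game.

    Each loan grants one extra shot now but forfeits two shots in a
    later round (100% interest), so every loan shrinks the total
    shot budget by one.
    """
    total_shots = shots_per_round * rounds - loans  # +1 now, -2 later
    return total_shots * hit_rate

ROUNDS, HIT_RATE = 10, 0.5  # hypothetical values

for label, budget in [("poor (3 shots/round)", 3), ("rich (15 shots/round)", 15)]:
    baseline = expected_hits(budget, ROUNDS, HIT_RATE)
    borrowed = expected_hits(budget, ROUNDS, HIT_RATE, loans=5)
    print(f"{label}: {baseline:.1f} expected hits without borrowing, "
          f"{borrowed:.1f} with five loans")
```

On these made-up numbers, five loans cost both players 2.5 expected hits, but that loss is a sixth of the poor player’s baseline and only a thirtieth of the rich player’s – one way to see why the same borrowing terms weigh more heavily on those with less.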

Read the rest of Dave’s post, discussing possible implications of the research, here.


Posted in Blogroll, Choice Myth, Distribution, Marketing, Social Psychology

The Situation of Success

Posted by The Situationist Staff on July 20, 2012

Dave Nussbaum has an excellent new post over on Random Assignments.  Here’s how it starts.

I don’t think Michael Lewis was trying to make a political point when he gave the commencement address at Princeton University last month (watch the whole thing here). Lewis, the author of several bestselling books including Moneyball, Liar’s Poker, and The Big Short, knows a thing or two about the interdependence of luck and success, and he was sharing his thoughts on the matter with the about-to-be Princeton graduates. Here’s a taste of what he told them:

Life’s outcomes, while not entirely random, have a huge amount of luck baked into them. Above all, recognize that if you have had success, you have also had luck — and with luck comes obligation. You owe a debt, and not just to your Gods. You owe a debt to the unlucky. I make this point because — along with this speech — it is something that will be easy for you to forget.

He’s right about that last point; it is easy to forget. It’s also convenient, Lewis told Jeffrey Brown in a follow-up interview on PBS’ NewsHour. Most people would acknowledge that both luck and merit are important ingredients of success. It’s just that people often like to feel like they are the authors of their accomplishments and ignore everything and everyone else who played a role. “As they age, and succeed,” Lewis told the graduates, “people feel their success was somehow inevitable.”

Now Lewis isn’t trying to deny Princeton graduates (or anyone else) credit for their success. He just wants them to take a minute to “dwell on just how fortunate they are.” His hope is simply that they have some compassion for people who worked just as hard as they did but were less fortunate. As it turns out, there’s some research that suggests that taking a minute to dwell on your good fortune might have exactly that effect.

Way over on the other side of the country, on the campus of another elite university, Chris Bryan and his colleagues (PDF) asked Stanford University students to take a minute (or ten) to tell the story of how they got into the prestigious college. Not all the students got the same instructions, though. Half of the students were asked to focus on the role that “hard work, self-discipline and wise decisions played in helping you get here.” The other half were told to focus on the role of “chance, opportunity and help from others.” Neither group had any difficulty writing the essay. As Bryan, who will be joining the faculty at UC San Diego this fall, explained to me in an email:

People writing about merit would tell the story most successful people probably tell themselves by default – reminiscing about the long hours they spent studying, the times they made the “tough choice” they knew to be right, or how they skipped nights out with friends to stay home and work on an important paper. In some ways, the most interesting thing was that most people who got the good fortune instructions had no trouble acknowledging the lucky breaks they had gotten. Many said things like “I definitely worked hard to get where I am but I realize how fortunate I was to be born into a family that could afford to give me the support and resources I needed to succeed.”

So it seems that people are capable of seeing the role of luck and merit in contributing to their success. What Lewis might be particularly pleased to see, though, is how dwelling on luck, and the help they’d received from others, changed people’s attitudes. Compared to the students who wrote about their own merit, students who wrote about the role of good fortune in their success were, on average, more strongly in favor of policies like universal healthcare and access to unemployment benefits – policies that presumably help fulfill one’s obligation to the less fortunate. In addition to increasing support for liberal policies, thinking about one’s luck decreased support for conservative policies like building more prisons and instituting a flat tax. As Bryan explained to me, “it’s not that people’s ideology doesn’t matter, it’s just that their views on important issues can move around significantly depending on how they think about their own success. When they’re focused on their own talent and effort, they’re much less willing to contribute to the common good than when they pause to recognize that luck and help from other people played a big part in their ability to succeed.”

Read the rest of the post, which examines the relevance of Lewis’s remarks and Bryan’s research for politics, here.


Posted in Blogroll, Ideology, Politics, Social Psychology

Random Assignments

Posted by The Situationist Staff on May 11, 2012

Social psychologist Dave Nussbaum recently launched his blog, Random Assignments.

The blog already contains several posts worth reading, including a series on the important topic of replication in social science.  The first two parts are “Replicating Dissonance” and “Conceptual Replication.”

Here’s a sample of Nussbaum’s writing:

The 1950s were a bleak time if you were a social psychologist interested in the empirical study of thoughts and feelings and how they affect human behavior. At that time, experimental psychology was dominated by behaviorism, an approach which focused exclusively on observable behavior, exiling ephemeral concepts like beliefs and emotions outside the boundaries of proper science. But things were about to change.

The Theory of Cognitive Dissonance, published by Leon Festinger in 1957, was one of those things. The theory was based on the simple idea that when a person simultaneously holds two conflicting beliefs he will experience a feeling of discomfort – cognitive dissonance – and that he will be motivated to end that discomfort by reducing the conflict between the beliefs, often by changing one of them.

Today, the term cognitive dissonance has entered our vernacular and the idea that we change or discard beliefs that don’t suit us seems like common sense. Research on how people rationalize their beliefs has spread to political science, medicine, neuroscience, and the law, and is one of the cornerstones of our understanding of human psychology. But in 1957, at a time when the field of psychology was dominated by behaviorism, the notion was far more controversial. Luckily, Leon Festinger and his colleagues and students conducted numerous experiments that tested predictions derived from Cognitive Dissonance Theory that could not be accounted for by behaviorist principles.

One of my favorites among these experiments (PDF), published by Elliot Aronson and Judson Mills in 1959, had college women reading obscene words out loud (words so obscene that I don’t feel comfortable writing them here myself, but the F word is in there, as is a four-letter word that also means rooster, and remember, this was 1959!). The women were reading these words as an initiation to get into a discussion group about the psychology of sex – they had to prove they were not going to be too embarrassed to take part in the conversation. This was the “severe initiation” condition. Another group of women recited a milder list of words (e.g., prostitute, virgin); this was the “mild initiation” condition. The women then heard a recording of a discussion by the group to which they had gained entry – as it turned out, the discussion was, according to the study’s authors, “one of the most worthless and uninteresting discussions imaginable.”  The question was which group of women would like the psychology of sex discussion group more, the ones who had to undergo the severe initiation or the mild one?

To find out, pay a visit to Nussbaum’s blog.


Posted in Blogroll, Classic Experiments, Social Psychology

Carol Tavris Interview – Podcast

Posted by The Situationist Staff on April 10, 2012

From For Good Reason:

Carol Tavris describes dissonance theory and how self-justification and self-deception often keep people from changing their minds even in the light of compelling contrary evidence, because the evidence is often dissonant with one’s self-image. She details the implications of dissonance theory for the persistence of psychic charlatans and other peddlers of the paranormal, and how it may explain how someone like Sylvia Browne can live with herself, and also how it may explain how believers remain so gullible about such unsupportable claims. She describes confirmation bias as a component of dissonance theory. She talks about how dissonance theory applies to the skeptic movement, both in terms of suggesting the best strategies for engaging the credulous, and in terms of fostering skepticism about one’s own skeptical views. And she argues that skepticism should be affirmative rather than destructive in its approach, and focused on critical and creative thinking alike.


Posted in Book, Ideology, Illusions, Marketing, Podcasts, Social Psychology

Situationist Political Science and the Situation of Voters

Posted by The Situationist Staff on July 14, 2010

Joe Keohane wrote an outstanding article, “How Facts Backfire: Researchers discover a surprising threat to democracy: our brains,” for the Boston Globe last week.  Here are some excerpts.

* * *

It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. . . . Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.

“Area Man Passionate Defender Of What He Imagines Constitution To Be,” read a recent Onion headline. Like the best satire, this nasty little gem elicits a laugh, which is then promptly muffled by the queasy feeling of recognition. The last five decades of political science have definitively established that most modern-day Americans lack even a basic understanding of how their country works. In 1996, Princeton University’s Larry M. Bartels argued, “the political ignorance of the American voter is one of the best documented data in political science.”

On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example is a study led in 2000 by James Kuklinski of the University of Illinois at Urbana-Champaign, an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare – the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct – but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

What’s going on? How can we have things so wrong, and be so sure that we’re right?

* * *

To read the rest of the article, including Keohane’s answers to those questions, click here.

For a sample of related Situationist posts, see “The Situation of Presidential Death Threats,” “Voting for a Face,” “The Situation of Swift-Boating,” “Implicit Associations in the 2008 Presidential Election,” “The Situation of Political Animals,” “Your Brain on Politics,” “The Situation of the Obama Presidency and Race Perceptions,” “Racial Attitudes in the Presidential Race,” “The Racial Situation of Voting,” “The Interior Situation of Undecided Voters,” “On Being a Mindful Voter,” and “What does an Obama victory mean?”

Posted in Choice Myth, Conflict, Cultural Cognition, Deep Capture, Education, Ideology, Naive Cynicism, Politics

Market Manipulation – Assuaging Cognitive Dissonance

Posted by The Situationist Staff on March 15, 2009

From Wikipedia:

Cognitive dissonance is an uncomfortable feeling caused by holding two contradictory ideas simultaneously. The “ideas” or “cognitions” in question may include attitudes and beliefs, and also the awareness of one’s behavior. The theory of cognitive dissonance proposes that people have a motivational drive to reduce dissonance by changing their attitudes, beliefs, and behaviors, or by justifying or rationalizing their attitudes, beliefs, and behaviors. Cognitive dissonance theory is one of the most influential and extensively studied theories in social psychology.

Dissonance normally occurs when a person perceives a logical inconsistency among his or her cognitions. This happens when one idea implies the opposite of another. For example, a belief in animal rights could be interpreted as inconsistent with eating meat or wearing fur. Noticing the contradiction would lead to dissonance, which could be experienced as anxiety, guilt, shame, anger, embarrassment, stress, and other negative emotional states. When people’s ideas are consistent with each other, they are in a state of harmony, or consonance. If cognitions are unrelated, they are categorized as irrelevant to each other and do not lead to dissonance.

A powerful cause of dissonance is when an idea conflicts with a fundamental element of the self-concept, such as “I am a good person” or “I made the right decision.” The anxiety that comes with the possibility of having made a bad decision can lead to rationalization, the tendency to create additional reasons or justifications to support one’s choices. A person who just spent too much money on a new car might decide that the new vehicle is much less likely to break down than his or her old car. This belief may or may not be true, but it would likely reduce dissonance and make the person feel better. Dissonance can also lead to confirmation bias, the denial of disconfirming evidence, and other ego defense mechanisms.

Smokers tend to experience cognitive dissonance because it is widely accepted that cigarettes cause lung cancer, yet virtually everyone wants to live a long and healthy life. In terms of the theory, the desire to live a long life is dissonant with the activity of doing something that will most likely shorten one’s life. The tension produced by these contradictory ideas can be reduced by quitting smoking, denying the evidence of lung cancer, or justifying one’s smoking. For example, smokers could rationalize their behavior by concluding that only a few smokers become ill, that it only happens to very heavy smokers, or that if smoking does not kill them, something else will.

* * *

This case of dissonance could also be interpreted in terms of a threat to the self-concept.  The thought, “I am increasing my risk of lung cancer” is dissonant with the self-related belief, “I am a smart, reasonable person who makes good decisions.” Because it is often easier to make excuses than it is to change behavior, dissonance theory leads to the conclusion that humans are rationalizing and not always rational beings.

* * *

Posted in Food and Drug Law, Marketing, Social Psychology, Video

 