The Situationist

Archive for the ‘Social Psychology’ Category

Wegstock #6 – Kurt Gray

Posted by The Situationist Staff on August 6, 2013

In 2011, a conference honoring Dan Wegner, “Wegstock,” was held at Harvard University.

Speakers included Dan Gilbert, Susan Fiske, Tim Wilson, Jon Haidt, Henk Aarts, Nick Epley, Bill Swann, Todd Heatherton, Thalia Wheatley, Ap Dijksterhuis, Jon Krosnick, Jerry Clore, Bill Crano, Robin Vallacher, Jamie Pennebaker, Jonathan Schooler and Dan Wegner.

The talks are brief and well worth watching. We will highlight the individual talks, roughly 15 minutes each, over the next month.

In this video, Situationist friend Kurt Gray discusses his research and how Dan Wegner helped to shape it.

To review a collection of Situationist posts discussing Dan Wegner’s research, click here. To review a collection of posts regarding Kurt Gray’s work, click here.

Posted in Social Psychology, Video | Leave a Comment »

Wegstock #5 – Henk Aarts

Posted by The Situationist Staff on August 2, 2013

In 2011, a conference honoring Dan Wegner, “Wegstock,” was held at Harvard University.

Speakers included Dan Gilbert, Susan Fiske, Tim Wilson, Jon Haidt, Henk Aarts, Nick Epley, Bill Swann, Todd Heatherton, Thalia Wheatley, Ap Dijksterhuis, Jon Krosnick, Jerry Clore, Bill Crano, Robin Vallacher, Jamie Pennebaker, Jonathan Schooler and Dan Wegner.

The talks are brief and well worth watching. We will highlight the individual talks, roughly 15 minutes each, over the next month.

In this video, Henk Aarts describes a white dog, a red car, his research and how Dan Wegner helped to shape it.

To review a collection of Situationist posts discussing Dan Wegner’s research, click here.

Posted in Social Psychology, Video | Leave a Comment »

Wegstock #4 – Tim Wilson

Posted by The Situationist Staff on July 29, 2013

In 2011, a conference honoring Dan Wegner, “Wegstock,” was held at Harvard University.

Speakers included Dan Gilbert, Susan Fiske, Tim Wilson, Jon Haidt, Henk Aarts, Nick Epley, Bill Swann, Todd Heatherton, Thalia Wheatley, Ap Dijksterhuis, Jon Krosnick, Jerry Clore, Bill Crano, Robin Vallacher, Jamie Pennebaker, Jonathan Schooler and Dan Wegner.

The talks are brief and well worth watching. We will highlight the individual talks, roughly 15 minutes each, over the next month.

In this video, Situationist Contributor Timothy Wilson discusses the field, his research, and a little bit about his book, Redirect.

To review a collection of Situationist posts discussing Dan Wegner’s research, click here. Click here for Situationist posts about Tim Wilson’s research.

Posted in Situationist Contributors, Social Psychology, Video | Leave a Comment »

Wegstock #3 – Jon Haidt

Posted by The Situationist Staff on July 26, 2013

In 2011, a conference honoring Dan Wegner, “Wegstock,” was held at Harvard University.

Speakers included Dan Gilbert, Susan Fiske, Tim Wilson, Jon Haidt, Henk Aarts, Nick Epley, Bill Swann, Todd Heatherton, Thalia Wheatley, Ap Dijksterhuis, Jon Krosnick, Jerry Clore, Bill Crano, Robin Vallacher, Jamie Pennebaker, Jonathan Schooler and Dan Wegner.

The talks are brief and well worth watching. We will highlight the individual talks, roughly 15 minutes each, over the next month.

In this video, Jon Haidt describes (quite hilariously at times) his research and how Dan Wegner helped to shape it.

To review a collection of Situationist posts discussing Dan Wegner’s research, click here. Click here for Situationist posts about Jon Haidt’s research.

Posted in Social Psychology, Video | Leave a Comment »

Wegstock #2 – Susan Fiske

Posted by The Situationist Staff on July 24, 2013

In 2011, a conference honoring Dan Wegner, “Wegstock,” was held at Harvard University.

Speakers included Dan Gilbert, Susan Fiske, Tim Wilson, Jon Haidt, Henk Aarts, Nick Epley, Bill Swann, Todd Heatherton, Thalia Wheatley, Ap Dijksterhuis, Jon Krosnick, Jerry Clore, Bill Crano, Robin Vallacher, Jamie Pennebaker, Jonathan Schooler and Dan Wegner.

The talks are brief and well worth watching. We will highlight the individual talks, roughly 15 minutes each, over the next month.

In this video, Situationist friend Susan Fiske describes aspects of her scholarship and how Dan Wegner inspired them.

To review a collection of Situationist posts discussing Dan Wegner’s research, click here.

Posted in Social Psychology, Video | Leave a Comment »

Wegstock #1 – Dan Gilbert’s Opening Remarks

Posted by The Situationist Staff on July 21, 2013

In 2011, a conference honoring Dan Wegner, “Wegstock,” was held at Harvard University.

Speakers included Dan Gilbert, Susan Fiske, Tim Wilson, Jon Haidt, Henk Aarts, Nick Epley, Bill Swann, Todd Heatherton, Thalia Wheatley, Ap Dijksterhuis, Jon Krosnick, Jerry Clore, Bill Crano, Robin Vallacher, Jamie Pennebaker, Jonathan Schooler and Dan Wegner.

The talks are brief and well worth watching. We will highlight the individual talks, roughly 15 minutes each, over the next month.

In this video, Situationist friend Dan Gilbert opens the conference.

To review a collection of Situationist posts discussing Dan Wegner’s research, click here.

Posted in Social Psychology, Video | Leave a Comment »

The Situation of Secret Pleasures (more on Dan Wegner’s Work)

Posted by The Situationist Staff on July 18, 2013


This excerpt, which highlights some of the remarkable work by the late Dan Wegner, comes from an article written by Eric Jaffe in a 2006 edition of the APS’s Observer:

“Freud’s Fundamental Rule of Psychoanalysis was for patients to be completely open with a therapist no matter how silly or embarrassing the thought,” says Anita Kelly, a researcher at the University of Notre Dame who published one of the first books on the formal study of secrets, The Psychology of Secrets, in 2002.

Only since the late 1980s and early 1990s have researchers like Daniel Wegner and James Pennebaker put Freud through the empirical wringer and begun to understand the science behind secrets. “The Freudian way of thinking about things was, he assumed suppression took place and looked at what was happening afterwards,” says Wegner, a psychologist at Harvard. “The insight we had was, let’s not wait until after the fact and assume it occurred, let’s get people to try to do it and see what happened. That turned out to be useful insight; it opened this up to experimental research. It became a lab science instead of an after-the-fact interpretation of peoples’ lives.”

For Wegner, an interest in secrets began with a white bear. In Russian folklore attributed to Dostoevsky, Tolstoy, or sometimes both, a man tells his younger brother to sit in the corner and not think of a white bear, only to find later that the sibling can think of nothing else. If a meaningless white bear can arouse such frustration, imagine the crippling psychological effects of trying not to think of something with actual importance when the situation requires silence — running into the wife of a friend who has a mistress, being on a jury and having to disregard a stunning fact, or hiding homosexuality in a room full of whack-happy wiseguys.

So in 1987, Wegner, who at that time was at Trinity University, published a paper in the Journal of Personality and Social Psychology discussing what happens when research subjects confront the white bear in a laboratory. In the study, subjects entered a room alone with a tape recorder and reported everything that came to mind in a five-minute span. Before the experiment, Wegner told some subjects to think of anything except a white bear, and told others to try to think of a white bear. Afterwards, the subjects switched roles. Any time a subject mentioned, or merely thought of, a white bear, he or she had to ring a bell inside the room.

It was not quite Big Ben at noon, but those who suppressed the white bear rang the bell once a minute — more often than subjects who were free to express the thought. More remarkably, Wegner found what he called the “rebound effect”: When a subject was released from suppression and told to express a hidden thought, it poured out with greater frequency than if it had been mentionable from the start. (Think fresh gossip.) He also found evidence for an insight called “negative cuing.” The idea is that a person trying to ditch a thought will look around for something to displace it — first at the ceiling fan, then a candle, then a remote control. Soon the mind forms a latent bond between the unwanted thought and the surrounding items, so that everything now reminds the person of what he is trying to forget, exacerbating the original frustration.

“People will tend to misread the return of unwanted thoughts,” Wegner said recently. “We don’t realize that in keeping it secret we’ve created an obsession in a jar.” Wegner told the story of a suicidal student who once called him for help. Desperate to keep her on the phone, but lacking any clinical training, Wegner mentioned the white bear study. Slowly the student realized she had perpetuated a potentially fleeting thought by trying to avoid it. “She got so twisted up in the fact that she couldn’t stop thinking of killing herself, that she was making it come back to mind. She was misreading this as, there’s some part of me that wants to do it. What she really wanted was to get rid of the thought.”

One method of diverting attention from an unwanted thought, says Wegner, is to focus on a single distraction from the white bear, like a red Volkswagen, an idea that he tested successfully in later experiments. The concern with this technique, which Freud first laid out, is that a person could become obsessed with an arbitrary item, planting the seeds for abnormal behavior. In a later experiment, published in 1994 in the same journal, Wegner found more evidence that secrets lead to strange obsession. He placed four subjects who had never met around a table, split them into two male-female teams, and told them to play a card game. One team was instructed to play footsie without letting the other team know. At the end of the experiment, the secret footsie-players felt such a heightened attraction toward one another that the experimenters made them leave through separate doors, for ethical reasons. “We can end up being in a relationship we don’t want, or interested in things that aren’t at all important, because we had to keep them quiet,” Wegner said, “and it ends up growing.”

Live Free or Die

The logical opposite of an unhealthy obsession based on secrets is a healthy result from disclosing such secrets. This healing aspect of revelation is where Wegner’s work connects with James Pennebaker’s. In the late 1970s, Pennebaker was part of a research team that found, via survey, that people who had a traumatic sexual experience before age 17 were more likely to have health problems as they got older. Pennebaker looked further and found that the majority of these people had kept the trauma hidden, and in 1984 he began the first of many studies on the effects of revealing previously undisclosed secrets.

In most of Pennebaker’s experiments, subjects visited a lab for three or four consecutive days, each time writing about traumatic experiences for 15 or 20 minutes. In the first five years, hundreds of people poured their secrets onto the page. A college girl who knew her father was seeing his secretary; a concentration camp survivor who had seen babies tossed from a second-floor orphanage window; a Vietnam veteran who once shot a female fighter in the leg, had sex with her, then cut her throat. By the end of the experiment, many participants felt such intense release that their handwriting became freer and loopier. In one study of 50 students, those who revealed both a secret and their feelings visited the health center significantly fewer times in the ensuing six months than other students who had written about a generic topic, or those who had only revealed the secret and not the emotions surrounding it.

The work led to many papers showing evidence that divulging a secret, which can mean anything from telling someone to writing it on a piece of paper that is later burned, is correlated with tangible health improvements, both physical and mental. People hiding traumatic secrets showed more incidents of hypertension, influenza, even cancer, while those who wrote about their secrets showed, through blood tests, enhanced immune systems. In some cases, T-cell counts in AIDS patients increased. In another test, Pennebaker showed that writing about trauma actually unclogs the brain. Using an electroencephalogram, an instrument that measures brain waves through electrodes attached to the scalp, he found that the right and left brains communicated more frequently in subjects who disclosed traumas.

(It should be noted that the type of secrets discussed in this article are personal secrets—experiences a person chooses not to discuss with others. They can be positive, in the case of hiding a birthday cake, or negative, in the case of hiding a mistress. Secrets that could be considered “non-personal,” for example, information concealed as part of a job, were not specifically addressed.)

Exactly why revelation creates such health benefits is a complicated question. “Most people in psychology have been trained to think of a single, parsimonious explanation for an event,” said Pennebaker, who did much of his research at Southern Methodist University before coming to the University of Texas, where he is chair of the psychology department. “Well, welcome to the real world. There are multiple levels of explanation here.” Pennebaker lists a number of reasons for the health improvements. Writing about a secret helps label and organize it, which in turn helps the writer understand features of the secret that had been ignored. Revelation can become habitual in a positive sense, making confrontation normal. Disclosure can reduce rumination and worry, clearing the mental quagmires that hinder social relationships. People become better listeners. They even become better sleepers. “The fact is that all of us occasionally are dealing with experiences that are hard to talk about,” Pennebaker said. “Getting up and putting experiences into words has a powerful effect.”

At the end of a recent Sopranos episode, Vito looks most content after seeing a New Hampshire license plate, with its state motto: “Live free or die.” Pennebaker’s research may add a new level of truth to that phrase.

Little Machiavellis

In the early 1990s, it was not unusual for 3-year-old Jeremy Peskin to want a cookie. His mother, Joan, used to hide them in the high cupboards of their home in Toronto; when she left, Jeremy would climb up and sneak a few. One day, Jeremy had a problem: He wanted a cookie, but his mother was in the kitchen. “He said to me, ‘Go out of the kitchen, because I want to take a cookie,’ ” Joan recalled recently. Unfortunately for Jeremy, Joan Peskin was a doctoral student in psychology at the time, and smart enough to see through the ruse. Fortunately for developmental researchers, Peskin’s experience led her to study when children first develop the capacity for secrets.

What interested Peskin, now a professor at the University of Toronto, was Jeremy’s inability to separate his mother’s physical presence from her mental state. If she was out of the room, he would be able to take a cookie, whether or not his mother knew that he intended to take a cookie. Peskin took this insight to the laboratory — in this case, local day-care centers — where she tried to get children age three, four, and five to conceal a secret. She showed the children two types of stickers. The first, a gaudy, glittery sticker, aroused many a tiny smile; the second, a drab, beige sticker of an angel, was disliked. Then she introduced a mean puppet and explained that this puppet would take whatever sticker the children wanted most. When the puppet asked 4- and 5-year-olds which sticker they wanted, most of the children either lied or would not tell. The 3-year-olds almost always blurted out their preference, even when the scenario was repeated several times, she found in the study, which was published in Developmental Psychology in 1992. Often the 3-year-olds grabbed at the shiny sticker as the puppet took it away, showing a proper understanding of the situation but an inability to prevent it via secretive means.

The finding goes beyond secrets; 4 has become the age when psychologists think children develop the ability to understand distinct but related inner and outer worlds. “When I teach it I put a kid on the overhead with a thought bubble inside,” Peskin said. “When they could think of someone else’s mental state — say, ignorance, somebody not knowing something — that influences their social world.” In a follow-up study published in Social Development in 2003, Peskin found again that 3-year-olds were more likely than 4- or 5-year-olds to reveal the location of a surprise birthday cake to a hungry research confederate. “When a child is able to keep a secret,” Peskin says, “parents should take it as, that’s great, this is normal development. They aren’t going to be little Machiavellis. This is normal brain development.”

Confidence in Confidants

Soon after Mark Felt revealed himself as Deep Throat, the anonymous source who guided Bob Woodward during the Watergate scandal, Anita Kelly’s phone began to ring. “One morning I had 10 messages from different news groups,” she recalled recently. “They wanted me to say that secrecy’s a bad thing, and I’d say, look, there’s no evidence. This guy’s in his early 90s, and has seemed to have a healthy life.”

When preparing The Psychology of Secrets, Kelly re-examined the consequences and benefits of secret-keeping, and began to believe that while divulging secrets improves health, concealing them does not necessarily cause physical problems. “I couldn’t find any evidence that keeping a secret makes a person sick,” Kelly said. “There is evidence that by writing about held-back information someone will get health benefits. Someone keeping a secret would miss out on those benefits. It’s not the same as saying if you keep a secret you’re going to get sick.”

Her latest work, in press at the Journal of Personality, challenged the notion that secret-keeping can cause sickness. Instead of merely looking at instances of sickness nine weeks after disclosure, Kelly and co-author Jonathan Yip adjusted their measurements for initial levels of health. They found, quite simply, that secretive people also tend to be sick people, both now and two months down the line.

“It doesn’t look like the process of keeping the secret made them sick,” she said. High “self-concealers,” as Kelly calls them, tend to be more depressed, anxious, and shy, and have more aches and pains by nature, perhaps suggesting some natural link between being secretive and being vulnerable to illness. “I don’t think it’s much of a stretch to say that being secretive could be linked to being symptomatic at a biological level.”

This conclusion came gradually. In the mid-1990s, following Pennebaker’s line of research that had really opened up the field, Kelly focused on the health effects of revealing and concealing secrets. The research clearly showed links between secrets and illness. In a review of the field for Current Directions in Psychological Science in 1999, Kelly notes some of these health correlations: cases in which breast cancer patients who talked about their concealed emotions survived almost twice as long as those who did not; students who wrote about private traumatic events showed higher antibody levels four and six months after a Hepatitis B vaccination; and gay men who concealed their sexuality had a higher rate of cancer and infectious disease.

But in 1998 she did a study asking patients about their relationships with their therapists. She found that 40 percent of them were keeping a secret, but generally felt no stress as a result. Kelly began to believe that some secrets can be kept successfully, and that, in some scenarios, disclosing a secret could cause more problems than it solves. Psychologists, she felt, were not paying enough attention to the situations in which disclosure should occur — only that it did. “The essence of the problem with revealing personal information is that revealers may come to see themselves in undesirable ways if others know their stigmatizing secrets,” she wrote in the 1999 paper.

John Caughlin, a professor of communication at the University of Illinois at Urbana-Champaign who has studied secrets, agrees that sometimes openness is not the best policy. “People are so accustomed to saying an open relationship is a good one, that if they have secrets it can make them feel that something’s wrong,” he said recently. In 2005, Caughlin published a paper in Personal Relationships suggesting that people have a poor ability to forecast how they will feel after revealing a secret, and how another person will respond to hearing it. “I’m not touting that people should keep a lot of secrets,” he said, “but I don’t think people should assume it’s bad, and I think they do.” In her new book, Anatomy of a Secret Life, published in April, Gail Saltz, a professor of psychiatry at Cornell Medical School, referred to secrets as “benign” or “malignant,” depending on the scenario. “In teenagers, having secret identities is normal, healthy separation from parents and needs to go on,” said Saltz recently.

To address this concern, Kelly has focused her recent work on the role of confidants in the process of disclosure. She created a simple diagram advising self-concealers when they should, and when they should not, reveal a secret. On one hand, if the secret does not cause mental or physical stress, it should be kept, to provide a sense of personal boundary and avoid unnecessary social conflict. If it does cause anguish, the secret-keeper must then evaluate whether he or she has a worthy confidant, someone willing to work toward a cathartic insight. When such a confidant is not available, the person should write down his or her thoughts and feelings. “The world changes when you tell someone who knows all your friends,” said Kelly, who experienced this change firsthand 15 years back, when she shared with a colleague something “very personal and embarrassing,” as she called it, and then found her secret floating among her colleagues. “You have to think, what are the implications with my reputation,” she said. “It’s more complicated once you have to reveal to someone.”

To review a collection of Situationist posts discussing Dan Wegner’s research, click here.

Posted in Emotions, Life, Morality, Social Psychology | 1 Comment »

Dan Wegner

Posted by The Situationist Staff on July 10, 2013


From Harvard Gazette:

Daniel M. Wegner, a pioneering social psychologist who helped to reveal the mysteries of human experience through his work on thought suppression, conscious will, and mind perception, died July 5 as a result of amyotrophic lateral sclerosis (ALS). He was 65.

The John Lindsley Professor of Psychology in Memory of William James, Wegner redefined social psychology as the science of human experience. He was arguably most famous for his experiments on thought suppression, in which people were unable to keep from thinking of a white bear.

Wegner also broke ground in other areas of social psychology, including transactive memory (how memories are distributed across groups and relationship partners) and action identification (what people think they are doing). He had also explored the experience of conscious will, and most recently focused on mind perception (how people perceive human and nonhuman minds).

“Dan was, I believe, the most original thinker in modern psychology,” said Dan Gilbert, the Edgar Pierce Professor of Psychology, who knew Wegner for three decades. “Most of us work on problems that are important in our field, and we use theories others have invented to make progress. Dan didn’t make progress — Dan made new highways, new roads. He opened doors in walls that we didn’t know had doors in them, and he did this over and over.”

Gilbert said he was privileged to call Wegner one of his closest friends. The two met while they both worked in Texas — Gilbert at the University of Texas and Wegner at Trinity University.

“Being among the few social psychologists in Texas, we were introduced by a mutual friend, and it was love at first sight,” Gilbert said. “We’ve been true friends ever since.” He added, “I’m heartbroken to lose my friend of 30 years, but I guess the only thing worse would have been not to have a friend of 30 years.”

While Wegner was known for his pioneering work on the mind, Gilbert said his intellectual curiosity seemed never to rest.

“The thing about Dan is he didn’t take the lab coat off,” Gilbert said. “For him, being a psychologist wasn’t a job, it was a way of being. He simply spent all his waking time thinking about the interesting aspects of the mind. It was 24/7 for him.”

That intellectual heft, however, never masked Wegner’s humor.

“Dan Wegner was the funniest human being I’ve ever known, and everybody else was a distant second,” Gilbert said. “To say someone was funny may sound frivolous, but I would make the claim that Dan understood something important, which is that humor is the place where intelligence and joy meet. Dan understood that … humor is where a brilliant mind tickles itself.”

That sense of humor, Gilbert said, often showed up in Wegner’s writing, and helped transform the way social psychology is described in many journals today.  “If you open a psychology journal now,” he said, “many, many people write in a Wegner-esque style.”

Even in his final days, Gilbert said, Wegner’s restless mind faced the challenge of his death with an inspirational degree of curiosity.

“It was a privilege to sit by his side as he took this journey to the end,” Gilbert said. “About a month ago, I asked him, ‘If you had to think of one word to describe this experience, what would it be?’ He looked at me, and he said ‘fascinating.’ He was a student of the human experience, and he was having an experience unlike most of us ever have. And rather than bemoaning it or crying about it, he took it as another fascinating thing to study and learn about and think about.”

Born in Calgary, Alberta, Canada, Wegner studied as an undergraduate and graduate student at Michigan State University, earning his Ph.D. in 1974. He was appointed an assistant professor at Trinity University in San Antonio, where he rose to full professor and chair of the psychology department.

Wegner joined the faculty in the psychology department at the University of Virginia in 1990, where he was the William R. Kenan Jr. Professor of Psychology before joining the Harvard faculty in 2000.

Wegner was the author of four academic books, an introductory psychology textbook, and nearly 150 journal articles and book chapters.

Wegner’s research was funded by the National Science Foundation and the National Institute of Mental Health. In 1996-1997 he was a fellow at the Center for Advanced Study in the Behavioral Sciences, and in 2011 was inducted as a fellow of the American Academy of Arts and Sciences.  He received many of the top honors in his field, including the William James Fellow Award from the Association for Psychological Science, the Distinguished Scientific Contribution Award from the American Psychological Association, the Distinguished Scientist Award from the Society of Experimental Social Psychology, and the Donald T. Campbell Award from the Society for Personality and Social Psychology.

Wegner is survived by his wife of 29 years, Toni Giuliano Wegner of Winchester, and his daughters, Kelsey Wegner Hurlburt of Dunkirk, Md., and Haley Wegner of Winchester. At Wegner’s request, his body was donated to the Massachusetts General Hospital’s Neurological Clinical Research Institute for ALS Research.

A memorial service will be held at 4 p.m. on Saturday at the Winchester Unitarian Society, 478 Main St., Winchester, Mass. Wegner requested that his service be a celebration of life, and so would welcome Hawaiian shirts.

In lieu of flowers, donations can be made to:

Compassionate Care ALS
P.O. Box 1052
West Falmouth, Mass. 02574

Winchester Unitarian Society
478 Main Street
Winchester, Mass. 01890

To review a collection of Situationist posts discussing Dan Wegner’s research, click here.

Posted in Illusions, Life, Social Psychology | 1 Comment »

The Psychological Situation of Markets

Posted by The Situationist Staff on July 8, 2013


From Caltech News (by Marcus Woo):

When it comes to economics versus psychology, score one for psychology.

Economists argue that markets usually reflect rational behavior—that is, the dominant players in a market, such as the hedge-fund managers who make billions of dollars’ worth of trades, almost always make well-informed and objective decisions. Psychologists, on the other hand, say that markets are not immune from human irrationality, whether that irrationality is due to optimism, fear, greed, or other forces.

Now, a new analysis published the week of July 1 in the online issue of the Proceedings of the National Academy of Sciences (PNAS) supports the latter case, showing that markets are indeed susceptible to psychological phenomena. “There’s this tug-of-war between economics and psychology, and in this round, psychology wins,” says Colin Camerer, the Robert Kirby Professor of Behavioral Economics at the California Institute of Technology (Caltech) and the corresponding author of the paper.

Indeed, it is difficult to claim that markets are immune to apparent irrationality in human behavior. “The recent financial crisis really has shaken a lot of people’s faith,” Camerer says. Despite the faith of many that markets would organize allocations of capital in ways that are efficient, he notes, the government still had to bail out banks, and millions of people lost their homes.

In their analysis, the researchers studied an effect called partition dependence, in which breaking down—or partitioning—the possible outcomes of an event in great detail makes people think that those outcomes are more likely to happen. The reason, psychologists say, is that providing specific scenarios makes them more explicit in people’s minds. “Whatever we’re thinking about, seems more likely,” Camerer explains.

For example, if you are asked to predict the next presidential election, you may say that a Democrat has a 50/50 chance of winning and a Republican has a 50/50 chance of winning. But if you are asked about the odds that a particular candidate from each party might win—for example, Hillary Clinton versus Chris Christie—you are likely to envision one of them in the White House, causing you to overestimate his or her odds.

The researchers looked for this bias in a variety of prediction markets, in which people bet on future events. In these markets, participants buy and sell claims on specific outcomes, and the prices of those claims—as set by the market—reflect people’s beliefs about how likely it is that each of those outcomes will happen. Say, for example, that the price for a claim that the Miami Heat will win 16 games during the NBA playoffs is $6.50 for a $10 return. That means that, in the collective judgment of the traders, Miami has a 65 percent chance of winning 16 games.
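The price-to-probability arithmetic in the Miami Heat example above can be sketched in a few lines of Python (the function name is our own and the numbers simply restate the article's illustration, not data from the study):

```python
# A prediction-market claim that pays out $10 if the event occurs, trading
# at $6.50, implies that traders collectively assign the event a 65 percent
# chance: implied probability = price of the claim / payout if it pays off.

def implied_probability(price: float, payout: float = 10.0) -> float:
    """Return the probability implied by a claim's market price."""
    return price / payout

# The Miami Heat example from the text: $6.50 for a $10 return.
print(implied_probability(6.50))  # 0.65
```

The same division works for any payout size; only the ratio of price to payout carries the probability information.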

The researchers created two prediction markets via laboratory experiments and studied two others in the real world. In one lab experiment, which took place in 2006, volunteers traded claims on how many games an NBA team would win during the 2006 playoffs and how many goals a team would score in the 2006 World Cup. The volunteers traded claims on 16 teams each for the NBA playoffs and the World Cup.

In the basketball case, one group of volunteers was asked to bet on whether the Miami Heat would win 4–7 playoff games, 8–11 games, or some other range. Another group was given a range of 4–11 games, which combined the two intervals offered to the first group. Then, the volunteers traded claims on each of the intervals within their respective groups. As with all prediction markets, the price of a traded claim reflected the traders’ estimations of whether the total number of games won by the Heat would fall within a particular range.

Economic theory says that the first group’s perceived probability of the Heat winning 4–7 games and its perceived probability of winning 8–11 games should add up to a total close to the second group’s perceived probability of the team winning 4–11 games. But when the researchers added the numbers up, they found instead that the first group’s combined estimate for the team winning 4–7 or 8–11 games was higher than the second group’s estimate for the team winning 4–11 games. All of this suggests that framing the possible outcomes in terms of more specific intervals caused people to think that those outcomes were more likely.
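The additivity check described above can be sketched in a few lines. The interval labels and implied probabilities below are made-up numbers for illustration, not data from the study:

```python
# Implied probabilities (price / payout) for each traded interval.
# Group 1 saw the fine-grained partition; group 2 saw the combined one.
fine_partition = {"Heat win 4-7 games": 0.35, "Heat win 8-11 games": 0.30}
coarse_partition = {"Heat win 4-11 games": 0.50}

summed = sum(fine_partition.values())      # group 1's combined estimate
combined = sum(coarse_partition.values())  # group 2's estimate

# Standard economic theory predicts the two totals should be close;
# partition dependence shows up as the fine-grained sum coming out higher.
print(f"fine-grained sum: {summed:.2f}, combined: {combined:.2f}")
print("partition dependence?", summed > combined)
```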

The researchers observed similar results in a second, similar lab experiment, and in two studies of natural markets—one involving a series of 153 prediction markets run by Deutsche Bank and Goldman Sachs, and another involving long-shot horses in horse races.

People tend to bet more money on a long-shot horse, because of its higher potential payoff, and they also tend to overestimate the chance that such a horse will win. Statistically, however, a horse’s chance of winning a particular race is the same regardless of how many other horses it’s racing against—a horse that habitually wins just five percent of the time will continue to do so whether it is racing against fields of 5 or 11. But when the researchers looked at horse-race data from 1992 through 2001—a total of 6.3 million starts—they found that bettors were subject to the partition bias, believing that long-shot horses had higher odds of winning when they were racing against fewer horses.

While partition dependence has been looked at in the past in specific lab experiments, it hadn’t been studied in prediction markets, Camerer says. What makes this particular analysis powerful is that the researchers observed evidence for this phenomenon in a wide range of studies—short, well-controlled laboratory experiments; markets involving intelligent, well-informed traders at major financial institutions; and nine years of horse-racing data.

The title of the PNAS paper is “How psychological framing affects economic market prices in the lab and field.” In addition to Camerer, the other authors are Ulrich Sonnemann and Thomas Langer at the University of Münster, Germany, and Craig Fox at UCLA. Their research was supported by the German Research Foundation, the National Science Foundation, the Gordon and Betty Moore Foundation, and the Human Frontier Science Program.

Related Situationist Posts

Posted in Behavioral Economics, Social Psychology | Leave a Comment »

Independence Day: Celebrating Courage to Challenge the Situation

Posted by The Situationist Staff on July 3, 2013

First Published on July 3, 2007:

Battle of Lexington

With the U.S. celebrating Independence Day — carnivals, fireworks, BBQs, parades and other customs that have, at best, only a tangential connection to our “independence” — we thought it an opportune moment to return to its source in search of some situationism. No doubt, the Declaration of Independence is typically thought of as containing a dispositionist message (though few would express it in those terms) — all that language about individuals freely pursuing their own happiness. Great stuff, but arguably built on a dubious model of the human animal.

Declaration of Independence

That’s not the debate we want to provoke here. Instead, we are interested in simply highlighting some less familiar language in that same document that reveals something special about the mindset and celebrated courage of those behind the colonists’ revolt. Specifically, as Thomas Jefferson penned, “all experience hath shewn that mankind are more disposed to suffer, while evils are sufferable than to right themselves by abolishing the forms to which they are accustomed.”

Part of what made the July 4th heroes heroic, in our view, was their willingness to break from that disposition to suffer evils. They reacted, mobilized, strategized, resisted, and fought because they recognized that their suffering was not legitimate — a conclusion that many in the U.S. and abroad vehemently rejected.

Situationist contributor John Jost has researched and written extensively about a related topic — the widespread tendency to justify existing systems of power despite any unfair suffering that they may entail. As he and his co-authors recently summarized:

Whether because of discrimination on the basis of race, ethnicity, religion, social class, gender, or sexual orientation or because of policies and programs that privilege some at the expense of others, or even because of historical accidents, genetic disparities, or the fickleness of fate, certain social systems serve the interests of some stakeholders better than others. Yet historical and social scientific evidence shows that most of the time the majority of people – regardless of their own social class or position – accept and even defend the legitimacy of their social and economic systems and manage to maintain a “belief in a just world.”

If we truly want to emulate and celebrate the “founding fathers” of this republic, perhaps we should begin by taking seriously the possibility that what “is” is not always what “ought to be.”

Happy Fourth!

* * *

To read a couple of related Situationist posts, see “Thanksgiving as “System Justification”?” and “Patriots Lose: Justice Restored!”

Posted in History, Ideology, Situationist Contributors, Social Psychology | Tagged: , , | Leave a Comment »

The Situation Cheating Students

Posted by The Situationist Staff on June 29, 2013

honor or cheating

From American Psychological Association (excerpts from an article by Amy Novotney):

More than half of teenagers say they have cheated on a test during the last year — and 34 percent have done it more than twice — according to a survey of 40,000 U.S. high school students released in February by the nonprofit Josephson Institute of Ethics. The survey also found that one in three students admitted they used the Internet to plagiarize an assignment.

The statistics don’t get any better once students reach college. In surveys of 14,000 undergraduates conducted over the past four years by Donald McCabe, PhD, a business professor at Rutgers University and co-founder of Clemson University’s International Center for Academic Integrity, about two-thirds of students admit to cheating on tests, homework and assignments. And in a 2009 study in Ethics & Behavior (Vol. 19, No. 1), researchers found that nearly 82 percent of a sample of college alumni admitted to engaging in some form of cheating as undergraduates.

Some research even suggests that academic cheating may be associated with dishonesty later in life. In a 2007 survey of 154 college students, Southern Illinois University researchers found that students who plagiarized in college reported that they viewed themselves as more likely to break rules in the workplace, cheat on spouses and engage in illegal activities (Ethics & Behavior, Vol. 17, No. 3). A 2009 survey, also by the Josephson Institute of Ethics, reports a further correlation: People who cheat on exams in high school are three times more likely to lie to a customer or inflate an insurance claim compared with those who never cheated. High school cheaters are also twice as likely to lie to or deceive their boss and one-and-a-half times more likely to lie to a significant other or cheat on their taxes.

Academic cheating, therefore, is not just an academic problem, and curbing this behavior is something that academic institutions are beginning to tackle head-on, says Stephen F. Davis, PhD, emeritus professor of psychology at Emporia State University and co-author of “Cheating in School: What We Know and What We Can Do” (Wiley-Blackwell, 2009). New research by psychologists seems to suggest that the best way to prevent cheating is to create a campus-wide culture of academic integrity.

“Everyone at the institution — from the president of the university and the board of directors right on down to every janitor and cafeteria worker — has to buy into the fact that the school is an academically honest institution and that cheating is a reprehensible behavior,” Davis says.

Why students cheat

The increasing amount of pressure on students to succeed academically — in efforts to get into good colleges, graduate schools and eventually to land good jobs — tends to be one of the biggest drivers of cheating’s proliferation. Several studies show that students who are more motivated than their peers by performance are more likely to cheat.

“What we show is that as intrinsic motivation for a course drops, and/or as extrinsic motivation rises, cheating goes up,” says Middlebury College psychology professor Augustus Jordan, PhD, who led a 2005 study on motivation to cheat (Ethics and Behavior, Vol. 15, No. 2). “The less a topic matters to a person, or the more they are participating in it for instrumental reasons, the higher the risk for cheating.”

Psychological research has also shown that dishonest behaviors such as cheating actually alter a person’s sense of right and wrong, so after cheating once, some students stop viewing the behavior as immoral. In a study published in March in Personality and Social Psychology Bulletin (Vol. 37, No. 3), for example, Harvard University psychology and organizational behavior graduate student Lisa Shu and colleagues conducted a series of experiments, one of which involved having undergraduates read an honor code reminding them that cheating is wrong and then providing them with a series of math problems and an envelope of cash. The more math problems they were able to answer correctly, the more cash they were allowed to take. In one condition, participants reported their own scores, which gave them an opportunity to cheat by misreporting. In the other condition, participants’ scores were tallied by a proctor in the room. As might be expected, several students in the first condition inflated their scores to receive more money. These students also reported a greater degree of cheating acceptance after participating in the study than they had prior to the experiment. Shu and her colleagues also found that, while those who read the honor code were less likely to cheat, the honor code did not eliminate all of the cheating.

“Our findings confirm that the situation can, in fact, impact behavior and that people’s beliefs flex to align with their behavior,” Shu says.

Another important finding is that while many students understand that cheating is against the rules, most still look to their peers for cues as to what behaviors and attitudes are acceptable, says cognitive psychologist David Rettinger, PhD, of the University of Mary Washington. Perhaps not surprisingly, he says, several studies suggest that seeing others cheat increases one’s tendency to cheat.

“Cheating is contagious,” says Rettinger. In his 2009 study with 158 undergraduates, published in Research in Higher Education (Vol. 50, No. 3), he found that direct knowledge of others’ cheating was the biggest predictor of cheating.

Even students at several U.S. military academies — where student honor codes are widely publicized and strictly enforced — aren’t immune from cheating’s contagion. A longitudinal study led by University of California, Davis, economist Scott Carrell, PhD, examined survey data gathered from the U.S. Military Academy at West Point, U.S. Naval Academy and U.S. Air Force Academy from 1959 through 2002. Carrell found that, thanks to peer effects, one new college cheater is “created” through social contagion for every two to three additional high school cheaters admitted to a service academy.

“This behavior is most likely transmitted through the knowledge that other students are cheating,” says Carrell, who conducted the study with James West, PhD, and Frederick Malmstrom, PhD, both of the Air Force Academy. “This knowledge causes students — particularly those who would not have otherwise — to cheat because they feel like they need to stay competitive and because it creates a social norm of cheating.”

Dishonesty prevention

Peer effects, however, cut both ways, and getting students involved in creating a culture of academic honesty can be a great way to curb cheating.

“The key is to create this community feeling of disgust at the cheating behavior,” says Rettinger. “And the best way to do that is at the student level.”

* * *

Teachers can also help diminish students’ impulse to cheat by explaining the purpose and relevance of every academic lesson and course assignment, says University of Connecticut educational psychologist Jason Stephens, PhD. According to research presented in 2003 by Stephens and later published in “The Psychology of Academic Cheating” (Elsevier, 2006), high school students cheat more when they see the teacher as less fair and caring and when their motivation in the course is more focused on grades and less on learning and understanding. In addition, in a 1998 study of cheating with 285 middle school students, Ohio State University educational psychologist Eric Anderman, PhD, co-editor with Tamara Murdock, PhD, of “The Psychology of Academic Cheating,” found that how teachers present the goals of learning in class is key to reducing cheating. Anderman showed that students who reported the most cheating perceive their classrooms as being more focused on extrinsic goals, such as getting good grades, than on mastery goals associated with learning for its own sake and continuing improvement (Journal of Educational Psychology, Vol. 90, No. 1).

“When students feel like assignments are arbitrary, it’s really easy for them to talk themselves into not doing it by cheating,” Rettinger says. “You want to make it hard for them to neutralize by saying, ‘This is what you’ll learn and how it’s useful to you.’”

At the college level in particular, it’s also important for institutional leaders to make fairness a priority by having an office of academic integrity to communicate to students and faculty that the university takes the issue of academic dishonesty seriously, says Tricia Bertram Gallant, PhD, academic integrity coordinator at the University of California, San Diego, and co-author with Davis of “Cheating in School.” . . .

* * *

There’s also evidence that focusing on honesty, trust, fairness, respect and responsibility and promoting practices such as effective honor codes can make a significant difference in student behaviors, attitudes and beliefs, according to a 1999 study by the Center for Academic Integrity. Honor codes seem to be particularly salient when they engage students, however. In Shu’s study on the morality of cheating, for example, she found that participants who passively read a generic honor code before taking a test were less likely to cheat on the math problems, though this step did not completely curb cheating. Among those who signed their names attesting that they’d read and understood the honor code, however, no cheating occurred.

“It was impressive to us how exposing participants to an honor code and really making morality salient in that situation basically eliminated cheating altogether,” she says.

Read entire article here.

Related Situationist posts:

Posted in Education, Morality, Social Psychology | Leave a Comment »

Dan Ariely on the Psychology of Cheating

Posted by The Situationist Staff on June 6, 2013

Behavioral economist Dan Ariely studies the bugs in our moral code: the hidden reasons we think it’s OK to cheat or steal (sometimes). Clever studies help make his point that we’re predictably irrational — and can be influenced in ways we can’t grasp.

Related Situationist posts:

Posted in Morality, Social Psychology, Video | 1 Comment »

The Relational Situation of Whistle-Blowing and Ethical Behavior

Posted by The Situationist Staff on May 30, 2013

whistleEarlier this week, NPR broadcast an excellent (situationist) story titled  “Why Do Whistle-Blowers Become Whistle-Blowers?” by David Greene and Shankar Vedantam.  In it, they discussed recent research by David Mayer and his co-authors (Mayer, D. M., Nurmohamed, S., Treviño, L. K., Shapiro, D. L., & Schminke, M. 2013. Encouraging employees to report unethical conduct internally: It takes a village. Organizational Behavior and Human Decision Processes, 121: 89-103).

Listen to their story by clicking here.

Related Situationist posts:

Posted in Morality, Social Psychology | Leave a Comment »

Dan Ariely Interviewed about the Situation of Cheating

Posted by The Situationist Staff on May 28, 2013

ariely honesty coverFor Time, Gary Belsky recently interviewed Dan Ariely about Ariely’s 2012 book, The (Honest) Truth About Dishonesty.  In the interview, Ariely discusses seven lessons about dishonesty.  Here are some excerpts.

1. Most of us are 98-percenters.

“A student told me a story about a locksmith he met when he locked himself out of the house. This student was amazed at how easily the locksmith picked his lock, but the locksmith explained that locks were really there to keep honest people from stealing. His view was that 1% of people would never steal, another 1% would always try to steal, and the rest of us are honest as long as we’re not easily tempted. Locks remove temptation for most people. And that’s good, because in our research over many years, we’ve found that everybody has the capacity to be dishonest and almost everybody is at some point or another.”

2. We’ll happily cheat … until it hurts.

“The Simple Model of Rational Crime suggests that the greater the reward, the greater the likelihood that people will cheat. But we’ve found that for most of us, the biggest driver of dishonesty is the ability to rationalize our actions so that we don’t lose the sense of ourselves as good people. In one of our matrix experiments [a puzzle-solving exercise Ariely uses in his work to measure dishonesty], the level of cheating didn’t change as the reward for cheating rose. In fact, the highest payout resulted in a little less cheating, probably because the amount of money got to be big enough that people couldn’t rationalize their cheating as harmless. Most people are able to cheat a little because they can maintain the sense of themselves as basically honest people. They won’t commit major fraud on their tax returns or insurance claims or expense reports, but they’ll cut corners or exaggerate here or there because they don’t feel that bad about it.”

3. It’s no wonder people steal from work.

“In one matrix experiment, we added a condition where some participants were paid in tokens, which they knew they could quickly exchange for real money. But just having that one step of separation resulted in a significant increase in cheating. Another time, we surveyed golfers and asked which act of moving a ball illegally would make other golfers most uncomfortable: using a club, their foot or their hand. More than twice as many said it would be less of a problem — for other golfers, of course — to use their club than to pick the ball up. Our willingness to cheat increases as we gain psychological distance from the action. So as we gain distance from money, it becomes easier to see ourselves as doing something other than stealing. That’s why many of us have no problem taking pencils or a stapler home from work when we’d never take the equivalent amount of money from petty cash. . . .”

4. Beware the altruistic crook.

“People are able to cheat more when they cheat for other people. In some experiments, people cheated the most when they didn’t benefit at all. This makes sense if our ability to be dishonest is increased by the ability to rationalize our behavior. If you’re cheating for the benefit of another entity, your ability to rationalize is enhanced. So yes, it’s easier for an accountant to see fudging on clients’ tax returns as something other than dishonesty. And it’s a concern within companies, since people’s altruistic tendencies allow them to cheat more when it benefits team members.”

5. One (dishonest) thing leads to another.

“Small dishonesties matter because they can lead to larger ones. Once you behave badly, at some point, you stop thinking of yourself as a good person at that level and you say, What the hell. This is something many people are familiar with in dieting. We’re disciplined until we lapse, and if we can’t think of ourselves as good people, then we figure we might as well enjoy it. And it happens with honesty as well. Cheaters too can start with one step. We conducted an experiment where participants were given designer sunglasses to wear and evaluate. Some were told their pair was authentic, others were told they were wearing fakes and others were given no information. Then, after they had been wearing their glasses for a while, we gave them matrices to solve. In all three groups, a significant portion of the participants reported solving a few more matrices than they actually had. Moderate cheating, as usual. But while 30% of the group wearing real designer sunglasses cheated, and slightly more, around 40%, of the people in the no-information group cheated, more than 70% of the group wearing the fakes exaggerated the number of matrices they solved. One moral violation leads to further immorality.”

6. Better to encourage honesty than discourage cheating.

“Most attempts to limit cheating come from a cost-benefit understanding of the problem. We think if we make the punishments harsh enough, people will cheat less. But there is no evidence that this approach works. Think of the death penalty. There is no evidence that it decreases crime. A better approach would be to ask, How can we help people stay honest? When we had an insurance company move the signature on a mileage reporting form from the bottom of the document to the top — so people were attesting that the information they were reporting was true before they filled out the form, rather than after — the amount of cheating went down by about 15%.”

7. Honesty is a state of mind.

“In one of our experiments, we split participants into two groups. We asked one group to try to recall the 10 Commandments and the other to recall 10 books they had read in high school. Then we had everyone do some matrices. What we found was that the people in the group who recalled books engaged in the same level of cheating as most people. But the participants in the group that tried to remember the 10 Commandments didn’t cheat at all. Small reminders of ethical standards can be very powerful.”

Read entire interview here.

Posted in Ideology, Positive Psychology, Social Psychology | Leave a Comment »

The Stereotyped Situation of Dumb Jocks

Posted by The Situationist Staff on May 4, 2013

dumb jockFrom Michigan State News:

College coaches who emphasize their players’ academic abilities may be the best defense against the effects of “dumb jock” stereotypes, a Michigan State University study suggests.

Researchers found that student-athletes were significantly more likely to be confident in the classroom if they believed their coaches expected high academic performance, not just good enough grades to be eligible for sports.

“Coaches spend a lot of time with their players, and they can play such an important role to build academic confidence in student-athletes,” said lead author Deborah Feltz, University Distinguished Professor of kinesiology at MSU.

Published in the Journal of College Student Development, the study focused on the concept of “stereotype threat.” The theory holds that stereotypes are self-fulfilling prophecies: They create anxiety in the stereotyped group, causing them to behave in the expected way.

Feltz and her graduate students wanted to see what factors influence student-athletes’ susceptibility to the “dumb jock” stereotype.

“It’s well-documented in the literature that many student-athletes hear prejudicial remarks from professors who say things like, ‘This test is easy enough that even an athlete could pass it,’” Feltz said. “They’re kind of the last group of students who can be openly discriminated against.”

The researchers surveyed more than 300 student-athletes representing men’s and women’s teams from small and large universities and a range of sports, from basketball and football to cross-country and rowing.

They found the more strongly student-athletes identified themselves as athletes, the less confident they were with their academic skills, and the more keenly they felt that others expected them to do poorly in school. Players in high-profile sports were more likely to feel they were weak students.

Feltz said the data suggest that coaches who put a premium on education may be in the best position to boost their players’ confidence in the classroom, but professors, academic advisers and classmates also have a part to play.

“They don’t have to do much,” she said. “It may be enough to just remind players they are college students, which is a big deal, you know? A lot of these students are the first in their family to go to college.”

Related Situationist posts:

Image by Les Stockton.

Posted in Implicit Associations, Positive Psychology, Situationist Sports, Social Psychology | Leave a Comment »

The Helpful Crisis in Psychology

Posted by The Situationist Staff on May 1, 2013

From The New Yorker, excerpts from an outstanding article:

According to the headlines, social psychology has had a terrible year—and, at any rate, a bad week. The New York Times Magazine devoted nearly seven thousand words to Diederik Stapel, the Dutch researcher who committed fraud in at least fifty-four scientific papers, while Nature just published a report about another controversy, questioning whether some well-known “social-priming” results from the social psychologist Ap Dijksterhuis are replicable. Dijksterhuis famously found that thinking about a professor before taking an exam improves your performance, while thinking about a soccer ruffian makes you do worse. Although nobody doubts that Dijksterhuis ran the experiment that he says he did, it may be that his finding is either weak, or simply wrong—perhaps the peril of a field that relies too heavily on the notion that if something is statistically likely, it can be counted on.

Things aren’t quite as bad as they seem, though. Although Nature’s report was headlined “Disputed results a fresh blow for social psychology,” it scarcely noted that there have been some replications of experiments modelled on Dijksterhuis’s phenomenon. His finding could still turn out to be right, if weaker than first thought. More broadly, social priming is just one thread in the very rich fabric of social psychology. The field will survive, even if social priming turns out to have been overrated or an unfortunate detour.

Even if this one particular line of work is under a shroud, it is important not to lose sight of the fact that many of the old standbys from social psychology have been endlessly replicated, like the Milgram effect—the old study of obedience in which subjects turned up electrical shocks (or what they thought were electrical shocks) all the way to four hundred and fifty volts, apparently causing great pain to their victims, simply because they’d been asked to do it. Milgram himself replicated the experiment numerous times, in many different populations, with groups of differing backgrounds. It is still robust (in the hands of other researchers) nearly fifty years later. And even today, people are still extending that result; just last week I read about a study in which intrepid experimenters asked whether people might administer electric shocks to robots, under similar circumstances. (Answer: yes.)

More importantly, there is something positive that has come out of the crisis of replicability—something vitally important for all experimental sciences.

Read the rest of the article, including more about the importance of this shift toward encouraging replication, here.

Related Situationist posts:

Posted in Classic Experiments, Social Psychology | Leave a Comment »

Not Your Grandparents’ Prejudice

Posted by The Situationist Staff on April 26, 2013

Blind Spot Book CoverFrom NPR’s Code Switch (by Shankar Vedantam) a story about Situationist Contributor Mahzarin Banaji and Situationist friend Tony Greenwald.

Harvard psychologist Mahzarin Banaji was once approached by a reporter for an interview. When Banaji heard the name of the magazine the reporter was writing for, she declined the interview: She didn’t think much of the magazine and believed it portrayed research in psychology inaccurately.

But then the reporter said something that made her reconsider, Banaji recalled: “She said, ‘You know, I used to be a student at Yale when you were there, and even though I didn’t take a course with you, I do remember hearing about your work.’ “

The next words out of Banaji’s mouth: “OK, come on over; I’ll talk to you.”

After she changed her mind, Banaji got to thinking. Why had she changed her mind? She still didn’t think much of the magazine in which the article would appear. The answer: The reporter had found a way to make a personal connection.

For most people, this would have been so obvious and self-explanatory it would have required no further thought. Of course, we might think. Of course we’d help someone with whom we have a personal connection.

For Banaji, however, it was the start of a psychological exploration into the nature and consequences of favoritism — why we give some people the kind of extra-special treatment we don’t give others.

In a new book, Blind Spot, Banaji and her co-author, Anthony Greenwald, a social psychologist at the University of Washington, turn the conventional way people think about prejudice on its head. Traditionally, Banaji says, psychologists in her field have looked for overt “acts of commission — what do I do? Do I go across town to burn down the church of somebody who’s not from my denomination? That, I can recognize as prejudice.”

Yet, far from springing from animosity and hatred, Banaji and Greenwald argue, prejudice may often stem from unintentional biases.

Take Banaji’s own behavior toward the reporter with a Yale connection. She would not have changed her mind for another reporter without the personal connection. In that sense, her decision was a form of prejudice, even though it didn’t feel that way.

Now, most people might argue such favoritism is harmless, but Banaji and Greenwald think it might actually explain a lot about the modern United States, where vanishingly few people say they hold explicit prejudice toward others but wide disparities remain along racial, class and gender lines.

Anthony Greenwald is a social psychologist and a professor at the University of Washington.

Jean Alexander Greenwald/Delacorte Press

The two psychologists have revolutionized the scientific study of prejudice in recent decades, and their Implicit Association Test — which measures the speed of people’s hidden associations — has been applied to the practice of law and other fields. Few would doubt its impact. (I’ve written about Banaji and Greenwald’s work before, including in my 2010 book.)

“I think that kind of act of helping towards people with whom we have some shared group identity is really the modern way in which discrimination likely happens,” Banaji says.

In many ways, the psychologists’ work mirrors the conclusion of another recent book, in which sociologist Nancy DiTomaso asks how it is that few people report feeling racial prejudice, while the United States still has enormous disparities. Discrimination today is less about treating people from other groups badly, DiTomaso writes, and more about giving preferential treatment to people who are part of our “in-groups.”

The insidious thing about favoritism is that it doesn’t feel icky in any way, Banaji says. We feel like a great friend when we give a buddy a foot in the door to a job interview at our workplace. We feel like good parents when we arrange a class trip for our daughter’s class to our place of work. We feel like generous people when we give our neighbors extra tickets to a sports game or a show.

In each case, however, Banaji, Greenwald and DiTomaso might argue, we strengthen existing patterns of advantage and disadvantage because our friends, neighbors and children’s classmates are overwhelmingly likely to share our own racial, religious and socioeconomic backgrounds. When we help someone from one of these in-groups, we don’t stop to ask: Whom are we not helping?

Banaji tells a story in the book about a friend, now a professor at Northeastern University. . . .

Read or listen to the rest of the story here.


Go to Project Implicit here.  Take the Policy IAT here.

To review all of the previous Situationist posts discussing implicit associations click on the “Implicit Associations” category in the right margin, or, for a list of such posts, click here.

Learn more about the book, Blind Spot, here.

Posted in Book, Implicit Associations, Life, Marketing, Situationist Contributors, Social Psychology | Leave a Comment »

The Interior Situation of the Climate Change Skeptic

Posted by The Situationist Staff on April 23, 2013

From the APS Observer, an article by Situationist Contributor John T. Jost and Erin P. Hennes:

A multitude of environmental scientists, among others, worry that future generations will look back at the present era as one in which the human race could have — and should have — taken decisive action to prevent (or at least mitigate) the most menacing costs associated with global climate change. According to public opinion surveys, however, only 38 percent of Americans believe that global warming will seriously affect them or their way of life (Newport, 2012), and 42 percent continue to believe that global warming claims are “generally exaggerated” (Saad, 2012). When it comes to beliefs about climate change, men are more skeptical than women, and political conservatives are more skeptical than liberals. In a Gallup survey conducted in 2010, 42 percent of men and only 30 percent of conservatives agreed that “effects of global warming are already occurring,” as compared to 56 percent of women and 74 percent of liberals (Jones, 2010; see also McCright & Dunlap, 2011).

In a recent book, the philosopher Stephen Gardiner (2011) argues that environmental inaction is the consequence of a “perfect moral storm.” Specifically, he points to the conjunction of three unfortunate causes: 1) a tendency for the richer nations of the world to foist the burden of environmental risks upon poorer nations; 2) the present generation’s temptation to defer the costs of the crisis to future generations; and 3) pervasive ignorance concerning science, ethics, international justice, and the interdependence of life. Gardiner writes that the last factor “not only complicates the task of behaving well, but also renders us more vulnerable to the first two storms” (p. 7). Gardiner provides an astute analysis of the problem of environmental inaction, but he overlooks the possibility that climate change denial may not merely result from ignorance. Rather, many members of the public may possess a relatively strong motivation to deny and minimize environmental realities. Specifically, our research team has found that the social psychological motivation to defend, bolster, and justify aspects of the status quo — what we refer to as system justification (see, e.g., Jost, Banaji, & Nosek, 2004) — contaminates the public’s understanding of anthropogenic climate change.

In research published in 2010, we discovered that individuals who score higher on Kay and Jost’s (2003) General System Justification scale (which measures responses to statements such as “Most policies serve the greater good,” and “In general, the American political system operates as it should”) exhibit greater denial of environmental problems and vulnerabilities. Furthermore, system justification statistically mediates the effects of gender and political ideology on support for the environment. That is, men and conservatives are more likely than women and liberals to believe that American society is fair and legitimate, and these differences in system justification explain, at least in part, why they are so skeptical about climate change and are reluctant to take pro-environmental action (Feygina, Jost, & Goldsmith, 2010; see also Feinberg & Willer, 2011).
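The statistical mediation described above can be illustrated numerically. The sketch below is not the authors’ analysis: the variable names, effect sizes, and data are entirely invented, and it uses the simple product-of-coefficients approach (indirect effect = a × b) with ordinary least squares on a single mediator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical synthetic data mirroring the claimed causal chain:
# conservatism -> system justification -> climate-change skepticism
conservatism = rng.normal(size=n)
system_just = 0.5 * conservatism + rng.normal(size=n)                      # path a
skepticism = 0.4 * system_just + 0.1 * conservatism + rng.normal(size=n)   # paths b and c'

def ols_slopes(y, predictors):
    """Least-squares slopes for y ~ predictors (intercept included, then dropped)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols_slopes(system_just, [conservatism])[0]                   # predictor -> mediator
b, c_prime = ols_slopes(skepticism, [system_just, conservatism]) # mediator effect, direct effect
c = ols_slopes(skepticism, [conservatism])[0]                    # total effect

indirect = a * b  # mediated portion; for OLS with one mediator, c = c' + a*b exactly
print(f"total={c:.2f} direct={c_prime:.2f} indirect={indirect:.2f}")
```

Here the “explained, at least in part” claim corresponds to the indirect effect a × b being a substantial share of the total effect c; real mediation analyses would add significance tests (e.g., bootstrapped confidence intervals) that this toy sketch omits.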

More recently, we have conducted a series of studies corroborating the hypothesis that system justification motivates skepticism about climate change. Specifically, we have found that the denial of environmental problems is facilitated by information-processing distortions associated with system justification that affect evaluation, recall, and even tactile perception (Hennes, Feygina, & Jost, 2011). In one study, we found that individuals who scored higher (vs. lower) on Jost and Thompson’s (2000) Economic System Justification scale (which measures responses to such statements as “If people work hard, they almost always get what they want,” and “It is unfair to have an economic system which produces extreme wealth and extreme poverty at the same time,” reverse-scored) found messages disparaging the case for global warming to be more persuasive, evaluated the evidence for global warming to be weaker, and expressed less willingness to take action to curb global warming.

In a second study, we extended these findings by demonstrating that motivated processing biases recall of information about climate change. Specifically, we exposed research participants to clips from a televised newscast and later asked them to recall details from the program and to evaluate scientific evidence concerning climate change. Once again, we found that high system-justifiers evaluated the quality of the evidence to be weaker, were less likely to believe that climate change is occurring, and viewed it as a less important policy issue, in comparison with low system-justifiers. High system-justifiers also recalled the information to which they had been exposed as less serious (i.e., remembering smaller increases in global temperatures, lower sea levels, and less reliable historical data concerning climate change) than did low system-justifiers. Poorer recall was associated with skepticism about climate change. Thus, individuals who misremembered the evidence provided in the video to be less severe were less likely to support efforts to address climate change.

In an experimental investigation, we demonstrated that temporarily activating system-justification motivation produced memory biases and exacerbated skepticism about global climate change. More specifically, we adapted a system-dependence manipulation developed by Kay, Gaucher, Peach et al. (2009; see also Shepherd & Kay, 2012) and found that when people were led to believe that the political system exerted a strong (vs. weak) impact on their life circumstances, they were more likely to misremember details from a newspaper article they read earlier in the session. Importantly, all of the memory errors were in a system-exonerating direction: The proportion of man-made carbon emissions was recalled as being less than actually reported, and the scientists who reported errors in the much-maligned 2007 report by the Intergovernmental Panel on Climate Change were misidentified as skeptics rather than believers in anthropogenic climate change (Hennes et al., 2011).

We have discovered that system-justification motivation can even affect perceptions of ambient temperature. Our research assistants approached pedestrians in New York’s Washington Square Park during the summer months and asked them a series of questions, including their estimates of the temperature outside. Individuals who scored high on system justification or who were assigned to a high system-dependence condition reported that the current temperature was significantly lower than did individuals who scored low on system justification or who were assigned to a low system-dependence condition. These findings suggest that people may be motivated to feel (or not feel) the evidence of global warming when system-justification needs are either chronically or temporarily heightened.

Berkeley physicist Richard Muller, a former skeptic of anthropogenic climate change, made headlines last summer when he declared that not only is climate change real, but that “humans are almost entirely the cause” (Muller, 2012). If catastrophic events like Hurricane Sandy become more common, they may shift hearts and minds, albeit slowly. Given economic and other crises facing the nation (many of which probably exacerbate system-justification motivation), it still remains to be seen whether Americans and their elected officials will follow suit in embracing the scientific consensus. Climate change was a non-issue during the 2012 election campaign, and President Obama (2013) was criticized resoundingly by Senator Marco Rubio and other conservatives for emphasizing the issue in his most recent State of the Union speech. Suffice it to say that neither politicians nor the voters who back them appreciate the suggestion that the opinions they hold are motivated, even in part, by social and psychological factors that are probably outside of their awareness. American society and many others have yet to find a way of allowing the facts — scientific and otherwise — to trump special interests, political posturing, and motivated reasoning when it comes to the development of public policy. But that doesn’t mean we should stop trying.

References and Further Reading:

Carroll, J. (2007). Public: Iraq war still top priority for President and Congress. Gallup Poll. Retrieved April 9, 2007, from http://www.galluppoll.com/content/?ci=27103&pg=1

Feinberg, M., & Willer, R. (2011). Apocalypse soon? Dire messages reduce belief in global warming by contradicting just world beliefs. Psychological Science, 22, 34–38.

Feygina, I., Jost, J. T., & Goldsmith, R. (2010). System justification, the denial of global warming, and the possibility of “system-sanctioned change.” Personality and Social Psychology Bulletin. 36, 326–338.

Hennes, E. P., Feygina, I., & Jost, J. T. (2011). Motivated evaluation, recall, and tactile perception in the service of the system: The case of anthropogenic climate change. Paper presented at the Princeton University Conference on Psychology and Policymaking, Princeton, NJ.

Jones, J. M. (2010). Conservatives’ doubts about global warming grow. Gallup Poll. Retrieved August 14, 2012, from http://www.gallup.com/poll/126563/conservatives-doubts-global-warming-grow.aspx

Jost, J. T., Banaji, M. R., & Nosek, B. A. (2004). A decade of system justification theory: Accumulated evidence of conscious and unconscious bolstering of the status quo. Political Psychology, 25, 881–919.

Jost, J. T., & Thompson, E. P. (2000). Group-based dominance and opposition to equality as independent predictors of self-esteem, ethnocentrism, and social policy attitudes among African Americans and European Americans. Journal of Experimental Social Psychology, 36, 209–232.

Kay, A. C., & Jost, J. T. (2003). Complementary justice: Effects of “poor but happy” and “poor but honest” stereotype exemplars on system justification and implicit activation of the justice motive. Journal of Personality and Social Psychology, 85, 823–837.

McCright, A. M., & Dunlap, R. E. (2011). Cool dudes: The denial of climate change among conservative white males in the United States. Global Environmental Change, 21, 1163–1172.

Muller, R. A. (2012, July 30). The conversion of a climate-change skeptic. New York Times, p. A19.

Newport, F. (2012). Americans’ worries about global warming up slightly. Gallup Poll. Retrieved January 28, 2013, from http://www.gallup.com/poll/153653/Americans-Worries-Global-Warming-Slightly.aspx

Obama, B. (2013). State of the union address. Retrieved March 6, 2013, from http://www.nytimes.com/2013/02/13/us/politics/obamas-2013-state-of-the-union-address.html?pagewanted=1&_r=2

Saad, L. (2012). In U.S., global warming views steady despite warm winter. Gallup Poll. Retrieved January 28, 2013, from http://www.gallup.com/poll/153608/Global-Warming-Views-Steady-Despite-Warm-Winter.aspx

Shepherd, S., & Kay, A. C. (2012). On the perpetuation of ignorance: System dependence, system justification, and the motivated avoidance of sociopolitical information. Journal of Personality and Social Psychology, 102, 264–280.


Posted in Environment, Ideology, Politics, Public Policy, Situationist Contributors, Social Psychology, System Legitimacy | 2 Comments »

Posts on the Situation of Evil

Posted by The Situationist Staff on April 19, 2013

Evil Disposition

For readers interested in previous posts about the situation of evil behavior, here are links to a sample:

Posted in Social Psychology | Leave a Comment »

The Boston Bombings and the Cognitive Limits of Empathy

Posted by The Situationist Staff on April 17, 2013

Boston Marathon 2013

From Situationist friend and Harvard Law School 3L, Kate Epstein, an essay about Monday’s tragedy:

As I hear reactions to the bombings at the marathon on Monday, I find myself agreeing with Glenn Greenwald’s column in The Guardian, titled “The Boston bombing produces familiar and revealing reactions: As usual, the limits of selective empathy, the rush to blame Muslims, and the exploitation of fear all instantly emerge.” Particularly interesting to me are our cognitive limits, as humans, when it comes to empathy. Greenwald writes:

The widespread compassion for yesterday’s victims and the intense anger over the attacks was obviously authentic and thus good to witness. But it was really hard not to find oneself wishing that just a fraction of that compassion and anger be devoted to attacks that the US perpetrates rather than suffers. These are exactly the kinds of horrific, civilian-slaughtering attacks that the US has been bringing to countries in the Muslim world over and over and over again for the last decade, with very little attention paid.

I felt the same way in the aftermath of Monday’s events, but I can also empathize with those who do care more–or at least feel it in a more real way–when the victims of a random act of violence are white, close to home, and so obviously innocent. They, unlike the countless non-white, non-American casualties of the War on Terror, are, for me and many around me, part of our in-group, and our minds actually function in a way that makes us much more easily empathize with them.

Studies have shown that parts of our brain associated with empathy and emotion are more likely to be activated when we observe someone of our own race, as opposed to an out-group member, in pain. This makes sense given research on unconscious bias using implicit association tests, which have been shown to predict real-life behavior outside of the lab.
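For readers curious how the implicit association tests mentioned above turn reaction times into a number, here is a rough sketch of the D-score logic: the latency gap between “incongruent” and “congruent” pairing blocks, scaled by the pooled variability of responses. This is a simplified version of the published scoring algorithm (which adds error penalties, trial filtering, and block-wise pooling), and the latencies below are invented for illustration.

```python
import statistics

def iat_d_score(congruent_ms, incongruent_ms):
    """Simplified IAT D score: mean latency difference between the
    incongruent and congruent blocks, divided by the pooled sample
    standard deviation of all trials."""
    pooled_sd = statistics.stdev(congruent_ms + incongruent_ms)
    return (statistics.mean(incongruent_ms) - statistics.mean(congruent_ms)) / pooled_sd

# Hypothetical response latencies in milliseconds: responses are
# slower when the pairing conflicts with the hidden association.
congruent = [620, 650, 600, 640, 610, 630]
incongruent = [720, 760, 700, 740, 710, 750]

d = iat_d_score(congruent, incongruent)
print(round(d, 2))  # a larger positive D indicates a stronger implicit association
```

Dividing by the pooled standard deviation, rather than using the raw millisecond gap, is what lets scores be compared across people who respond at very different overall speeds.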

The good news is that our automatic attitudes are sometimes malleable. Awareness of the differences between our egalitarian values and our implicit attitudes can induce emotional reactions that can motivate behavioral changes and help us be the empathetic and altruistic people we hope to be. On the other hand, lack of awareness combined with an inundation of negative images and stereotypes from commercial media and popular culture can reinforce implicit biases, underscoring the need for education and self-awareness.

In a world with so much violence and pain, it makes sense that we simply could not feel deeply empathetic every time a human being is injured or killed. We rightly feel intense moral outrage that someone would senselessly harm innocent people gathered in Boston yesterday, and yet we do not so easily empathize with victims of drone strikes in Pakistan, most of whom see those bombings as just as random and senseless, perpetrated against victims just as innocent.

We should forgive ourselves for exhibiting these cognitive limits–after all, we are only human. But we should recognize, in these moments when we do so easily feel sorrow, anger, and compassion, those events which do not normally elicit those emotions, and force ourselves to grapple with the consequences of that fact. When we read dry, mundane news reports about human suffering, when we (rarely) hear body counts of the War on Terror (such as the estimated 122,000 violent, civilian deaths in Iraq thus far), when we are made aware of the latest unnamed drone victims in North Waziristan, let’s try to channel the empathy events like this make us feel, and then let’s turn that empathy into action.

Related Situationist posts:

The Situationist has a series of posts devoted to highlighting some of situational sources of war. Part I and Part II of the series included portions of an article co-authored by Daniel Kahneman and Jonathan Renshon, titled “Why Hawks Win.” Part III reproduced an op-ed written by Situationist friend Dan Gilbert on July 24, 2006. Part IV and Part V in this series contained the two halves of an essay written by Situationist Contributor, Jon Hanson within the week following 9/11. Part VI contains an op-ed written by Situationist Contributor John Jost on October 1, 2001, “Legitimate Responses to Illegitimate Acts,” which gives special emphasis to the role of system justification. Part VII includes a video entitled “Resisting the Drums of War.” The film was created and narrated by psychologist Roy J. Eidelson, Executive Director of the Solomon Asch Center at the University of Pennsylvania.

To review a larger sample of posts on the causes and consequences of human conflict, click here.

Posted in Altruism, Conflict, Emotions, Implicit Associations, Social Psychology | 2 Comments »

 