The Situationist

The Science of Morality

Posted by The Situationist Staff on November 27, 2007

In this week’s Time Magazine, Jeffrey Kluger offers an excellent cover story on what neuroscientists and other mind scientists have discovered about how our morality connects to our brains. Below we excerpt a few segments of the story.

* * *

We’re a species that is capable of almost dumbfounding kindness. We nurse one another, romance one another, weep for one another. Ever since science taught us how, we willingly tear the very organs from our bodies and give them to one another. And at the same time, we slaughter one another. The past 15 years of human history are the temporal equivalent of those subatomic particles that are created in accelerators and vanish in a trillionth of a second, but in that fleeting instant, we’ve visited untold horrors on ourselves–in Mogadishu, Rwanda, Chechnya, Darfur, Beslan, Baghdad, Pakistan, London, Madrid, Lebanon, Israel, New York City, Abu Ghraib, Oklahoma City, an Amish schoolhouse in Pennsylvania–all of the crimes committed by the highest, wisest, most principled species the planet has produced. That we’re also the lowest, cruelest, most blood-drenched species is our shame–and our paradox.

The deeper that science drills into the substrata of behavior, the harder it becomes to preserve the vanity that we are unique among Earth’s creatures. We’re the only species with language, we told ourselves–until gorillas and chimps mastered sign language. We’re the only one that uses tools, then–but that’s if you don’t count otters smashing mollusks with rocks or apes stripping leaves from twigs and using them to fish for termites.

* * *

“Moral judgment is pretty consistent from person to person,” says Marc Hauser, professor of psychology at Harvard University and author of Moral Minds. “Moral behavior, however, is scattered all over the chart.” The rules we know, even the ones we intuitively feel, are by no means the rules we always follow.

Where do those intuitions come from? And why are we so inconsistent about following where they lead us? Scientists can’t yet answer those questions, but that hasn’t stopped them from looking. Brain scans are providing clues. Animal studies are providing more. Investigations of tribal behavior are providing still more. None of this research may make us behave better, not right away at least. But all of it can help us understand ourselves–a small step up from savagery perhaps, but an important one.

* * *

One of the first and most poignant observations of empathy in nonhumans was made by Russian primatologist Nadia Kohts, who studied nonhuman cognition in the first half of the 20th century and raised a young chimpanzee in her home. When the chimp would make his way to the roof of the house, ordinary strategies for bringing him down–calling, scolding, offers of food–would rarely work. But if Kohts sat down and pretended to cry, the chimp would go to her immediately. “He runs around me as if looking for the offender,” she wrote. “He tenderly takes my chin in his palm . . . as if trying to understand what is happening.”

You hardly have to go back to the early part of the past century to find such accounts. Even cynics went soft at the story of Binti Jua, the gorilla who in 1996 rescued a 3-year-old boy who had tumbled into her zoo enclosure, rocking him gently in her arms and carrying him to a door where trainers could enter and collect him. “The capacity of empathy is multilayered,” says primatologist Frans de Waal of Emory University, author of Our Inner Ape. “We share a core with lots of animals.”

* * *

While it’s impossible to directly measure empathy in animals, in humans it’s another matter. Hauser cites a study in which spouses or unmarried couples underwent functional magnetic resonance imaging (fMRI) as they were subjected to mild pain. They were warned before each time the painful stimulus was administered, and their brains lit up in a characteristic way signaling mild dread. They were then told that they were not going to feel the discomfort but that their partner was. Even when they couldn’t see their partner, the brains of the subjects lit up precisely as if they were about to experience the pain themselves. “This is very much an ‘I feel your pain’ experience,” says Hauser.

* * *

Pose these dilemmas to people while they’re in an fMRI, and the brain scans get messy. Using a switch to divert the train toward one person instead of five increases activity in the dorsolateral prefrontal cortex–the place where cool, utilitarian choices are made. Complicate things with the idea of pushing the innocent victim, and the medial frontal cortex–an area associated with emotion–lights up. As these two regions do battle, we may make irrational decisions. In a recent survey, 85% of subjects who were asked about the trolley scenarios said they would not push the innocent man onto the tracks–even though they knew they had just sent five people to their hypothetical death. “What’s going on in our heads?” asks Joshua Greene, an assistant professor of psychology at Harvard University. “Why do we say it’s O.K. to trade one life for five in one case and not others?”

* * *

For the rest of the article, click here. For a related Situationist post, see “Your Brain and Morality.”

3 Responses to “The Science of Morality”

  1. Dr.Steve said

    I’ve been having an interesting discussion on whether we can think ourselves into moral behavior, and so I was interested in the piece.
    This quote is intriguing: “None of this research may make us behave better, not right away at least. But all of it can help us understand ourselves–a small step up from savagery perhaps, but an important one.”
    1. Why won’t the research make us better right now?
    2. And if it won’t, why should it do so in the future?
    3. Why is understanding ourselves a step up from savagery?
    Heck, is the suggestion that what the world needs for less savagery is neuroscience? Or that pre-neuroscience societies are savage?
    Don’t be so sure; psychopaths, another interest of mine, won’t be slow to incorporate this knowledge into their amoral quests.

  2. Someone who switched the trolley such that it struck the five rather than the one would be considered to have done a horrifying evil. But say that he or she recognized the one as the President of the United States, and the judgments passed upon the act would vary widely. A national funeral for the five victims would likely put all right again with most of the population.

    Unless the President is at risk, though, can we truly call this a “moral” dilemma? If the President isn’t at risk, the actor (the person, in this case, who chooses to actively direct the trolley), should he or she fully grasp the situation, reveals him- or herself to be profoundly psychotic or in possession of a valid decision-making process so radically different that we don’t understand it and probably can’t accept it.

    Switching the trolley from the five to the one, on the other hand, is not necessarily considered “moral”. To actively participate in any death (especially the death of someone recognizably a member of some group with which the actor identifies) is so deeply forbidden that no socialized person can be blamed for forgoing it, even at the cost of passively participating in the death of five members of the same group.

    This works wonderfully well. Why? Because one often finds oneself in a position to kill a fellow group member, but trolley dilemmas virtually never present themselves in real life. If trolley dilemmas were common, empathy would be trained, during the maturation process, to generalize the situation and to respond in the fashion that the society deemed most constructive — even if that meant switching the trolley toward the five.

    If empathy enters into the decision, does it establish a “moral” basis? While moral constructs may begin with empathy, don’t they have to go well beyond empathy — well beyond a single basic constituent — before they are properly called “moral”? A moral position might be: “Do not take an active part in such situations, because each agent — God/nature, man, man’s constructs — has, or is the result of, a free will and must be allowed to suffer the consequences as well as reap the benefits of their actions.”

    There have been quite a number of popular articles virtually identical to this one over the past several decades (perhaps longer). But do they really ponder “morality”? Doesn’t morality come into the picture — isn’t it so difficult and frustrating — because it is precisely about going beyond empathy, consciously overriding empathy because empathy uncorrected by morality is too often destructive? Consciously overriding the deepest social bonds of groups because they are too often destructive to the most recent version of an advanced society that corresponds historically to the given morality?

  3. […] see “Law & the Brain,” “Pinker on the Situation of Morality,” “The Science of Morality,” and “Your Brain and […]