The Situationist

Posts Tagged ‘Neuroscience’

Legal theory must incorporate discoveries from biology and behavioral sciences

Posted by Fábio Portela on October 15, 2013

Some recent discoveries in evolutionary biology, ethology, neurology, cognitive psychology and behavioral economics impel us to rethink the very foundations of law if we want to answer many questions that remain unanswered in legal theory. Where does our ability to interpret rules and to think in terms of fairness toward others come from? Does the ability to reason about norms derive from certain aspects of our innate rationality and from mechanisms that evolutionary processes sculpted into our moral psychology?

Legal theory must take the complexity of the human mind into account

Any answer to these foundational questions requires us to take into consideration what these other sciences are discovering about how we behave. For instance, ethology has shown that many moral behaviors we usually think are uniquely human have been identified in other species as well.

Please watch this video, a TED talk by primatologist Frans de Waal:

The skills needed to feel empathy, to engage in mutual cooperation, to react to certain injustices, to form coalitions, to share, and to punish those who refuse to comply with expected behaviors – abilities once considered exclusive to humans – have been observed in other animals, especially species close to our evolutionary lineage, such as the great apes. In humans, these instinctive elements are also present. Even small children around one year of age show a remarkable capacity for moral cognition: they can identify patterns of distributive justice, even though they cannot explain how they reached a given conclusion (they cannot even speak at that age!).

In addition, several studies have shown that certain neural connections in our brains are actively involved in processing information related to capabilities typical of normative behavior. Think about the ability to empathize, for example. It is an essential skill that prevents us from seeing other people as things or mere instruments. Empathy is needed to respect the Kantian categorical imperative to treat others as ends in themselves, never merely as means to other ends. This is something many psychopaths cannot do, because they show a severe reduction in their capacity to empathize with others. fMRI studies have repeatedly shown that many diagnosed psychopaths have deficits in brain areas associated with empathy.

If this sounds like science fiction, please consider the following cases.

A 40-year-old man, who had hitherto displayed entirely normal sexual behavior, was kicked out by his wife after she discovered that he was visiting child pornography sites and had even tried to sexually molest children. He was arrested, and the judge ruled that he would have to complete a rehabilitation program for sexual addiction or face jail. He was soon expelled from the program, however, after propositioning women there. Just before being arrested again for failing the program, he felt a severe headache and went to a hospital, where he underwent an MRI exam. The doctors identified a tumor in his orbitofrontal cortex, a brain region associated with moral judgment, impulse control and the regulation of social behavior. After the removal of the tumor, his behavior returned to normal. Seven months later, the deviant behavior reappeared – and further tests showed that the tumor had regrown. After the removal of the new tumor, his sexual behavior once again returned to normal.

You could also consider the case of Charles Whitman. Until he was 24, he had been a reasonably normal person. However, on August 1, 1966, he climbed to the top of the Tower at the University of Texas where, armed to the teeth, he killed 13 people and wounded 32 before being killed by the police. It was later discovered that, just before the mass killings, he had also murdered his wife and his mother. The previous day, he had left a typewritten letter in which one could read the following:

“I do not quite understand what it is that compels me to type this letter. Perhaps it is to leave some vague reason for the actions I have recently performed. I do not really understand myself these days. I am supposed to be an average reasonable and intelligent young man. However, lately (I cannot recall when it started) I have been a victim of many unusual and irrational thoughts.”

In the letter, he also requested that an autopsy be performed after his death to verify whether there was something wrong with his brain. Whitman’s brain was examined and … surprise! … the doctors found a glioblastoma compressing the region of his amygdala, which is associated with the regulation of aggression and fear.

What does this mean for legal theory? At the very least, it means that law has so far been based on a false metaphysical conception: that the brain is a Lockean blank slate and that our actions derive purely from our rational dispositions. Criminal law theory assumes that an offender breaks the law exclusively out of free will and reasoning. Private law assumes that people sign contracts only after considering all their possible legal effects and that they are fully conscious of the reasons that motivated them to do so. Constitutional theory assumes that everyone is endowed with a rational disposition that enables the free exercise of civil and constitutional rights such as freedom of expression or freedom of religion. It is not in question that we are able to exercise such rights. But these examples show that the capacity to interpret norms and to act according to the law does not derive from a blank slate endowed with free will and rationality, but from a complex mind that evolved in our hominin lineage and that relies on brain structures that enable us to reason and choose among alternatives.

This means that our rationality is not perfect. It is affected not only by tumors but also by various cognitive biases that distort our decisions. Psychologists have studied these biases since the 1970s. Daniel Kahneman, for example, won the 2002 Nobel Prize in Economic Sciences for his research on the impact of these biases on decision-making. We can make truly irrational decisions because our mind relies on certain heuristics (fast-and-frugal rules) to evaluate situations. In most situations, these heuristics help us make the right decisions, but they can also lead us into serious mistakes.

There are dozens of heuristics that structure our rationality. We are terrible at assessing the significance of statistical correlations, we discard unfavorable evidence, we tend to follow the most common behavior in our group (the herd effect), and we tend to see past events as if they had been easily predictable. We are inclined to cooperate with those who belong to our group (the parochialism bias), but far less so with those who belong to another group. And those are just some of the biases that have already been identified.

It is really hard to overcome these biases, because they are much of what we call rationality: these flaws are an unavoidable part of it. True, with some effort we can avoid many mistakes by using techniques designed to yield unbiased, correct answers. But such artificial techniques can be expensive and demand a great deal of effort. We can use a computer and train our mathematical skills to overcome biases that cause errors in statistical evaluation, for instance. But how could we use a computer to reason about moral or legal issues while “getting around” these psychological biases? Probably, we can’t.

The best we can do is to reconsider the psychological assumptions of legal theory, taking into account what we actually know about our psychology and how it affects our judgment. And there is evidence that these biases really do influence how judges evaluate cases. For instance, a study by Birte Englich, Thomas Mussweiler and Fritz Strack concluded that even legal experts are affected by cognitive biases. More specifically, they studied the effect of the anchoring bias on judicial activity by submitting 52 legal experts to the following experiment: the participants were asked to examine a hypothetical court case and determine the sentence in a fictitious shoplifting trial. After reading the materials, they had to answer a questionnaire, at the end of which they would set the sentence.

Before answering the questions, however, the participants had to throw a pair of dice to determine the prosecutor’s sentencing demand. Half of the dice were loaded so as always to show the numbers 1 and 2; the other half were loaded to show 3 and 6. The sum of the two numbers indicated the prosecutor’s demand. The participants then answered questions about legal issues concerning the case, including the sentencing decision. The researchers found that the result of the dice had a real impact on the proposed sentence: the average penalty imposed by participants whose dice showed the higher result (3 + 6 = 9) was 7.81 months in prison, while participants whose dice showed the lower result (1 + 2 = 3) proposed an average punishment of 5.28 months.
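
As a rough sense of how strong that pull was, the two reported averages can be plugged into a simple linear anchoring model (a back-of-the-envelope sketch, not the researchers’ own analysis) in which each judgment is a weighted average of an unanchored baseline sentence and the anchor:

    # Toy linear anchoring model: judgment = (1 - w) * baseline + w * anchor.
    # The anchors and group means are the figures reported in the study; the
    # weight w and the baseline are quantities implied by this toy model only.
    low_anchor, high_anchor = 3, 9      # prosecutor demands set by the loaded dice (months)
    mean_low, mean_high = 5.28, 7.81    # average sentences proposed by each group (months)

    w = (mean_high - mean_low) / (high_anchor - low_anchor)   # pull toward the anchor, ~0.42
    baseline = (mean_low - w * low_anchor) / (1 - w)          # implied unanchored sentence, ~6.9 months

    print(f"implied anchoring weight: {w:.2f}")
    print(f"implied unanchored sentence: {baseline:.1f} months")

On this toy reading, roughly forty percent of the gap between the two anchors carries through into the proposed sentences, purely because of a throw of the dice.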

In another study, it was found that, on average, tired and hungry judges take the easier path and deny parole rather than grant it. In the study, conducted in Israel, researchers divided the judges’ daily schedule into three sessions, at the beginning of each of which the judges could rest and eat. It turned out that, soon after eating and resting, judges granted parole in about 65% of cases; by the end of each session, the rate fell to almost zero. Admittedly, this is not strictly a cognitive bias but a physiological condition – yet it shows that a tired mind and the body’s energy needs can induce decisions that almost everyone would consider intrinsically unfair.

And so on. Study after study shows that (1) our ability to develop moral reasoning is innate, (2) our mind is filled with innate biases that are needed to process cultural information related to compliance with moral and legal norms, and (3) these biases affect our rationality.

These studies raise many questions that legal scholars will have to face sooner or later. Would anyone say that due process of law is respected when judges anchor their decisions on completely external factors – factors of which they are not even aware? Of course, the dice study was a controlled experiment, and nobody expects a judge to roll dice before deciding a case. But might judges be influenced by other anchors as well, such as the numbers on a clock, a date on the calendar, or a number printed on a banknote? And would anyone consider due process respected if parole were denied simply because the case was heard late in the morning? These external elements decisively influenced the judicial outcomes, yet none of them were mentioned in the decisions.

Legal theory needs to incorporate this knowledge into its structure. We need to build institutions capable of taking biases into account and, as far as possible, circumventing them or at least diminishing their influence. For instance, knowing that judges tend to become impatient and harsher toward defendants when they are hungry and tired, a court could require them to take a 30-minute break after every three hours of work in order to restore their capacity to be as impartial as possible. This is just a small suggestion of how institutions could respond to these discoveries.

Of course, there are more complex cases, such as the discussion about offenders who had always displayed good behavior but who were unfortunate enough to develop a brain tumor that contributed to the commission of a crime. Criminal theory is based on the thesis that the agent must intentionally engage in criminal conduct. But is it possible to speak of intention when a tumor was a direct cause of the result? And what if it had not been a tumor but a brain malformation (as occurs in many cases of psychopathy)? Saying that criminal law can already resolve such cases by holding that the offender was not responsible because of his condition does not solve the problem, because the issue lies in the very concept of intention assumed by legal theory.

And this problem extends to the rest of legal theory. We must take into account the role of cognitive biases in consumer relations. The law has not yet recognized the role of these biases in decision making, but many companies are well aware of them. How many times have you bought a 750 ml soda for $2.00 just because it cost only $0.20 more than the 500 ml one? You probably reasoned that you were paying less per ml than you would for the smaller size. But all you really wanted was 500 ml, and you ended up paying more, for extra soda you didn’t even want! In other words, the company simply exploits a particular bias that affects most people in order to induce them to buy more of its product. Another example: for evolutionary reasons, humans are prone to consume fatty and sugary foods. Companies exploit this fact to their advantage, which contributes to the obesity crisis we see in the world today. In their defense, companies say that consumers purchased the product of their own accord. What they do not say, but neuroscience and evolutionary theory do, is that our “free will” has a long evolutionary history that propels us to consume exactly the kinds of food that, over the years, damage our health. Law needs to take these facts into consideration if it wants to adequately protect and enforce consumer rights.
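
To make the arithmetic of the soda example explicit, here is a small illustrative calculation (the $1.80 price of the 500 ml bottle is an assumption inferred from the “$0.20 more” figure above):

    # Unit-price comparison for the two bottle sizes in the example above.
    price_750, size_750 = 2.00, 750    # dollars, millilitres
    price_500, size_500 = 1.80, 500    # assumed: $0.20 cheaper than the large bottle

    print(f"750 ml bottle: ${price_750 / size_750:.4f} per ml")   # about $0.0027 per ml
    print(f"500 ml bottle: ${price_500 / size_500:.4f} per ml")   # $0.0036 per ml

    # The larger bottle really is cheaper per ml, but if 500 ml was all you wanted,
    # you still handed over an extra 20 cents for soda you never asked for.
    print(f"extra spent: ${price_750 - price_500:.2f}")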

Law is still based on an “agency model” very similar to game theory’s assumption of rationality. But we are not that rational: every decision we make is shaped by the way our mind operates. Can we really think it is fair to blame someone who committed a crime on the basis of erroneous judgments generated by a cognitive bias? And, on the other hand, would it be right to exonerate a defendant on those grounds? To answer these and other frontier questions, legal scholars must rethink the concept of the person assumed by law, taking into account our intrinsic biological nature.

Related Situationist posts:

Image from Flickr

Posted in Legal Theory, Morality, Neuroscience, Philosophy | Tagged: , , , , , | 3 Comments »

Dan Rather Reports on the Brain’s Plasticity

Posted by The Situationist Staff on April 20, 2012

Related Situationist posts:

Posted in Neuroscience, Social Psychology, Video | Tagged: , , | Comments Off on Dan Rather Reports on the Brain’s Plasticity

Joshua Buckholtz Comes To Harvard Law – Postponed

Posted by The Situationist Staff on March 29, 2012

Neuroscience, Psychopathology, and Crime
Postponed until fall.
Wasserstein 1023
Friday, March 30, 2012, 12 – 1pm

Why can’t some people stop themselves from doing things that are bad for them? Why can’t some people stop themselves from doing things that hurt others? These questions have puzzled philosophers, economists, and psychologists for centuries. Professor Joshua Buckholtz will discuss these issues in the context of his work at Harvard’s Systems Neuroscience of Psychopathology Lab, where he seeks to understand how genes and environments affect brain chemistry and function to influence variability in human self-control.

Free Chinese food!

Sponsor: Student Association for Law & Mind Sciences

Posted in Choice Myth, Events, Neuroscience, SALMS | Tagged: , , , | Leave a Comment »

The Situation of Optimism

Posted by The Situationist Staff on March 12, 2012

From

Neuroscientist Tali Sharot visits the RSA to explain the biological basis of optimism, and its effect on our lives and societies.

Related Situationist posts:

Posted in Illusions, Life, Neuroscience, Positive Psychology, Social Psychology, Video | Tagged: , , , | 2 Comments »

The Situation of Competition

Posted by The Situationist Staff on February 12, 2012

From the University of Illinois News Bureau:

Researchers have found a way to study how our brains assess the behavior – and likely future actions – of others during competitive social interactions. Their study, described in a paper in the Proceedings of the National Academy of Sciences, is the first to use a computational approach to tease out differing patterns of brain activity during these interactions, the researchers report.

“When players compete against each other in a game, they try to make a mental model of the other person’s intentions, what they’re going to do and how they’re going to play, so they can play strategically against them,” said University of Illinois postdoctoral researcher Kyle Mathewson, who conducted the study as a doctoral student in the Beckman Institute with graduate student Lusha Zhu and economics professor and Beckman affiliate Ming Hsu, who now is at the University of California, Berkeley. “We were interested in how this process happens in the brain.”

Previous studies have tended to consider only how one learns from the consequences of one’s own actions, called reinforcement learning, Mathewson said. These studies have found heightened activity in the basal ganglia, a set of brain structures known to be involved in the control of muscle movements, goals and learning. Many of these structures signal via the neurotransmitter dopamine.

“That’s been pretty well studied and it’s been figured out that dopamine seems to carry the signal for learning about the outcome of our own actions,” Mathewson said. “But how we learn from the actions of other people wasn’t very well characterized.”

Researchers call this type of learning “belief learning.”

To better understand how the brain processes information in a competitive setting, the researchers used functional magnetic resonance imaging (fMRI) to track activity in the brains of participants while they played a competitive game, called a Patent Race, against other players. The goal of the game was to invest more than one’s opponent in each round to win a prize (a patent worth considerably more than the amount wagered), while minimizing one’s own losses (the amount wagered in each trial was lost). The fMRI tracked activity at the moment the player learned the outcome of the trial and how much his or her opponent had wagered.

A computational model evaluated the players’ strategies and the outcomes of the trials to map the brain regions involved in each type of learning.
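
For readers unfamiliar with the two kinds of learning mentioned above, the following toy sketch contrasts them using the Patent Race payoff just described. It is not the study’s actual computational model; the prize value, wagers and learning rate are illustrative assumptions.

    # Patent Race payoff: the wager is always lost, and the prize goes to the
    # player who invested more (the prize value here is an arbitrary illustration).
    def patent_race_payoff(my_wager, opponent_wager, prize=10.0):
        return (prize if my_wager > opponent_wager else 0.0) - my_wager

    # Reinforcement learning: update the value of my own action from my own payoff.
    def rl_update(values, my_action, payoff, lr=0.1):
        values[my_action] += lr * (payoff - values[my_action])
        return values

    # Belief learning: update my estimate of what the opponent tends to wager,
    # based on the opponent's observed action rather than on my own payoff.
    def belief_update(opponent_counts, opponent_wager):
        opponent_counts[opponent_wager] = opponent_counts.get(opponent_wager, 0) + 1
        return opponent_counts

    # One example round: I wager 4, the opponent wagers 3.
    payoff = patent_race_payoff(4, 3)                        # 10 - 4 = 6
    values = rl_update({4: 0.0}, my_action=4, payoff=payoff)
    beliefs = belief_update({}, opponent_wager=3)
    print(payoff, values, beliefs)                           # 6.0 {4: 0.6} {3: 1}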

“Both types of learning were tracked by activity in the ventral striatum, which is part of the basal ganglia,” Mathewson said. “That’s traditionally known to be involved in reinforcement learning, so we were a little bit surprised to see that belief learning also was represented in that area.”

Belief learning also spurred activity in the rostral anterior cingulate, a structure deep in the front of the brain. This region is known to be involved in error processing, regret and “learning with a more social and emotional flavor,” Mathewson said.

The findings offer new insight into the workings of the brain as it is engaged in strategic thinking, Hsu said, and may aid the understanding of neuropsychiatric illnesses that undermine those processes.

“There are a number of mental disorders that affect the brain circuits implicated in our study,” Hsu said. “These include schizophrenia, depression and Parkinson’s disease. They all affect these dopaminergic regions in the frontal and striatal brain areas. So to the degree that we can better understand these ubiquitous social functions in strategic settings, it may help us understand how to characterize and, eventually, treat the social deficits that are symptoms of these diseases.”

More.

The paper, “Dissociable Neural Representations of Reinforcement and Belief Prediction Errors Underlie Strategic Learning,” is available online or from the U. of I. News Bureau.

Related Situationist posts:

Posted in Abstracts, Altruism, Conflict, Neuroscience, Uncategorized | Tagged: , | Leave a Comment »

The Situation of Choice

Posted by The Situationist Staff on January 23, 2012

From the APS Monitor (excerpts from a terrific primer on “The Mechanics of Choice”):

* * *

The prediction of social behavior significantly involves the way people make decisions about resources and wealth, so the science of decision making historically was the province of economists. And the basic assumption of economists was always that, when it comes to money, people are essentially rational. It was largely inconceivable that people would make decisions that go against their own interests. Although successive refinements of expected-utility theory made room for individual differences in how probabilities were estimated, the on-the-surface irrational economic behavior of groups and individuals could always be forced to fit some rigid, rational calculation.

The problem is — and everything from fluctuations in the stock market to decisions between saving for retirement or purchasing a lottery ticket or a shirt on the sale rack shows it — people just aren’t rational. They systematically make choices that go against what an economist would predict or advocate.

Enter a pair of psychological scientists — Daniel Kahneman (currently a professor emeritus at Princeton) and Amos Tversky — who in the 1970s turned the economists’ rational theories on their heads. Kahneman and Tversky’s research on heuristics and biases and their Nobel Prize winning contribution, prospect theory, poured real, irrational, only-human behavior into the calculations, enabling much more powerful prediction of how individuals really choose between risky options.

* * *

Univ. of Toronto psychologist Keith E. Stanovich and James Madison Univ. psychologist Richard F. West refer to these experiential and analytical modes as “System 1” and “System 2,” respectively. Both systems may be involved in making any particular choice — the second system may monitor the quality of the snap, System-1 judgment and adjust a decision accordingly.7 But System 1 will win out when the decider is under time pressure or when his or her System-2 processes are already taxed.

This is not to entirely disparage System-1 thinking, however. Rules of thumb are handy, after all, and for experts in high-stakes domains, it may be the quicker form of risk processing that leads to better real-world choices. In a study by Cornell University psychologist Valerie Reyna and Mayo Clinic physician Farrell J. Lloyd, expert cardiologists took less relevant information into account than younger doctors and medical students did when making decisions to admit or not admit patients with chest pain to the hospital. Experts also tended to process that information in an all-or-none fashion (a patient was either at risk of a heart attack or not) rather than expending time and effort dealing with shades of gray. In other words, the more expertise a doctor has, the more that his or her intuitive sense of the gist of a situation was used as a guide.8

In Reyna’s variant of the dual-system account, fuzzy-trace theory, the quick-decision system focuses on the gist or overall meaning of a problem instead of rationally deliberating on facts and odds of alternative outcomes.9 Because it relies on the late-developing ventromedial and dorsolateral parts of the frontal lobe, this intuitive (but informed) system is the more mature of the two systems used to make decisions involving risks.

A 2004 study by Vassar biopsychologist Abigail A. Baird and Univ. of Waterloo cognitive psychologist Jonathan A. Fugelsang showed that this gist-based system matures later than do other systems. People of different ages were asked to respond quickly to easy, risk-related questions such as “Is it a good idea to set your hair on fire?”, “Is it a good idea to drink Drano?”, and “Is it a good idea to swim with sharks?” They found that young people took about a sixth of a second longer than adults to arrive at the obvious answers (it’s “no” in all three cases, in case you were having trouble deciding).10 The fact that our gist-processing centers don’t fully mature until the 20s in most people may help explain the poor, risky choices younger, less experienced decision makers commonly make.

Adolescents decide to drive fast, have unprotected sex, use drugs, drink, or smoke not simply on impulse but also because their young brains get bogged down in calculating odds. Youth are bombarded by warning statistics intended to set them straight, yet risks of undesirable outcomes from risky activities remain objectively small — smaller than teens may have initially estimated, even — and this may actually encourage young people to take those risks rather than avoid them. Adults, in contrast, make their choices more like expert doctors: going with their guts and making an immediate black/white judgment. They just say no to risky activities because, however objectively unlikely the risks are, there’s too much at stake to warrant even considering them.11

Making Better Choices

The gist of the matter is, though, that none of us, no matter how grown up our frontal lobes, make optimal decisions; if we did, the world would be a better place. So the future of decision science is to take what we’ve learned about heuristics, biases, and System-1 versus System-2 thinking and apply it to the problem of actually improving people’s real-world choices.

One obvious approach is to get people to increase their use of System 2 to temper their emotional, snap judgments. Giving people more time to make decisions and reducing taxing demands on deliberative processing are obvious ways of bringing System 2 more into the act. Katherine L. Milkman (U. Penn.), Dolly Chugh (NYU), and Max H. Bazerman (Harvard) identify several other ways of facilitating System-2 thinking.12 One example is encouraging decision makers to replace their intuitions with formal analysis — taking into account data on all known variables, providing weights to variables, and quantifying the different choices. This method has been shown to significantly improve decisions in contexts like school admissions and hiring.
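
As a concrete illustration of that kind of formal analysis, each option can be scored on explicitly weighted criteria instead of being judged on intuition alone. The criteria, weights and ratings below are entirely hypothetical:

    # Weighted-score decision analysis: rate each candidate on each criterion,
    # multiply the ratings by explicit weights, and compare the totals.
    weights = {"experience": 0.5, "test_score": 0.3, "interview": 0.2}

    candidates = {
        "A": {"experience": 7, "test_score": 9, "interview": 6},
        "B": {"experience": 9, "test_score": 6, "interview": 8},
    }

    def weighted_score(ratings):
        return sum(weights[criterion] * ratings[criterion] for criterion in weights)

    for name, ratings in candidates.items():
        print(name, round(weighted_score(ratings), 2))   # A -> 7.4, B -> 7.9

Making the weights explicit forces the trade-offs into the open, which is precisely what a snap System-1 judgment does not do.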

Having decision makers take an outsider’s perspective on a decision can reduce overconfidence in their knowledge, in their odds of success, and in their time to complete tasks. Encouraging decision makers to consider the opposite of their preferred choice can reduce judgment errors and biases, as can training them in statistical reasoning. Considering multiple options simultaneously rather than separately can optimize outcomes and increase an individual’s willpower in carrying out a choice. Analogical reasoning can reduce System-1 errors by highlighting how a particular task shares underlying principles with another unrelated one, thereby helping people to see past distracting surface details to more fully understand a problem. And decision making by committee rather than individually can improve decisions in group contexts, as can making individuals more accountable for their decisions.13

In some domains, however, a better approach may be to work with, rather than against, our tendency to make decisions based on visceral reactions. In the health arena, this may involve appealing to people’s gist-based thinking. Doctors and the media bombard health consumers with numerical facts and data, yet according to Reyna, patients — like teenagers — tend initially to overestimate their risks; when they learn their risk for a particular disease is actually objectively lower than they thought, they become more complacent — for instance by forgoing screening. Instead, communicating the gist, “You’re at (some) risk, you should get screened because it detects disease early” may be a more powerful motivator to make the right decision than the raw numbers. And when statistics are presented, doing so in easy-to-grasp graphic formats rather than numerically can help patients (as well as physicians, who can be as statistically challenged as most laypeople) extract their own gists from the facts.14

Complacency is a problem when decisions involve issues that feel more remote from our daily lives — problems like global warming. The biggest obstacle to changing people’s individual behavior and collectively changing environmental policy, according to Columbia University decision scientist Elke Weber, is that people just aren’t scared of climate change. Being bombarded by facts and data about perils to come is not the same as having it affect us directly and immediately; in the absence of direct personal experience, our visceral decision system does not kick in to spur us to make better environmental choices such as buying more fuel-efficient vehicles.15

How should scientists and policymakers make climate change more immediate to people? Partly, it involves shifting from facts and data to experiential button-pressing. Powerful images of global warming and its effects can help. Unfortunately, according to research conducted by Yale environmental scientist Anthony A. Leiserowitz, the dominant images of global warming in Americans’ current consciousness are of melting ice and effects on nonhuman nature, not consequences that hit closer to home; as a result, people still think of global warming as only a moderate concern.16

Reframing options in terms that connect tangibly with people’s more immediate priorities, such as the social rules and norms they want to follow, is a way to encourage environmentally sound choices even in the absence of fear.17 For example, a study by Noah J. Goldstein (Univ. of Chicago), Robert B. Cialdini (Arizona State), and Vladas Griskevicius (Univ. of Minnesota) compared the effectiveness of different types of messages in getting hotel guests to reuse their towels rather than send them to the laundry. Messages framed in terms of social norms — “the majority of guests in this room reuse their towels” — were more effective than messages simply emphasizing the environmental benefits of reuse.18

Yet another approach to getting us to make the most beneficial decisions is to appeal to our natural laziness. If there is a default option, most people will accept it because it is easiest to do so — and because they may assume that the default is the best. University of Chicago economist Richard H. Thaler suggests using policy changes to shift default choices in areas like retirement planning. Because it is expressed as normal, most people begin claiming their Social Security benefits as soon as they are eligible, in their early to mid 60s — a symbolic retirement age but not the age at which most people these days are actually retiring. Moving up the “normal” retirement age to 70 — a higher anchor — would encourage people to let their money grow longer untouched.19

* * *

Making Decisions About the Environment

APS Fellow Elke Weber recently had the opportunity to discuss her research with others who share her concern about climate change, including scientists, activists, and the Dalai Lama. Weber . . . shared her research on why people fail to act on environmental problems. According to her, both cognitive and emotional barriers prevent us from acting on environmental problems. Cognitively, for example, a person’s attention is naturally focused on the present to allow for their immediate survival in dangerous surroundings. This present-focused attitude can discourage someone from taking action on long-term challenges such as climate change. Similarly, emotions such as fear can motivate people to act, but fear is more effective for responding to immediate threats. In spite of these challenges, Weber said that there are ways to encourage people to change their behavior. Because people often fail to act when they feel powerless, it’s important to share good as well as bad environmental news and to set measurable goals for the public to pursue. Also, said Weber, simply portraying reduced consumption as a gain rather than a loss in pleasure could inspire people to act.

References and Further Reading:

  • 7. Stanovich, K.E., & West, R.F. (2000). Individual differences in reasoning: Implications for the rationality debate. Behavioral & Brain Sciences, 23, 645–665.
  • 8. Reyna, V.F., & Lloyd, F. (2006). Physician decision making and cardiac risk: Effects of knowledge, risk perception, risk tolerance, and fuzzy processing. Journal of Experimental Psychology: Applied, 12, 179–195.
  • 9. Reyna, V.F. (2004). How people make decisions that involve risk: A dual-processes approach. Current Directions in Psychological Science, 13, 60–66.
  • 10. Baird, A.A., & Fugelsang, J.A. (2004). The emergence of consequential thought: Evidence from neuroscience. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 359, 1797–1804.
  • 11. Reyna, V.F., & Farley, F. (2006). Risk and rationality in adolescent decision making. Psychological Science in the Public Interest, 7, 1–44.
  • 12. Milkman, K.L., Chugh, D., & Bazerman, M.H. (2009). How can decision making be improved? Perspectives on Psychological Science, 4, 379–383.
  • 13. Ibid.
  • 14. See Wargo, E. (2007). More than just the facts: Helping patients make informed choices. Cornell University Department of Human Development: Outreach & Extension. Downloaded from http://www.human.cornell.edu/hd/outreach-extension/loader.cfm?csModule=security/getfile&PageID=43508
  • 15. Weber, E.U. (2006). Experience-based and description-based perceptions of long-term risk: Why global warming does not scare us (yet). Climatic Change, 77, 103–120.
  • 16. Leiserowitz, A. (2006). Climate change risk perception and policy preferences: The role of affect, imagery, and values. Climatic Change, 77, 45–72.
  • 17. Weber, E.U. (2010). What shapes perceptions of climate change? Wiley Interdisciplinary Reviews: Climate Change, 1, 332–342.
  • 18. Goldstein, N.J., Cialdini, R.B., & Griskevicius, V. (2008). A room with a viewpoint: Using social norms to motivate environmental conservation in hotels. Journal of Consumer Research, 35. Downloaded from http://www.csom.umn.edu/assets/118359.pdf
  • 19. Thaler, R.H. (2011, July 16). Getting the Most Out of Social Security. The New York Times. Downloaded from http://www.nytimes.com/2011/07/17/business/economy/when-the-wait-for-social-security-checks-is-worth-it.html?_r=1&adxnnl=1&adxnnlx=1322835490-9f6qOJ9Sp2jSw4LKDjmYgw

More.

Related Situationist posts:

You can review hundreds of Situationist posts related to the topic of “choice myth” here.

Posted in Behavioral Economics, Choice Myth, History, Ideology, Neuroscience, Public Policy | Tagged: , , , | 1 Comment »

Marines Defiling Dead Taliban – Might Recent Neuroscience Shed Light?

Posted by The Situationist Staff on January 11, 2012

From The Daily Princetonian:

Failure in the part of the brain that controls social functions could explain why regular people might commit acts of ruthless violence, according to a new study by a University research team.

A particular network in the brain is normally activated when we meet someone, empathize with him and think about his experiences.

However, MRI technology showed that when a person encounters someone he deems a drug addict, homeless person or anyone he finds repulsive, parts of this network may fail to activate, creating a pathway to “dehumanized perception” — a failure to acknowledge others’ thoughts and experiences.

According to the study, this process of dehumanizing victims could explain how propaganda portraying Jews as vermin in Nazi Germany and Tutsis of Rwanda as cockroaches led to genocide.

“We all dehumanize other people to some extent,” psychology professor [and Situationist Contributor] Susan Fiske said in an email, noting that it is impossible to delve into the mind of every person we pass.

“That being said, we have shown that people can rehumanize a group they might normally ignore, just by thinking about their preferences, as when a soup kitchen worker thinks about a homeless person’s food preferences.”

Earlier work from the team dealt with social cognition or how individuals perceive the thoughts of others with a study that had individuals think about a day in the life of another person.

The new research attempted to build upon this idea further to include the network in the brain charged with disgust, attention and cognitive control.

To collect their data, the scientists had 119 undergraduates at the University complete judgment and decision-making surveys as they looked at images of individuals such as a firefighter, female college student, elderly man, disabled woman, homeless woman and male drug addict.

This exercise sought to study how the network in the brain involved in social cognition reacted to common emotions shared by participants about the people in the images.

The researchers found that parts of the network in the brain did not activate when participants viewed the images of drug addicts, homeless people and immigrants.

“We all have the capacity to engage in dehumanized perception; it’s not just reserved for serial killers,” Harris said in an email. “There are many routes to dehumanization, and different people may use different routes.”

One such route, according to Harris, may be to avoid thinking about the suffering of others — people who dehumanize homeless people may do this.

Another route could be to view someone as a means to an end. Sports fans may engage in this when they think about trading a favorite player to another team.

Fiske and Harris plan to replicate the study on imprisoned psychopaths, and are continuing to explore the different routes to dehumanized perception.

For a collection of related Situationist posts, see The Interior Situation of Atrocities.

Posted in Conflict, Neuroeconomics, Situationist Contributors | Tagged: , , , | Leave a Comment »

The Interior Situation of Atrocities

Posted by The Situationist Staff on January 10, 2012

From People’s World (an article summarizing recent research by Situationist Contributor Susan Fiske):

Why do people commit atrocities? What is responsible for brutality and the cold blooded murder of innocents carried out by Nazis, the Hutu in Rwanda, or the United States against the Vietnamese people and more recently much of the civilian population of Iraq? Some scientists believe they have found the answer.

ScienceDaily reports (“Brain’s Failure to Appreciate Others May Permit Human Atrocities,” 12-14-2011) that the part of the brain responsible for social interaction with others may malfunction resulting in callousness leading to inhumane actions towards others. Scientists at Duke and Princeton have hypothesized, in a recent study, that this brain area can “disengage” when people encounter others they think are “disgusting” and the resulting violence perpetrated against them is due to thinking these objectified others have no “thoughts and feelings.”

The study, according to ScienceDaily, considers this a “shortcoming” which could account for the genocide and torture of other peoples. Examples of this kind of objectification can be seen in the calling of Jews “vermin” by the Nazis, the Tutsi “cockroaches” by the Hutu, and the American habit of calling others “gooks” (as well as other unflattering terms).

Lasana Harris (Duke) says, “When we encounter a person, we usually infer something about their minds [do they have more than one?] Sometimes, we fail to do this, opening up the possibility that we do not perceive the person as fully human.” I wonder about this? What is meant by fully human? Surely the Hutu, for example, who had lived with the Tutsi for centuries, did not really fail to infer that they had “minds.”

Practicing something called “social neuroscience” which seems to consist of showing different people pictures while they are undergoing an MRI and then drawing conclusions from which areas of the brain do or do not “light up” when asked questions about these pictures, the scientists conducting this study discovered that an area of the brain dealing with “social cognition”– i.e., feelings, thoughts, empathy, etc., “failed to engage” when pictures of homeless people, drug addicts, and others “low on the social ladder” were shown.

Susan Fiske (Princeton) remarked, “We need to think about other people’s experience. It’s what makes them fully human to us.” ScienceDaily adds the researchers were struck by the fact that “people will easily ascribe social cognition– a belief in an internal life such as emotions– to animals and cars, but will avoid making eye contact with the homeless panhandler in the subway.”

More.

Related Situationist posts:

Image from Flickr.

Posted in Altruism, Conflict, Ideology, Implicit Associations, Neuroscience, Social Psychology | Tagged: , , | Leave a Comment »

Mapping the Brain

Posted by The Situationist Staff on November 17, 2011

From Ted Talks:

How can we begin to understand the way the brain works? The same way we begin to understand a city: by making a map. In this visually stunning talk, Allan Jones shows how his team is mapping which genes are turned on in each tiny region, and how it all connects up.

Posted in Neuroscience, Video | Tagged: | Leave a Comment »

The Situation of Michael S. Gazzaniga

Posted by The Situationist Staff on November 15, 2011

From The New York Times, a terrific article about Michael Gazzaniga:

The scientists exchanged one last look and held their breath.

Everything was ready. The electrode was in place, threaded between the two hemispheres of a living cat’s brain; the instruments were tuned to pick up the chatter passing from one half to the other. The only thing left was to listen for that electronic whisper, the brain’s own internal code.

The amplifier hissed — the three scientists expectantly leaning closer — and out it came, loud and clear.

“We all live in a yellow submarine, yellow submarine, yellow submarine ….”

“The Beatles’ song! We somehow picked up the frequency of a radio station,” recalled Michael S. Gazzaniga, chuckling at the 45-year-old memory. “The brain’s secret code. Yeah, right!”

Dr. Gazzaniga, 71, now a professor of psychology at the University of California, Santa Barbara, is best known for a dazzling series of studies that revealed the brain’s split personality, the division of labor between its left and right hemispheres. But he is perhaps next best known for telling stories, many of them about blown experiments, dumb questions and other blunders during his nearly half-century career at the top of his field.

Now, in lectures and a new book, he is spelling out another kind of cautionary tale — a serious one, about the uses of neuroscience in society, particularly in the courtroom.

Brain science “will eventually begin to influence how the public views justice and responsibility,” Dr. Gazzaniga said at a recent conference here sponsored by the Edge Foundation.

And there is no guarantee, he added, that its influence will be a good one.

For one thing, brain-scanning technology is not ready for prime time in the legal system; it provides less information than people presume.

For another, new knowledge about neural processes is raising important questions about human responsibility. Scientists now know that the brain runs largely on autopilot; it acts first and asks questions later, often explaining behavior after the fact. So if much of behavior is automatic, then how responsible are people for their actions?

Who’s driving this submarine, anyway?

In his new book, “Who’s in Charge? Free Will and the Science of the Brain,” being published this month by Ecco/HarperCollins, Dr. Gazzaniga (pronounced ga-ZAHN-a-ga) argues that the answer is hidden in plain sight. It’s a matter of knowing where to look.

* * *

He began thinking seriously about the nature of responsibility only after many years of goofing off.

Mike Gazzaniga grew up in Glendale, Calif., exploring the open country east of Los Angeles and running occasional experiments in his garage, often with the help of his father, a prominent surgeon. It was fun; the experiments were real attempts to understand biochemistry; and even after joining the Alpha Delta Phi fraternity at Dartmouth (inspiration for the movie “Animal House”), he made time between parties and pranks to track who was doing what in his chosen field, brain science.

In particular, he began to follow studies at the California Institute of Technology suggesting that in animals, developing nerve cells are coded to congregate in specific areas in the brain. This work was captivating for two reasons.

First, it seemed to contradict common wisdom at the time, which held that specific brain functions like memory were widely — and uniformly — distributed in the brain, not concentrated in discrete regions.

Second, his girlfriend was due to take a summer job right there near Caltech.

He decided to write a letter to the director of the program, the eminent neurobiologist Roger Wolcott Sperry (emphasizing reason No. 1). Could Dr. Sperry use a summer intern? “He said sure,” Dr. Gazzaniga said. “I always tell students, ‘Go ahead and write directly to the person you want to study with; you just never know.’ ”

At Caltech that summer after his junior year, he glimpsed his future. He learned about so-called split-brain patients, people with severe epilepsy who had surgery cutting the connections between their left and right hemispheres. The surgery drastically reduced seizures but seemed to leave people otherwise unaffected.

Read the article here.

Related Situationist posts:

Mike Gazzaniga on the Split Brain

Posted in Classic Experiments, Neuroscience, Video | Tagged: , | Leave a Comment »

Mike Gazzaniga on the Split Brain

Posted by The Situationist Staff on October 22, 2011

* * *

Related Situationist posts:

Posted in Neuroscience, Video | Tagged: , , | 1 Comment »

Steven Hyman on Neuroethics

Posted by The Situationist Staff on October 16, 2011


From The Science Network:

Steven Hyman is Professor of Neurobiology at Harvard Medical School. Hyman is a former Provost of Harvard University and Director of the National Institute of Mental Health. He is also a member of the Institute of Medicine of the National Academy of Sciences and of the American Academy of Arts and Sciences. Hyman also serves as Editor of the Annual Review of Neuroscience.

Posted in Neuroscience, Video | Tagged: , , | Leave a Comment »

The Neuro-Situation of Wins and Losses

Posted by The Situationist Staff on October 10, 2011

From Montreal Gazette:

A new National Hockey League season is upon us, Major League Baseball playoffs are in full swing and the National Football League’s regular season has been in session for about a month.

As you fixate on your television, watching every move of your favourite athletes and longing for that great play or crucial win that can serve up a rush that can approach orgasm, consider this: New research from Yale University shows even more of your brain than previously thought physically reacts to something perceived as a win or a loss.

A new study, published in the journal Neuron, outlines experiments showing how most of the brain has heightened activity if one wins or loses a competition such as rock-paper-scissors.

It was a broader effect than what was known before to be a reaction of the central part of the brain in releasing dopamine when something good happens, creating a positive feeling in an individual. Conversely, past evidence has also shown this neurotransmitter is suppressed when an unwanted outcome occurs.

The study’s lead author, Timothy Vickery, a post-doctoral fellow at Yale’s psychology department, said it’s possible that the brain has a similar kind of engagement when its owner is watching sports.

“We didn’t look at that directly in this study, but it wouldn’t be very surprising to me if those sorts of second-hand experiences had the same influence, because you’re sort of identifying with your team, and a win for your team is a win for you,” he said.

Vickery said the high engagement sports fans feel when watching a competition likely comes from the previously known function of the basal ganglia, in the middle of the brain, sending out dopamine when a positive outcome is perceived.

It has its roots, he said, in evolutionary tendencies that favour people and animals that are able to make the right choices to improve chances for survival and create results — such as finding food — that induce dopamine-fuelled feelings of joy.

Vickery said the effect can be vicarious when watching other people participate in sports.

“I think it’s fair to say that, to the extent that you experience those wins and losses as your own, it would have a similar effect on your brain as taking your own actions,” he said.

By conducting MRIs on people while they competed against a computer in games such as rock-paper-scissors, the Yale study found that most parts of subjects’ brains, even beyond the basal ganglia, had physical reactions to both wins and losses.

By analyzing the brain as a whole, Vickery said the researchers could determine whether the individual was experiencing a win or a loss, based on subtle differences in the nature of the patterns. He said it is likely this broadly based brain reaction is somehow related to established theories concerning the reward-punishment function at the brain’s centre. The study, however, could not conclude that.
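
“Analyzing the brain as a whole” in this way is, in essence, multivariate pattern classification. The sketch below is a toy version of the idea on synthetic data, not the study’s actual pipeline: a simple classifier is trained to tell “win” patterns from “loss” patterns and is then scored on held-out trials.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_trials, n_features = 200, 50

    # Synthetic data: each trial is an activity pattern with a subtle, distributed
    # difference between wins (label 1) and losses (label 0), plus noise.
    labels = rng.integers(0, 2, n_trials)
    effect = rng.normal(size=n_features)
    patterns = np.outer(labels - 0.5, effect) + rng.normal(scale=2.0, size=(n_trials, n_features))

    X_train, X_test, y_train, y_test = train_test_split(patterns, labels, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out decoding accuracy:", clf.score(X_test, y_test))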

“My suspicion is that it’s not unrelated, that basically that signal gets sent out from the basal ganglia . . . and sort of filters out through the brain, but we don’t know for sure where it’s coming from. There’s still a lot of work to be done.”

More.

Related Situationist posts:

Posted in Abstracts, Neuroscience, Situationist Sports | Tagged: , , | Leave a Comment »

Brain and Blame

Posted by The Situationist Staff on August 11, 2011

From The Atlantic (by David Eagleman):

On the steamy first day of August 1966, Charles Whitman took an elevator to the top floor of the University of Texas Tower in Austin. The 25-year-old climbed the stairs to the observation deck, lugging with him a footlocker full of guns and ammunition. At the top, he killed a receptionist with the butt of his rifle. Two families of tourists came up the stairwell; he shot at them at point-blank range. Then he began to fire indiscriminately from the deck at people below. The first woman he shot was pregnant. As her boyfriend knelt to help her, Whitman shot him as well. He shot pedestrians in the street and an ambulance driver who came to rescue them.

The evening before, Whitman had sat at his typewriter and composed a suicide note:

I don’t really understand myself these days. I am supposed to be an average reasonable and intelligent young man. However, lately (I can’t recall when it started) I have been a victim of many unusual and irrational thoughts.

By the time the police shot him dead, Whitman had killed 13 people and wounded 32 more. The story of his rampage dominated national headlines the next day. And when police went to investigate his home for clues, the story became even stranger: in the early hours of the morning on the day of the shooting, he had murdered his mother and stabbed his wife to death in her sleep.

It was after much thought that I decided to kill my wife, Kathy, tonight … I love her dearly, and she has been as fine a wife to me as any man could ever hope to have. I cannot rationa[l]ly pinpoint any specific reason for doing this …

Along with the shock of the murders lay another, more hidden, surprise: the juxtaposition of his aberrant actions with his unremarkable personal life. Whitman was an Eagle Scout and a former marine, studied architectural engineering at the University of Texas, and briefly worked as a bank teller and volunteered as a scoutmaster for Austin’s Boy Scout Troop 5. As a child, he’d scored 138 on the Stanford-Binet IQ test, placing in the 99th percentile. So after his shooting spree from the University of Texas Tower, everyone wanted answers.

For that matter, so did Whitman. He requested in his suicide note that an autopsy be performed to determine if something had changed in his brain—because he suspected it had.

I talked with a Doctor once for about two hours and tried to convey to him my fears that I felt [overcome by] overwhelming violent impulses. After one session I never saw the Doctor again, and since then I have been fighting my mental turmoil alone, and seemingly to no avail.

Whitman’s body was taken to the morgue, his skull was put under the bone saw, and the medical examiner lifted the brain from its vault. He discovered that Whitman’s brain harbored a tumor the diameter of a nickel. This tumor, called a glioblastoma, had blossomed from beneath a structure called the thalamus, impinged on the hypothalamus, and compressed a third region called the amygdala. The amygdala is involved in emotional regulation, especially of fear and aggression. By the late 1800s, researchers had discovered that damage to the amygdala caused emotional and social disturbances. In the 1930s, the researchers Heinrich Klüver and Paul Bucy demonstrated that damage to the amygdala in monkeys led to a constellation of symptoms, including lack of fear, blunting of emotion, and overreaction. Female monkeys with amygdala damage often neglected or physically abused their infants. In humans, activity in the amygdala increases when people are shown threatening faces, are put into frightening situations, or experience social phobias. Whitman’s intuition about himself—that something in his brain was changing his behavior—was spot-on.

Stories like Whitman’s are not uncommon: legal cases involving brain damage crop up increasingly often. As we develop better technologies for probing the brain, we detect more problems, and link them more easily to aberrant behavior. Take the 2000 case of a 40-year-old man we’ll call Alex, whose sexual preferences suddenly began to transform. He developed an interest in child pornography—and not just a little interest, but an overwhelming one. He poured his time into child-pornography Web sites and magazines. He also solicited prostitution at a massage parlor, something he said he had never previously done. He reported later that he’d wanted to stop, but “the pleasure principle overrode” his restraint. He worked to hide his acts, but subtle sexual advances toward his prepubescent stepdaughter alarmed his wife, who soon discovered his collection of child pornography. He was removed from his house, found guilty of child molestation, and sentenced to rehabilitation in lieu of prison. In the rehabilitation program, he made inappropriate sexual advances toward the staff and other clients, and was expelled and routed toward prison.

At the same time, Alex was complaining of worsening headaches. The night before he was to report for prison sentencing, he couldn’t stand the pain anymore, and took himself to the emergency room. He underwent a brain scan, which revealed a massive tumor in his orbitofrontal cortex. Neurosurgeons removed the tumor. Alex’s sexual appetite returned to normal.

The year after the brain surgery, his pedophilic behavior began to return. The neuroradiologist discovered that a portion of the tumor had been missed in the surgery and was regrowing—and Alex went back under the knife. After the removal of the remaining tumor, his behavior again returned to normal.

When your biology changes, so can your decision-making and your desires. The drives you take for granted (“I’m a heterosexual/homosexual,” “I’m attracted to children/adults,” “I’m aggressive/not aggressive,” and so on) depend on the intricate details of your neural machinery. Although acting on such drives is popularly thought to be a free choice, the most cursory examination of the evidence demonstrates the limits of that assumption.

Alex’s sudden pedophilia illustrates that hidden drives and desires can lurk undetected behind the neural machinery of socialization. When the frontal lobes are compromised, people become disinhibited, and startling behaviors can emerge. Disinhibition is commonly seen in patients with frontotemporal dementia, a tragic disease in which the frontal and temporal lobes degenerate. With the loss of that brain tissue, patients lose the ability to control their hidden impulses. To the frustration of their loved ones, these patients violate social norms in endless ways: shoplifting in front of store managers, removing their clothes in public, running stop signs, breaking out in song at inappropriate times, eating food scraps found in public trash cans, being physically aggressive or sexually transgressive. Patients with frontotemporal dementia commonly end up in courtrooms, where their lawyers, doctors, and embarrassed adult children must explain to the judge that the violation was not the perpetrator’s fault, exactly: much of the brain has degenerated, and medicine offers no remedy. Fifty-seven percent of frontotemporal-dementia patients violate social norms, as compared with only 27 percent of Alzheimer’s patients.

Changes in the balance of brain chemistry, even small ones, can also cause large and unexpected changes in behavior. Victims of Parkinson’s disease offer an example. In 2001, families and caretakers of Parkinson’s patients began to notice something strange. When patients were given a drug called pramipexole, some of them turned into gamblers. And not just casual gamblers, but pathological gamblers. These were people who had never gambled much before, and now they were flying off to Vegas. One 68-year-old man amassed losses of more than $200,000 in six months at a series of casinos. Some patients became consumed with Internet poker, racking up unpayable credit-card bills. For several, the new addiction reached beyond gambling, to compulsive eating, excessive alcohol consumption, and hypersexuality.

What was going on? Parkinson’s involves the loss of brain cells that produce a neurotransmitter known as dopamine. Pramipexole works by impersonating dopamine. But it turns out that dopamine is a chemical doing double duty in the brain. Along with its role in motor commands, it also mediates the reward systems, guiding a person toward food, drink, mates, and other things useful for survival. Because of dopamine’s role in weighing the costs and benefits of decisions, imbalances in its levels can trigger gambling, overeating, and drug addiction—behaviors that result from a reward system gone awry. Physicians now watch for these behavioral changes as a possible side effect of drugs like pramipexole. Luckily, the negative effects of the drug are reversible—the physician simply lowers the dosage, and the compulsive gambling goes away.
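
[A purely illustrative aside, not from the article: for readers who want to see the “weighing costs and benefits” idea made concrete, the short Python sketch below is a hypothetical toy model in which a dopamine-like “reward gain” scales how strongly anticipated rewards count against anticipated losses. Inflating that gain tips a simple chooser toward a gamble it would otherwise decline. All function names and numbers here are invented for illustration.]

import math

def p_gamble(expected_win, expected_loss, reward_gain):
    # Probability of accepting a gamble under a simple softmax cost-benefit rule.
    # reward_gain stands in, very loosely, for the strength of reward signaling.
    subjective_value = reward_gain * expected_win - expected_loss
    return 1.0 / (1.0 + math.exp(-subjective_value))

# A gamble whose expected loss outweighs its expected win.
win, loss = 1.0, 1.5

print(f"normal reward weighting:   p(gamble) = {p_gamble(win, loss, 1.0):.2f}")  # about 0.38
print(f"inflated reward weighting: p(gamble) = {p_gamble(win, loss, 2.5):.2f}")  # about 0.73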

The lesson from all these stories is the same: human behavior cannot be separated from human biology. If we like to believe that people make free choices about their behavior (as in, “I don’t gamble, because I’m strong-willed”), cases like Alex the pedophile, the frontotemporal shoplifters, and the gambling Parkinson’s patients may encourage us to examine our views more carefully. Perhaps not everyone is equally “free” to make socially appropriate choices.

Does the discovery of Charles Whitman’s brain tumor modify your feelings about the senseless murders he committed? Does it affect the sentence you would find appropriate for him, had he survived that day? Does the tumor change the degree to which you consider the killings “his fault”? Couldn’t you just as easily be unlucky enough to develop a tumor and lose control of your behavior?

On the other hand, wouldn’t it be dangerous to conclude that people with a tumor are free of guilt, and that they should be let off the hook for their crimes?

More.


Posted in Emotions, Law, Morality, Neuroscience

David Eagleman on the Brain and the Law

Posted by The Situationist Staff on May 31, 2011

From :

Dr David Eagleman considers some questions relating to law and neuroscience, challenging long-held assumptions about criminality and punishment and predicting a radical new future for the legal system.

[Eagleman’s examples in the first 15 minutes will strike long-term readers of The Situationist as non-novel. For others, that portion of the video may be a useful primer on neurolaw.]


Posted in Implicit Associations, Law, Neuroscience, Video

The Neuro-Situation of Shopping Choices

Posted by The Situationist Staff on May 16, 2011

From ScienceDaily:

Researchers at Oxford University are to study ‘neuromarketing’, a relatively new field of consumer and market research, which uses brain imaging and measurement technology to study the neural processes underlying an individual’s choice.

Neuromarketing claims to reveal how consumers assess, deliberate and choose in a variety of contexts.

According to neuromarketers, this growing industry has the potential to significantly increase the effectiveness of advertising and marketing campaigns. They claim that neuromarketing will provide detailed knowledge about customer preferences and about which marketing activities will stimulate buying behaviour, and will make promotional campaigns more effective. It will be valuable in providing cues for the best placement and pricing of advertisements, and should cut the risk of marketing products that are doomed to fail. In the experts’ view, instead of relying on focus groups, neuromarketing offers the promise of ‘objective neurological evidence’ to inform organisations’ marketing campaigns.

But if neuromarketing is set to revolutionise marketing, what are the implications of this development? The study will cast light on the ‘neuro-turn’ in marketing by conducting fieldwork, interviews and documentary analysis. In addition, a critical, historical assessment will consider and compare how different market research techniques can affect consumers and consumer behaviour.

The project is led by Professor Steve Woolgar, of the Saïd Business School, and is located within a larger collaborative study of the “Neuro-turn in European Social Sciences and the Humanities: Impacts of neurosciences on economics, marketing and philosophy” (acronym: NESSHI) with researchers from other parts of Europe.

Professor Woolgar said: ‘This three-year project will be the first large-scale study of how emerging neurological knowledge about human decision-making is transforming the techniques of marketers and others who seek to influence the behaviour of consumers. It has far-reaching implications for what we know about how humans make their choices, the role of the brain and the factors at play in everyday decisions we all take.’

Dr Tanja Schneider, researcher on the project, said: ‘For a number of years, research has been focussed on brain imaging centres. This is now moving out of the laboratory and into practice. The research we are doing will cast light on what is already happening in this area, and will explore what is likely to develop in the near future. We know this will impact society in a major way, so it is critical to understand these developments better’.

More.

Related Situationist posts.

Posted in Marketing, Neuroscience

Joseph LeDoux on the Neural Situation of Emotion and Memory

Posted by The Situationist Staff on October 19, 2010

Joseph LeDoux is a professor and a member of the Center for Neural Science and Department of Psychology at NYU. His work is focused on the brain mechanisms of emotion and memory. In addition to articles in scholarly journals, he is the author of “The Emotional Brain: The Mysterious Underpinnings of Emotional Life” and “Synaptic Self: How Our Brains Become Who We Are.” He is a fellow of the American Association for the Advancement of Science, a fellow of the New York Academy of Sciences, a fellow of the American Academy of Arts and Sciences, and the recipient of the 2005 Fyssen International Prize in Cognitive Science. LeDoux is also a singer and songwriter in the rock band The Amygdaloids.

* * *

For a sample of related Situationist posts, see “The Situation of Neuroeconomics and Situationist Economics,” “The Interior Situation of Complex Human Feelings,” “The Situation of Memory,” “Accidentally Us,” “The Affective Situation of Ethics and Mediation,” and “Situating Emotion.”

Posted in Emotions, Neuroscience, Video

Rebecca Saxe on Situationism

Posted by The Situationist Staff on June 10, 2010

From the National Science Foundation:

Rebecca Saxe (Carole Middleton Career Development Professor in the Department of Brain and Cognitive Sciences at MIT) discusses the under-appreciated power of situation.

* * *


For a sample of related Situationist posts, see “Zimbardo on Milgram and Obedience – Part II,” “Jon Hanson on Situationism and Dispositionism,” “Hanson’s Chair Lecture on Situationism,” “‘Situation’ Trumps ‘Disposition’ – Part I,” and “‘Situation’ Trumps ‘Disposition’ – Part II.”

Posted in Classic Experiments, Illusions, Neuroscience, Video

The Situational Consequences of Poverty on Brains

Posted by The Situationist Staff on June 9, 2010

Anne McIlroy wrote a piece for the Toronto Globe and Mail describing research by Dr. James Swain, who is using brain imaging techniques to study the effects of poverty on the brain.  Here are some excerpts.

* * *

Over the past four decades, researchers have established how poverty shapes lives: low socioeconomic status is associated with poor academic performance, poor mental and physical health, and other negative outcomes. Swain is part of a new generation of neuroscientists investigating how poverty shapes the brain.

The University of Michigan researcher will use imaging technologies to compare the structure and function of the brains of young adults from families with low socioeconomic status with those of young adults from middle-class families.

* * *

He and other neuroscientists are building on preliminary evidence that suggests the chronic stress of living in an impoverished household, among other factors, can have an impact on the developing brain.

Studies suggest low socioeconomic status may affect several areas of the brain, including the circuitry involved in language, memory and in executive functions, a set of skills that help us focus on a problem and solve it.

* * *

At Michigan, Swain will be looking at many different parts of the brain and the connections between regions.

His volunteers are 52 young adults that one of his colleagues, Gary Evans at Cornell University, has been tracking since they were in their mothers’ wombs. Half of them grew up in poverty, the other half in working or middle-class homes.

As early as next month, Swain will begin two days of brain imaging and tests for each volunteer. He will assess language skills and memory and study how their brains react to pictures of scary faces, and whether that reaction changes when they are stressed. (He’ll stress them by asking them to do mental arithmetic in front of strangers.)

* * *

You can read the entire article here. For a sample of related Situationist posts, see “Inequality and the Unequal Situation of Mental and Physical Health,” “The Interior Situation of Intergenerational Poverty,” “Rich Brains, Poor Brains?,” “Jeffrey Sachs on the Situation of Global Poverty,” “The Situation of Financial Risk-Taking,” “The Situation of Standardized Test Scores,” “The Toll of Discrimination on Black Women,” “The Physical Pains of Discrimination,” “The Depressing Effects of Racial Discrimination,” and “The Cognitive Costs of Interracial Interactions.”

Posted in Distribution, Education, Environment, Neuroscience

The Neuro-Situation of Violence and Empathy

Posted by The Situationist Staff on April 11, 2010

From EurekAlert!:

“Just as our species could be considered the most violent, since we are capable of serial killings, genocide and other atrocities, we are also the most empathetic species, which would seem to be the other side of the coin”, Luis Moya Albiol, lead author of the study and a researcher at the UV, tells SINC.

This study, published in the most recent issue of the Revista de Neurología, concludes that the prefrontal and temporal cortex, the amygdala and other features of the limbic system (such as the insular and cingulate cortex) play “a fundamental role in all situations in which empathy appears”.

Moya Albiol says these parts of the brain overlap “in a surprising way” with those that regulate aggression and violence. As a result, the scientific team argues that the cerebral circuits – for both empathy and violence – could be “partially similar”.

“We all know that encouraging empathy has an inhibiting effect on violence, but this may not only be a social question but also a biological one – stimulation of these neuronal circuits in one direction reduces their activity in the other”, the researcher adds.

This means it is difficult for a “more empathetic” brain to behave in a violent way, at least on a regular basis. “Educating people to be empathetic could be an education for peace, bringing about a reduction in conflict and belligerent acts”, the researcher concludes.

Techniques for measuring the human brain “in vivo”, such as functional magnetic resonance imaging, are making it possible to find out more about the structures of the brain that regulate behaviour and psychological processes such as empathy.

* * *

These findings were published in the latest issue of Revista de Neurología.

For a sample of related Situationist posts, see “The Situation of Morality and Empathy,” “The Situation of Kindness,” “The Situation of Caring,” “New Study Looks at the Roots of Empathy,” “The Situational Effect of Groups,” “The Situational Benefits of Outsiders,” “Racism Meets Groupism and Teamism,” “‘Us’ and ‘Them,’” “Team-Interested Decision Making,” “Some (Interior) Situational Sources of War – Part I,” “The Case for Obedience,” and “March Madness.”

Posted in Abstracts, Conflict, Emotions