
When Morality Is Hard to Like

How do we juggle evidence and emotions to make a moral decision?

On August 2, 1939, as the specter of the Second World War loomed, Albert Einstein wrote President Franklin D. Roosevelt a letter he knew could affect the war and the future of humanity. The subject was the possibility of developing nuclear weapons. “Certain aspects of this situation,” Einstein wrote,

... seem to call for watchfulness and, if necessary, quick action on the part of the Administration. I believe therefore that it is my duty to bring to your attention the following facts and recommendations....

Einstein’s letter encapsulates key aspects of moral judgment: moral sentiment (his concern about the outcome of World War II); recognition of a moral dilemma (whether to disclose scientific evidence that could lead to a fearsomely lethal new weapon) and a utilitarian calculus (Would more lives be spared if America rather than Germany eventually built such a weapon?). It must have been a terrible struggle deciding whether to write that letter.




Nearly seven decades later, cognitive neuroscience is gaining the ability to explain the brain mechanisms that underlie such moral judgments and ethical deliberations. Empirical studies have examined issues such as how a sense of morality arises in a child’s developing brain, how various kinds of brain damage affect moral judgment, which brain areas seem to be at play when we feel moral disgust, and how we think our way through confusing moral dilemmas. The results have been compelling; carry out an Internet search for “brain and morality,” and you will get a taste of this rich and growing literature.

A crucial issue that remains poorly understood, however, is the relation between moral reasoning and emotion. How does emotion affect our judgment about what is moral? A study published last April in Nature offers important new insight into this question. Michael Koenigs, now a postdoctoral fellow at the National Institute of Neurological Disorders and Stroke, Liane Young, a graduate student in cognitive psychology at Harvard University, and their colleagues found that damage to a brain area known as the ventromedial prefrontal cortex (VMPFC, a region of the prefrontal cortex located above our eye sockets) increases a preference for “utilitarian” choices in moral dilemmas—judgments that favor the aggregate welfare over the welfare of fewer individuals. The study adds to an already hot debate about how we juggle facts and emotion to make moral decisions.

Rationalizing Morality
Koenigs, Young and their collaborators gave a test on moral decision making to three different groups of people: six patients with bilateral VMPFC damage, another group of neurologically normal control subjects and a group of patients with lesions in other brain regions. The test subjects confronted decision-making scenarios in four main classes. One class contained “high conflict” (morally ambiguous) and emotionally salient “personal” moral scenarios, such as whether to push a bulky stranger onto the track of a runaway trolley (thus killing the stranger) if doing so would save the lives of five workers down the line. A second class contained “low conflict” (morally unambiguous) but highly personal scenarios, such as whether it would be moral for a man to hire someone to rape his wife so he could later comfort her and win her love again. A third class offered morally ambiguous but relatively nonpersonal scenarios, such as whether it would be okay to lie to a security guard and “borrow” a speedboat to warn tourists of a deadly impending storm. A fourth class consisted of ambiguous but nonmoral scenarios, such as whether to take the train instead of the bus to arrive somewhere punctually.

In the clear-cut, low-conflict personal scenarios, the VMPFC patients and controls performed alike, unanimously responding “no” to examples such as the one mentioned above. But when pondering the more emotionally charged high-ambiguity situations, the VMPFC patients were much more likely than others to endorse utilitarian decisions that would lead to greater aggregate welfare. They were far more willing than others were, for instance, to push that one bulky stranger into the path of the trolley to save the group of workers down the track.

Reason vs. Emotion?
Why should people who have damage to the VMPFC show greater preference for utilitarian choices? It is tempting to attribute this preference to a general emotional blunting—a trait commonly found in patients with prefrontal damage. Reduced emotion would presumably make these patients more prone to utilitarian reasoning. But an earlier study that Koenigs and Daniel Tranel, a neurology professor at the University of Iowa Hospitals and Clinics, did with VMPFC-damaged patients argues otherwise. In that study, VMPFC patients played the “ultimatum game.”

In this game, a pair of players is offered a sum of money. Player A proposes some division of the money with player B; if player B rejects the proposed division, neither player gets any money. For player B, the strictly utilitarian decision is to accept any proposal, even if he or she gets only 1 percent of the money, because rejecting the offer means no gain at all. But most people will reject highly imbalanced offers because such offers offend their sense of fairness. The VMPFC players, however, rejected imbalanced offers more often than control subjects did—apparently because they allowed an insult over the inequitable but profitable proposal to overrule utilitarian reason. Overall emotional dullness and increased utilitarian reasoning thus seem unlikely explanations for the behavior of VMPFC patients.
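The contrast between the strictly utilitarian choice and the fairness-driven choice in the ultimatum game can be made concrete in a minimal sketch. The function names and the fairness threshold below are illustrative assumptions, not parameters reported in the Koenigs and Tranel study.

```python
def utilitarian_accepts(offer_share):
    """A strictly utilitarian responder accepts any positive offer,
    since rejecting means no gain at all."""
    return offer_share > 0

def fairness_driven_accepts(offer_share, fairness_threshold=0.3):
    """A stylized typical responder: reject offers below a fairness
    threshold, even though rejecting earns nothing."""
    return offer_share >= fairness_threshold

# Compare the two decision rules across a few proposed splits of a 100-unit pot.
pot = 100
for share in (0.01, 0.2, 0.5):
    print(f"offer {share * pot:.0f}/{pot}: "
          f"utilitarian={utilitarian_accepts(share)}, "
          f"fairness-driven={fairness_driven_accepts(share)}")
```

On this sketch, only the 1 percent offer separates the two rules: the utilitarian responder takes it, while the fairness-driven responder walks away with nothing, which is exactly the pattern the VMPFC patients exaggerated.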

A more parsimonious account, hypothesized in a Nature Reviews Neuroscience paper, is that reason and emotion cooperate to produce moral sentiments. The VMPFC would be especially important for the so-called prosocial sentiments. These feelings include guilt, compassion and empathy, and they emerge when states such as sadness and affiliation, which arise in limbic areas, are integrated with other mechanisms mediated by anterior sectors of the VMPFC, such as prospective evaluation of salient outcomes. Functional imaging studies support this idea. As we describe in a 2007 paper in Social Neuroscience and in previous research, the VMPFC is engaged not just when people make explicit moral judgments but also when they are passively exposed to stimuli evocative of prosocial moral sentiments (such as a hungry child). Interestingly, the anterior VMPFC was engaged when volunteers chose to sacrifice money to donate to charities—a decision that is both utilitarian and emotional—as we describe in a 2006 paper in the Proceedings of the National Academy of Sciences USA.

This framework could explain the otherwise puzzling results of the two Koenigs studies: damage to the ventral part (or underside) of the prefrontal cortex impairs prosocial sentiments while sparing the capacity for aversive emotional reactions such as anger or frustration, which rely more on lateral sectors of the PFC and on subcortical connections. The VMPFC-damaged patients playing the ultimatum game, for instance, let emotions such as anger and contempt steer nonutilitarian decisions to reject unfair offers. When facing difficult moral dilemmas, however, VMPFC patients were more utilitarian because the damage to the ventral parts of their prefrontal cortex reduced their prosocial sentiments, giving a relative advantage to coldhearted reasoning.

Not So Simple Juggling
This explanation returns us to Einstein’s dilemma. Einstein’s letter to Roosevelt helped to prepare the U.S. to build the first atomic bombs. Those bombs killed tens of thousands of civilians—but in doing so, they ended World War II. Was Einstein’s utilitarian choice cold-blooded, resulting from emotions being overpowered by pure cognition? We do not think so. Einstein’s reason and sentiments seem to have been working together just fine, reflecting fully the interplay of thought, emotion, empathy and foresight—as well as anguish and ambivalence—that complex moral decisions entail.

(Further Reading)

  • Opinion: The Neural Basis of Human Moral Cognition. J. Moll, R. Zahn, R. de Oliveira-Souza, F. Krueger and J. Grafman in Nature Reviews Neuroscience, Vol. 6, No. 10, pages 799–809; October 2005.

  • Human Fronto-mesolimbic Networks Guide Decisions about Charitable Donation. J. Moll, F. Krueger, R. Zahn, M. Pardini, R. de Oliveira-Souza and J. Grafman in Proceedings of the National Academy of Sciences USA, Vol. 103, No. 42, pages 15623–15628; October 17, 2006.

  • The Science of Good and Evil. Michael Shermer. Times Books, 2006.

  • The Self as a Moral Agent: Linking the Neural Bases of Social Agency and Moral Sensitivity. J. Moll, R. de Oliveira-Souza, G. J. Garrido, I. E. Bramati, E.M.A. Caparelli-Daquer, M.L.M.F. Paiva, R. Zahn and J. Grafman in Social Neuroscience, Vol. 2, Nos. 3–4, pages 336–352; January 2007.

  • Irrational Economic Decision-Making after Ventromedial Prefrontal Damage: Evidence from the Ultimatum Game. Michael Koenigs and Daniel Tranel in Journal of Neuroscience, Vol. 27, No. 4, pages 951–956; January 24, 2007. Available at http://tinyurl.com/2uzjy3

  • Damage to the Prefrontal Cortex Increases Utilitarian Moral Judgments. M. Koenigs, L. Young, R. Adolphs, D. Tranel, F. Cushman, M. Hauser and A. Damasio in Nature, Vol. 446, pages 908–911; April 19, 2007.

This article was originally published with the title “When Morality Is Hard to Like” in SA Mind Vol. 19, No. 1 (February 2008), p. 30
doi:10.1038/scientificamericanmind0208-30