Sometimes people say it's okay to sacrifice one life to save five others. Other times, people say it's wrong. Philosophers have debated for decades why hypothetical moral dilemmas that are logically identical can elicit different answers. Now a brain imaging study suggests that people's emotional responses to certain dilemmas guide their reasoning.
Suppose, in a classical moral dilemma, you see a runaway trolley with five frightened people in it headed for a cliff. They can be saved if you hit a switch and send the trolley onto another track where, tragically, another person is standing who would be killed by the trolley. What to do? Most people say that it's worth sacrificing one life to save five others.
But suppose the doomed trolley can be stopped only if you push a bulky person onto the tracks, where his body would halt it but, alas, he would be crushed to death. Although the trade-off is the same, five lives for one, most people say it would be wrong to stop the trolley this way.
Intrigued by the dilemma of the moral dilemmas, a team led by Joshua Greene, a philosophy grad student at Princeton University in New Jersey, used functional magnetic resonance imaging to spy on people's brains while they read and reasoned their way through a number of scenarios. Some resembled the "switch tracks" dilemma, others the "push body" dilemma, and still others had no apparent moral component, such as deciding whether to take a bus or a train to some destination. Emotion areas of the brain lit up while subjects deliberated the body-pushing set of moral dilemmas, but not during the other scenarios, the team reports in the 14 September issue of Science.
"From a utilitarian point of view, these situations are identical," says psychologist Jon Haidt of the University of Virginia in Charlottesville; "they differ only in that one of them feels wrong." Greene points out that the study doesn't resolve whether it's right or wrong to push someone into the path of a trolley, but it does begin to answer a related question: how people decide what's right and wrong.