A new study may shed light on how much weight we place on our sense of fairness when we are faced with an ethical dilemma.
Researchers from the California Institute of Technology and the University of Illinois at Urbana-Champaign say that their recent work shows that differences in our moral decisions are tied to how much weight our brains place on fairness.
"What was really surprising, in terms of the results that came out, was that it showed our sense of fairness is rooted in emotional processing, and there's a great deal of individual differences," said Steven Quartz, director of the Social Cognitive Neuroscience Laboratory at Cal Tech and the lead author on the study.
The study is published in the May 9 issue of Science.
The researchers conducted brain scans on 26 people as they played an electronic game designed to find out the choices they would make during a food shortage in an orphanage in Uganda.
Subjects chose between taking a greater number of meals away from all of the children equally or taking fewer meals overall when that meant leaving a few children with far less than everyone else.
As subjects were forced to quickly decide which children would lose meals, researchers noted which areas of their brains "lit up" — an indication that these brain areas were playing a part in the final decision.
Researchers found that regions of the brain involved in weighing efficiency — a logical calculation — lit up about the same in everyone. But regions involved with equity — an emotional response — lit up to varying degrees depending on the person.
Those differences, researchers say, reflected how much weight each individual gave to his or her own sense of fairness when making a decision.
"When you see unfairness," said Quartz, "it evokes a negative emotion in you; you feel uncomfortable with it. The amount of 'uncomfortable' is different in different people."
"[The study] provides further evidence for the force of our emotional energy in making difficult moral decisions," said Judy Illes, a neuroscientist at the University of British Columbia, who was not involved in the research. "I think it is an extremely robust and elegant report."
But, she continued, "despite the extremely thoughtful way they conducted their research and their discussion ... I'm compelled to urge caution in making broad conclusions about the ways societies function based on studies of 20-some people in a lab setting in an MRI scanner in Pasadena.
"People may try to take these results and apply them to other kinds of human behavior and phenomena outside the context of this laboratory study."
One of the novel things about this study, said Quartz, was that subjects were not given a long time to think, as with many classroom moral dilemmas. Instead, they had to decide in a little less than 10 seconds which children would be losing meals.
"It captures the idea that you're not allowed to have an infinite amount of time to make a decision. In real life you have to make decisions in a certain amount of time," he said.
Illes raised some questions about the scenario used but still believes the setup is useful overall.
"We typically take longer than 10 seconds to make decisions about how to deliver food to children in northern Uganda," she said.
However, Illes said, "As much as it is discordant with the actual decision making that would be used to deliver food to children in Uganda, it mimics rapid decision making."