The best things in life can’t be bought…but everyone has his or her price. The age-old wisdom about values and money can be contradictory. Fortunately, new research from both sides of the Atlantic helps clear things up – scientists using brain scanning technology have found that people will give up some values for money but not others. The values that people won’t sell light up regions of the brain associated with rules, not with utilitarian choices, implying that those values aren’t the products of mere cost-benefit analyses. Some “sacred” choices and values, it seems, just aren’t for sale.
For a long time, economists, psychologists, and other social scientists have assumed that humans are driven to make decisions mostly by utilitarian criteria. That is, choices ranging from whether or not to buy organic foods to which part of town to live in would be the products of a complex, rational cost-benefit analysis, with decision-makers taking into account only such quantifiable factors as cost, nutrition, or the availability of good schools and transit.
When it comes to moral choices, such as whether or not to slap someone for talking on a cell phone in an otherwise quiet coffee shop, social scientists have often assumed that the same utilitarian criteria would apply. In these cases, the costs and benefits have to do with punishments and rewards: how severe would the legal response be? How much satisfaction would we get from expressing our frustration? Depending on how these equations play out, we choose to behave according to social norms, or not.
But in recent years new models for human decision-making have begun to gain popularity. In addition to the utility theory described above, a theory of “deontology” may explain many choices that people make. Deontology is the study of things that are right or wrong – morals, rules, and values. A deontic motivation for a choice is one that stems from internal commitments to a moral system rather than from pure cost-benefit analysis. Of course, different styles of thinking and decision-making are related to different processes in the brain. In theory, then, choices that arise from deontic commitments (“It would just be wrong to punch that guy!”) should activate different regions of the brain than decisions based on brute utilitarian analysis (“If I get a $500 fine, but I feel ten times better afterward, then it just might be worth it”).
A diverse group of scientists from Emory University in Atlanta, the New School in New York, and the CNRS-École Normale Supérieure in Paris found this hypothesis interesting enough to test the relationship between neurobiology and types of decision-making using functional magnetic resonance imaging, or fMRI. In a paper published this month in Philosophical Transactions of the Royal Society, these researchers reported that there are, indeed, certain “sacred values” that are deontic, rather than utilitarian, in nature, and that these values aren’t susceptible to change, even by monetary temptations.
After recruiting a large group of volunteers, the researchers had participants answer a series of questions while undergoing fMRI scans. The questions, presented in randomized order, asked the participants to choose between pairs of opposing statements. Some statements were innocuous, such as “I am a Coke drinker” versus “I am a Pepsi drinker,” while others were far more value-laden: “I believe in God” versus “I do not believe in God,” or “I would (or would not) be willing to kill an innocent person.” After completing the first rounds of tests, the volunteers were asked to state, hypothetically, what amount of money they would need to renege on their previous choices. For example, participants who indicated that they believed in God were asked how much money they would need to deny God’s existence for the rest of their lives.
The final phase of the experiments featured an auction, in which participants indicated the dollar amount they would accept in a bid to change their choices. The difference between this phase of the experiment and the hypothetical “sale” of choices was that the auction wasn’t hypothetical – volunteers had a chance to actually win money for their decision to retract their prior values. There was a catch, though: in order to actually cash in on their winnings, participants were required to print out and sign a copy of the auction form, effectively committing themselves in a personal, irrevocable way to their decisions. The researchers reasoned that this facet of the experiment would measure integrity, or the extent to which participants’ actual, real-life decisions matched their stated values.
As you might expect, some volunteers were willing to sell just about any of their values (no, the paper doesn’t say whether or not these individuals were marketing majors). But the majority clung tightly to a few, deeply held values, which no amount of money would convince them to give up. These values, eliciting high levels of integrity as defined by the researchers, were described as “sacred values.” (Keep in mind that these values didn’t necessarily have anything to do with God or religion, although there certainly was some overlap. Rather, any value choice for which study participants were unwilling to accept any monetary tradeoff to retract was described as a sacred value.)
Examining the fMRI data after the experiments, the researchers found a strong pattern of correlations. When volunteers had been presented with paired choices they would later treat as sacred values, two important regions of their brains were activated: the temporal-parietal junction (TPJ) and the ventrolateral prefrontal cortex (VLPFC). The TPJ is associated with processing moral judgments of right and wrong, while the VLPFC is involved in retrieving and applying semantic rules. In other words, subjects’ brains appeared to treat sacred values as fixed rules about right and wrong, not as utilitarian, cost-benefit decisions.
Meanwhile, when subjects processed decisions that they were later happy to repudiate for cash, the orbitofrontal cortex and inferior parietal lobe were activated. These parts of the brain, the researchers reasoned, were deployed to process questions of rational utility.
This study is the first to concretely link neurobiology to deontic versus utilitarian decision-making. The results strongly suggest that values we consider “sacred” are processed in the brain as deontic, rule-based, and moral, and that no amount of utilitarian, cost-benefit reasoning is likely to change our minds about them. If future research backs up these findings, it will have important implications for cross-cultural dialogue, political and religious decision-making, and a wide variety of other spheres. In short, the carrot-and-stick model for influencing people’s values and decisions may be a bit unsophisticated. Some values, religious and otherwise, just can’t be coerced, bought, or tempted away from us. They’re bigger than that. They’re sacred.