Sacred Values

Principles on which we refuse to change our stance are processed via separate neural pathways from those we’re more flexible on, says a new study.

Some of our values can be more flexible than others...

Our minds process many decisions in moral “gray areas” by weighing the risks and rewards involved – so if the risk is lessened or the reward increased, we’re sometimes willing to change our stance. However, some of our moral stances are tied to much more primal feelings – “gut reactions” that remind us of our most iron-clad principles: don’t hurt innocent children, don’t steal from the elderly, and so on.

These fundamental values – what the study calls “sacred” values (whether they’re inspired by religious views or not) – are processed heavily by the left temporoparietal junction (TPJ), which is involved in imagining others’ minds; and by the left ventrolateral prefrontal cortex (vlPFC), which is important for remembering rules. When especially strong sacred values are called into question, the amygdala – an ancient brain region crucial for processing negative “gut” reactions like disgust and fear – also shows high levels of activation.

These results provide some intriguing new wrinkles to age-old debates about how the human mind processes the concepts of right and wrong. See, in many ancient religions (and some modern ones) rightness and wrongness are believed to be self-evident rules, or declarations passed down from on high. Even schools that emphasized independent rational thought – such as Pythagoreanism in Greece and Buddhism in Asia – still had a tendency to codify their moral doctrines into lists of rules and precepts.

But as scientists and philosophers like Jeremy Bentham and David Hume began to turn more analytical eyes on these concepts, it became clear that exceptions could be found for many “absolute” moral principles – and that our decisions about rightness and wrongness are often based on our personal emotions about specific situations.

The epic battle between moral absolutism and moral relativism is still in full swing today. The absolutist arguments essentially boil down to the claim that without some bedrock set of unshakable rules, it’s impossible to know for certain whether any of our actions are right or wrong. The relativists, on the other hand, claim that without some room for practical exceptions, no moral system is adaptable enough for the complex realities of this universe.

But now, as the journal Philosophical Transactions of the Royal Society B: Biological Sciences reports, a team led by Emory University’s Gregory Berns has analysed moral decision-making from a neuroscientific perspective – and found that our minds rely on rule-based ethics in some situations, and practical ethics in others.

The team used fMRI scans to study patterns of brain activity in 32 volunteers as the subjects responded “yes” or “no” to various statements, ranging from the mundane (e.g., “You are a tea drinker”) to the incendiary (e.g., “You are pro-life”).

At the end of the questionnaire, the volunteers were offered the option of changing their stances for cash rewards. As you can imagine, many people had no problem changing their stance on, say, tea drinking for a cash reward. But when they were pressed to change their stances on hot-button issues, something very different happened in their brains:

We found that values that people refused to sell (sacred values) were associated with increased activity in the left temporoparietal junction and ventrolateral prefrontal cortex, regions previously associated with semantic rule retrieval.

In other words, people have learned to process certain moral decisions by bypassing their risk/reward pathways and directly retrieving stored “hard and fast” rules.

This suggests that sacred values affect behaviour through the retrieval and processing of deontic rules and not through a utilitarian evaluation of costs and benefits.

Of course, this makes it much easier to understand why “there’s no reasoning” with some people about certain issues – because it wasn’t reason that brought them to their stance in the first place. You might as well try to argue a person out of feeling hungry.

That doesn’t mean, though, that there’s no hope for intelligent discourse about “sacred” topics – what it does mean is that instead of trying to change people’s stances on them through logical argument, we need to work to understand why these values are sacred to them.

For example, slavery was treated as a sacred necessity across much of the world for thousands of years – but today it is illegal (and considered morally heinous) in almost every country on earth. What changed? Quite a few things, actually – industrialization made hard manual labor less necessary for daily survival; overseas slaving expeditions became less profitable; the idea of racial equality became more popular… the list could go on and on, but it all boils down to a central concept: over time, the needs slavery had been meeting were addressed in modern, creative ways – until at last, most people felt better not owning slaves than owning them.

My point is, if we want to make moral progress, we’ve got to start by putting ourselves in the other side’s shoes – and perhaps taking a more thoughtful look at our own sacred values while we’re at it.


4 Responses to “Sacred Values”

  1. george altman says:

    Very interesting! Certainly makes the case for the need for more empathy in today’s culture – and for the role of mirror neurons in helping the process along. From a political perspective, it brings to mind possible reasons why, in the face of the “facts,” people hold on tightly to certain beliefs. Perhaps they respond from an amygdala response – powerful emotional reactions – because their beliefs are held as sacred.
    Thanks for the post!

  2. chiliv8 says:

    Thanks for the post, very interesting point – how hard-wired are “soft” cognitive concepts such as moral rules in the brain?

    I think the distinction between “hard and fast rules” and slower (i.e., computationally more expensive) “risk/reward rules” makes sense – similar to the fast/slow pathways in emotion processing.
    To your discussion of slavery: it would be interesting to test the item “Are you against slavery?” on people who employ a cleaning lady even though they live in a one-person household. Just kidding. But I know people who have absolutely no problem watching others do their work for them, while arguing from the highest moral standards…

    P.S.: You have a nice list of “neuroscience heroes” :)

    • Ben says:

      I was talking with my mom about this post today, and she raised the point that once a “sacred” belief is established, the act of thinking about that belief often arouses feelings of contentment and pride – so whatever the neurophysiological correlates of a sacred belief are, they seem to be pretty talented at propagating and reinforcing themselves.

      Your “cleaning lady” idea made me think of another interesting point – whenever we start talking about “soft” cognitive concepts, it’s all too easy to slip from actual data into debates about semantics (see also: macroevolution/microevolution; taxonomy). I think it’d be interesting to study the differences between people’s conceptions of what exactly constitutes slavery in terms of pay/living conditions/etc. – I suspect we’d find quite a bit of variance by country, demographic, and so on.

      Yeah, I know… as usual, I’m getting all analytical about a silly offhand comment. What can I say – I love me some Wittgenstein…
