Beyond Good And Evil: New Science Casts Light On Morality In The Brain

Harvard brain scientist Joshua Buckholtz has never forgotten a convict he met back when he was an undergrad conducting psychological tests in prisons. The man had beaten another man nearly to death for stepping on his foot in a dance club.

"I wanted to ask him," he recalls, "'In what world was the reward of beating this person so severely, for this — to me — minor infraction, worth having terrible food and barbed wire around you?' "

But over the years, Buckholtz became convinced that this bad deed was a result of faulty brain processing, perhaps in a circuit called the frontostriatal dopamine system. In an impulsive person's brain, he says, attention just gets so narrowly focused on an immediate reward that, in effect, the future disappears.

He explains: "If you had asked this person, ‘What will happen if you beat someone nearly to death?’ they would have told you, ‘Oh, I’ll be put away.’ It’s not that these people who commit crimes are dumb, but what happens is, in the moment, that information about costs and consequences can’t get into their decision-making."

For two decades, researchers have scanned and analyzed the brains of psychopaths and murderers, but they haven’t pinpointed any single source of evil in the brain. What they've found instead, as Buckholtz puts it, "is that our folk concepts of good and evil are much more complicated, and multi-faceted, and riven with uncertainty than we ever thought possible before."

In other words, so much for the old idea that we have an angel on one shoulder and a devil on the other, and that morality is simply a battle between the two. Using new technology, brain researchers are beginning to tease apart the biology that underlies our decisions to behave badly or do good deeds. They're even experimenting with ways to alter our judgments of what is right and wrong, and our deep gut feelings of moral conviction.

One thing is certain: We may think in simple terms of "good" and "evil," but that's not how it looks in the brain at all.

In past years, as neuroscientists and psychologists began to delve into morality, "Many of us were after a moral center of the brain, or a particular system or circuit that was responsible for all of morality," says assistant professor Liane Young, who runs The Morality Lab at Boston College. But "it turns out that morality can’t be located in any one area, or even set of areas — that it's all over, that it colors all aspects of our life, and that's why it takes up so much space in the brain."

So there's no "root of all evil." Rather, says Buckholtz, "When we do brain studies of moral decision-making, what we are led into is an understanding that there are many different paths to antisocial behavior."

If we wanted to build antisocial offenders, he says, brain science knows some of the recipe: They’d be hyper-responsive to rewards like drugs, sex and status — and the more immediate, the better. "Another thing we would build in is an inability to maintain representations of consequences and costs," he says. "We would certainly short-circuit their empathic response to other people. We would absolutely limit their ability to regulate their emotions, particularly negative emotions like anger and fear."

At his Harvard lab, Buckholtz is currently studying the key ability that long-ago convict lacked — to weigh future consequences against immediate gratification. In one ongoing experiment, he’s testing whether he can use electrical stimulation to alter people’s choices.

The researchers paste electrodes on the subject’s head to zap (harmlessly) an area of the brain involved in decision-making, the dorsolateral prefrontal cortex. Then they ask questions along the lines of, "I have a $20 bill here. It’s yours today, but you can wait three weeks and I’ll give you $25. Which would you choose?"

The idea is to use the brain stimulation to make the subject value a future reward more, compared to an immediate one. With techniques like this, researchers aim to identify and ultimately tweak circuits in the brain involved in moral decision-making.

"One of the long-term goals of this research is to help people who chronically make bad decisions," Buckholtz says. "Veterans with traumatic brain injuries, for example. They have terrible trouble weighing these cost-benefit kind of decisions, and the impulsivity you see in these patients, we think, might come of the fact that they’re unable to keep in mind these long-term representations of cost and consequence when faced with immediate rewards."

Stop That Trolley!

Research suggests that brain biology shapes not just our choices but also our moral judgments about what is right and wrong.

For example, Harvard psychology professor Joshua D. Greene uses a classic moral dilemma to illustrate how two very different systems in the brain can affect how we make those judgments. He presented it recently to a Harvard Alumni Association audience:

As you watch from a footbridge with a big man standing next to you, a runaway trolley heads down the track toward five people. "And the only way to save these five people is to push this big guy off," he says. "He lands on the tracks and gets squashed by the train and he dies, but blocks the trolley so the five people can live."

Most people say it’s not OK to push the man in front of the trolley. Why? Greene says it’s a gut response, a snap judgment.

"You get this response in a part of the brain called the amygdala, which is like your brain’s early warning alarm system that something needs attention and it might be bad," he explains. "You get a strong amygdala response when people think about pushing the guy off the footbridge."

But, Greene says, there's another decision-making system in a different part of the brain — the prefrontal control network — which enables us, with deliberation, to conclude that it makes more sense to save more lives.

"It’s a kind of fast decision that makes you say, 'No, don’t push the guy off the footbridge,' " he says, "and it’s a kind of slow decision that makes you say, 'But you can save more lives! Five versus one.' "

Change the brain chemistry with commonly used drugs, and you can shift the decision.

"If you give people a drug called citalopram," which can increase negative emotions short-term, Greene says, "and they say no, you can’t push the guy off the footbridge, or they’re more likely to. If you give people an anti-anxiety drug, then they are more likely to say, 'Sure you can push the guy off the footbridge.' "

That's Just Wrong!

At The Morality Lab at Boston College, Liane Young's team is finding that changing your brain biology can even affect how you feel about your judgments of right and wrong — for example, about deeply felt moral positions on issues like abortion, or eating meat.

Graduate student Jordan Theriault demonstrates on lab manager Amelia Brown: using a transcranial magnetic stimulation device about the size of two doughnuts stuck together with a handle, he delivers a magnetic field to stimulate a brain area just above the subject’s right ear. Meanwhile, she reads statements and decides whether they’re more like fact or opinion:

- It is wrong to cheat at games such as Monopoly.
- It is not OK for doctors to accidentally kill a small number of patients per year.
- Dog racing is exploitative and harmful to the dogs being raced.
- Parents should be willing to make sacrifices for the benefit of a baby.
- Universal donors should not be obligated to donate blood.

The researchers are finding that if they disrupt the brain areas that we use to figure out other people's intentions and perspectives, our own opinions tend to feel more like objective facts.

"We’re starting to see that we can shift around people’s perceptions of what feels objectively right," Young says.

And that may be hard for many people to accept.

"People tend to think of moral judgments as very central to their character," she says. "So showing people that their judgments and their perceptions are flexible when it comes to morality — it's pretty striking."

The Really Big Question

So here’s the big question that gives me no peace: If it’s all just biology at work, are we still to blame if we commit a crime? And the corollary: Can we still take credit when we do good?

I must confess, I pestered Buckholtz quite a bit on this. Finally, patiently but firmly, he said that these questions keep him up at night, too, but: "These questions of will and credit and blame, while important, are in some sense, above my paygrade. They’re outside the purview of the things that I as a neuroscientist can speak to. And I don’t think that at the end of the day, neuroscience is going to give you a definitive, coherent, final answer about these questions that will make you comfortable."

He suggested asking a philosopher. So I did.

Daniel Dennett is a professor of philosophy at Tufts University who incorporates neuroscience into his thinking and is a seasoned veteran of the debates around free will. He says it’s not news that our morality is based in our brains, and he doesn’t have much patience for excuses like, “My brain made me do it.”

"Of course my brain made me do it!" he says. "What would you want, your stomach to make you do it!?"

The age-old debate over free will is still raging in philosophical circles, with new brain science in the mix. But Dennett argues that science doesn’t change the basic facts:

"If you do something on purpose and you know what you’re doing, and you did it for reasons good, bad or indifferent, then your brain made you do it," he says. "Of course. And it doesn’t follow that you were not the author of that deed. Why? Because you are your embodied brain."

Dennett’s not worried that we’ll all be letting ourselves, and each other, off the hook.

"Responsibility is alive and well and will go on being alive and well," he says.

But what we're learning about the biological basis of moral behavior may shift how we feel about punishing wrongdoers.

A recent study finds that learning about neuroscience makes people recommend shorter prison sentences for criminals. It concludes that as knowledge of neuroscience filters into society, attitudes about moral responsibility may change: Punishment may become less about retribution and more about practical considerations — deterring crime or keeping dangerous criminals off the streets.

Harvard’s Greene sees potential benefit in this: "If understanding that these are mechanical failures, as opposed to a deep failure of your soul, helps people be more understanding and constructive," he says, "then that’s good."

Carey Goldberg is the editor of WBUR's CommonHealth section.
