For many religious people, the popular question “What would Jesus do?” is essentially the same as “What would I do?” That’s the message from an intriguing and controversial new study by Nicholas Epley from the University of Chicago. Through a combination of surveys, psychological manipulation and brain-scanning, he has found that when religious Americans try to infer the will of God, they mainly draw on their own personal beliefs.
Psychological studies have found that people are always a tad egocentric when considering other people’s mindsets. They use their own beliefs as a starting point, which colours their final conclusions. Epley found that the same process happens, and then some, when people try to divine the mind of God. Their opinions on God’s attitudes on important social issues closely mirror their own beliefs. If their own attitudes change, so do their perceptions of what God thinks. They even use the same parts of their brain when considering God’s will and their own opinions.
Religion provides a moral compass for many people around the world, colouring their views on everything from martyrdom to abortion to homosexuality. But Epley’s research calls the worth of this counsel into question, for it suggests that inferring the will of God sets the moral compass to whatever direction we ourselves are facing. He says, “Intuiting God’s beliefs on important issues may not produce an independent guide, but may instead serve as an echo chamber to validate and justify one’s own beliefs.”
Epley asked different groups of volunteers to rate their own beliefs about important issues such as abortion, same-sex marriage, affirmative action, the death penalty, the Iraq War, and the legalisation of marijuana. The volunteers also had to speculate about God’s take on these issues, as well as the stances of an “average American”, Bill Gates (a celebrity with relatively unknown beliefs) and George Bush (a celebrity whose positions are well-known).
In a world where the temptation to lie, deceive and cheat is both strong and profitable, what compels some people to choose the straight and narrow path? According to a new brain-scanning study, honest moral decisions depend more on the absence of temptation in the first place than on people wilfully resisting these lures.
Joshua Greene and Joseph Paxton of Harvard University came to this conclusion by using a technique called functional magnetic resonance imaging (fMRI) to study the brain activity of people who were given a chance to lie. The volunteers were trying to predict the outcomes of coin-flips for money and they could walk away with more cash by lying about their accuracy.
The task allowed Greene and Paxton to test two competing (and wonderfully named) explanations for honest behaviour. The first – the “Will” hypothesis – suggests that we behave morally by exerting control over the desire to cheat. The second – the “Grace” hypothesis – says that honesty is more a passive process than an active one, fuelled by an absence of temptation rather than the presence of willpower. It follows on from a growing body of psychological studies, which suggest that much of our behaviour is governed by unconscious, automatic processes.
Many studies (and several awful popular science articles) have tried to place brain-scanning technology in the role of fancy lie detectors but in almost all of these cases, people are told to lie rather than doing so spontaneously. Greene and Paxton were much more interested in what happens in a person’s brain when they make the choice to lie.
They recruited 35 people and asked them to predict the result of computerised coin-flips while sitting in an fMRI scanner. They were paid in proportion to their accuracy. In some ‘No-Opportunity trials’, they had to make their predictions beforehand, giving them no room for cheating. In other ‘Opportunity trials’, they simply had to say whether they had guessed correctly after the fact, opening the door to dishonesty.
To cover up the somewhat transparent nature of the experiment, Greene and Paxton fibbed themselves. They told the recruits that they were taking part in a study of psychic ability, where the idea was that people were more clairvoyant if their predictions were private and motivated by money. Under this ruse, the very nature of the “study” meant that people had the opportunity to lie, but were expected not to.