Our lives are governed by both the fast and the slow – by quick, intuitive decisions based on our gut feelings, and by deliberate, ponderous ones based on careful reflection. How do these varying speeds affect our choices? Consider the many situations in which we must weigh our own self-interest against the public good, from giving to charity to paying taxes. Are we naturally prone to selfishness, behaving altruistically only through slow acts of self-control? Or do we intuitively reveal our better angels, giving way to self-interest as we take time to think?
According to David Rand from Harvard University, it’s the latter. Through a series of experiments, he has found that, on average, people behave more selflessly if they make decisions quickly and intuitively. If they take time to weigh things up, cooperation gives way to selfishness. The title of his paper – “Spontaneous giving and calculated greed” – says it all.
If you ask someone to guess the number of sweets in a jar, the odds that they’ll land upon the right number are low – fairground raffles rely on that inaccuracy. But if you ask many people to take guesses, something odd happens. Even though their individual answers can be wildly off, the average of their varied guesses tends to be surprisingly accurate.
This phenomenon goes by many names – swarm intelligence, wisdom of the crowd, vox populi, and more. Whatever it’s called, the principle is the same: a group of people can often arrive at more accurate answers and better decisions than individuals acting alone. There are many examples, from counting beans in a jar, to guessing the weight of an ox, to the Ask The Audience option in Who Wants to be a Millionaire?
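The averaging effect is easy to sketch numerically. The simulation below uses a made-up jar count, crowd size and noise level (none of these come from any real study) to show how individual errors cancel out in the average:

```python
import random

random.seed(42)

TRUE_COUNT = 500   # hypothetical number of sweets in the jar
N_GUESSERS = 1000  # hypothetical crowd size

# Each person guesses with a lot of individual noise (here, +/- 60%).
guesses = [TRUE_COUNT * random.uniform(0.4, 1.6) for _ in range(N_GUESSERS)]

crowd_estimate = sum(guesses) / len(guesses)
avg_individual_error = sum(abs(g - TRUE_COUNT) for g in guesses) / len(guesses)

print(f"Crowd average:            {crowd_estimate:.0f}")
print(f"Crowd error:              {abs(crowd_estimate - TRUE_COUNT):.0f}")
print(f"Typical individual error: {avg_individual_error:.0f}")
```

Because the guessers' errors are independent, the overshoots and undershoots largely cancel, and the crowd's average lands far closer to the true count than a typical individual does.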
But all of these examples are somewhat artificial, because they involve decisions that are made in a social vacuum. Indeed, James Surowiecki, author of The Wisdom of Crowds, argued that wise crowds are ones where “people’s opinions aren’t determined by the opinions of those around them.” That rarely happens. From votes in elections, to votes on social media sites, people see what others around them are doing or intend to do. We actively seek out what others are saying, and we have a natural tendency to emulate successful and prominent individuals. So what happens to the wisdom of the crowd when the crowd talks to one another?
In 2007, one Jamie Langridge became $50,000 richer after winning an intense national tournament in Las Vegas. Langridge beat his opponent decisively, with a classic open-hand technique. The sport? Rock-paper-scissors.
Rock-paper-scissors seems deceptively simple. Pairs of opponents display one of three hand gestures. Paper covers rock, rock blunts scissors, and scissors cut paper. It’s so straightforward that children the world over learn to play it. But this is not just a game of chance. Played at the highest level, it becomes a game of psychological strategy, one that justifies five-figure trophies in large competitions and even the publication of strategy guides.
Such advanced games are possible because people don’t choose their hand shapes randomly. They are affected by moves that have gone before, and what other people are doing. Consider a new experiment by Richard Cook at University College London. Cook asked 45 people to face off against each other in several rounds of rock-paper-scissors, in exchange for real money. In every game, either one or both players were blindfolded.
Cook found that the players drew with each other more often when one of them could see (36.3% of the matches) than when both were blindfolded (33.3% of them). The latter figure was exactly the proportion of draws you’d expect if the players were choosing randomly; the former was significantly higher than chance.
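That 33.3% baseline follows from simple counting: with two players choosing independently and uniformly at random, 3 of the 9 equally likely move pairs are draws. A minimal enumeration confirms it:

```python
from itertools import product

moves = ["rock", "paper", "scissors"]

# Enumerate all 9 equally likely pairs of independent random moves.
outcomes = list(product(moves, moves))
draws = [pair for pair in outcomes if pair[0] == pair[1]]

p_draw = len(draws) / len(outcomes)
print(f"Expected draw rate under random play: {p_draw:.1%}")  # 33.3%
```

Any draw rate reliably above that one-in-three mark, like the 36.3% in the sighted games, means the players' choices are not independent of each other.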
Countries around the world have tried many tactics to encourage people to vote, from easier access to polling stations to mandatory registration. But Christopher Bryan from Stanford University has found a startlingly simple weapon for increasing voter turnout – the noun. Through a simple linguistic tweak, he managed to increase the proportion of voters in two groups of Americans by at least 10 percentage points.
During the 2008 presidential election, Bryan recruited 34 Californians who were eligible to vote but hadn’t registered yet. They all completed a survey which, among other questions, asked them either “How important is it to you to be a voter in the upcoming election?” or “How important is it to you to vote in the upcoming election?”
It was the tiniest of tweaks – the noun-focused “voter” versus the verb-focused “vote” – but it was a significant one. Around 88% of the noun group said they were very or extremely interested in registering to vote, compared to just 56% of the verb group.
As a visitor to the USA, one sometimes gets the feeling that it’s hard to move or look around without seeing a flag. They are seemingly everywhere, an omnipresent reminder of national identity. But the star-spangled banner is more than a symbol; it can also influence minds in unexpected ways. Travis Carter from the University of Chicago has found that when people think about voting decisions, the mere sight of the American flag can subtly shift their political views… towards Republicanism. It’s an effect that holds in both Democrats and Republicans, it affects actual votes, and it lasts for at least 8 months.
In the run-up to the 2008 US presidential election, Carter recruited a group of around 200 volunteers and asked them about their political views. A month or so later, he split them into two groups that were comparable in terms of their political beliefs, voting intentions and other variables. Both groups rated how likely they were to vote for either the Democrat Barack Obama or the Republican John McCain on an online questionnaire. The questionnaires were identical except for one small detail – in the top left corner of the screen, one group saw a small American flag and the other saw nothing.
That tiny difference was enough to swing their voting preferences. Carter found that the volunteers who saw the tiny flag became more likely to vote for McCain than Obama (relative to their answers at the start of the experiment). They claimed to feel more positive towards the Republicans, and even when Carter tested their unconscious attitudes, a small Republican bias still came through.
I’m at a supermarket, and I want bacon. There’s Danish or British, streaky or back, smoked or unsmoked. My quest for bread leads to a choice between white, brown, seeded, malt, thick-sliced or thin-sliced. Lettuce: romaine, gem, iceberg. Tomatoes: cherry vine, classic, baby plum, organic.
It should not be this complicated to assemble a BLT.
People in Western countries drown in choice. Want a T-shirt? Thousands of alternatives await you. Want some toothpaste? Sit down, we could be here a while. Many people see these options as a good thing – they’re a sign of our independence, our freedom, our mastery over our own destinies. But these apparent positives have a dark side.
Krishna Savani from Columbia University has found that when Americans think about the concept of choice, they’re less concerned about the public good and less empathic towards disadvantaged people. His work supports the idea that endless arrays of choice focus our attention on individual control and, by doing so, they send a message that people’s fates are their own concerns. Their lives are not the business of the state or public institutions, and if they fail, it is their own fault. With choices at hand, Americans are more likely to choose themselves.
There’s an old trope that says justice is “what the judge ate for breakfast”. It was coined by Jerome Frank, himself a judge, and it’s a powerful symbol of the legal realism movement. This school of thought holds that the law, being a human concoction, is subject to the same foibles, biases and imperfections that affect everything humans do. We’d love to believe that a judge’s rulings are solely based on rational decisions and written laws. In reality, they can be influenced by irrelevant things like their moods and, as Frank suggested, their breakfasts.
The graph above is almost the visual embodiment of Frank’s catchphrase. It’s the work of Shai Danziger from Ben Gurion University of the Negev, and summarises the results of 1,112 parole board hearings in Israeli prisons, over a ten-month period. The vertical axis is the proportion of cases where the judges granted parole. The horizontal axis shows the order in which the cases were heard during the day. The dotted lines represent the points where the judges went away for a morning snack and their lunch break.
The graph is dramatic. It shows that the odds that prisoners will be successfully paroled start off fairly high at around 65% and quickly plummet to nothing over a few hours (although, see footnote). After the judges have returned from their breaks, the odds abruptly climb back up to 65%, before resuming their downward slide. A prisoner’s fate could hinge upon the point in the day when their case is heard.
UPDATE: Diederik Stapel, who led this study, has been accused of fabricating data and has been suspended from his post. It is not clear which of his papers are at stake, but until further details emerge, it would probably be best to take this paper and post with a pinch of salt.
UPDATE 2: This paper has now been officially retracted. As has this post.
In February 2010, cleaners working at Dutch railway stations went on strike for several weeks. Their stations quickly fell into dirtiness and disarray, but most people didn’t mind; public support for the strike was high. But two scientists – Diederik Stapel and Siegwart Lindenberg from Tilburg University – were particularly delighted. In the growing chaos of the stations, they saw an opportunity to test an intriguing concept – that disorderly environments promote stereotypes and discrimination. Their big idea is that stereotypes, being a set of simplified categories and judgements, can help people to cope with chaos. They are “a mental cleaning device in the face of disorder”. When our surroundings are full of chaos – be it dirt or uncertainty – we react by seeking order, structure and predictability. Stereotypes, for all their problems, satisfy that need.
To test that, the duo went to Utrecht station after it hadn’t been cleaned for a few days and asked 40 travellers to fill in a questionnaire. Their task was to say how much Dutch, Muslim and homosexual people conform to different personality traits. When the cleaners returned to work, and the station had reverted to its usual spick-and-span self, Stapel and Lindenberg repeated their experiment.
“I am on a drug. It’s called Charlie Sheen. It’s not available because if you try it, you will die. Your face will melt off and your children will weep over your exploded body.” – Charlie Sheen
“We put our fingers in the eyes of those who doubt that Libya is ruled by anyone other than its people.” – Muammar Gaddafi
You don’t have to look far for instances of people lying to themselves. Whether it’s a drug-addled actor or an almost-toppled dictator, some people seem to have an endless capacity for rationalising what they did, no matter how questionable. We might imagine that these people really know that they’re deceiving themselves, and that their words are mere bravado. But Zoe Chance from Harvard Business School thinks otherwise.
Using experiments where people could cheat on a test, Chance has found that cheaters not only deceive themselves, but are largely oblivious to their own lies. Their ruse is so potent that they’ll continue to overestimate their abilities in the future, even if they suffer for it. Cheaters continue to prosper in their own heads, even if they fail in reality.
Chance asked 76 students to take a maths test, half of whom could see an answer key at the bottom of their sheets. Afterwards, they had to predict their scores on a second, longer test. Even though they knew that they wouldn’t be able to see the answers this time round, they imagined higher scores for themselves (81%) if they had the answers on the first test than if they hadn’t (72%). They might have deliberately cheated, or they might have told themselves that they were only looking to “check” the answers they knew all along. Either way, they had fooled themselves into thinking that their strong performance reflected their own intellect, rather than the presence of the answers.
And they were wrong – when Chance asked her recruits to actually take the hypothetical second test, neither group outperformed the other. Those who had used the answers the first time round were labouring under an inflated view of their abilities.
Even the simplest of actions can arise from hidden competition. Imagine reaching for a cup. There are many ways of doing this, depending on where the cup is, whether you’re right- or left-handed, whether the handle is turned towards you, and so on. Your brain works through all of these possibilities at the same time, and they compete against one another until one wins out. Only after this neural battle does your arm start to move.
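One common way to caricature this kind of parallel competition is a "race" model, in which each candidate plan accumulates noisy evidence until one crosses a threshold and wins. The sketch below is purely illustrative: the plan names, drift rates and threshold are invented for the example, not drawn from any neural data:

```python
import random

random.seed(1)

# Toy "race" model: candidate reach plans accumulate evidence in parallel;
# the first to cross a threshold wins and drives the movement.
# All numbers here are illustrative, not fitted to real data.
plans = {"overhand grasp": 1.0, "underhand grasp": 0.8, "slide cup closer": 0.6}
THRESHOLD = 10.0

activations = {name: 0.0 for name in plans}
winner = None
while winner is None:
    for name, drift in plans.items():
        activations[name] += drift + random.gauss(0, 0.5)  # noisy evidence
        if activations[name] >= THRESHOLD:
            winner = name
            break

print(f"Selected action: {winner}")
```

The point of the caricature is that all the plans are active and racing at once; the noise means the strongest plan usually wins, but not always, and movement begins only once the race has been decided.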