This article was originally published on The Conversation.
Popular wisdom and established evolutionary science hold that the sexes seek fundamentally different relationships: men want short-term, no-strings-attached relationships, whereas women value longer-term, loyal partnerships.
The explanation generally comes down to biological differences between men and women. Because women invest more in reproduction than men do – think pregnancy, morning sickness and stretch marks – being picky matters: choosing poorly can be costly, even devastating. For men, however, reproduction may entail nothing more than a brief sexual liaison and a bit of sperm, with potentially no long-term costs. This calculus has been built into our psychology, many argue.
Think about it more carefully, though. Where do all the women sleeping with these guys come from? Shouldn’t it be difficult for men to find so many willing partners? As theorist Hanna Kokko noted, it takes two to tango.
If we go by the numbers, in a group with equal numbers of men and women, it is impossible, on average, for men to have more partners than women: every heterosexual partnership adds exactly one partner to each sex's tally, so the totals – and therefore the averages – must be equal. So why do we expect male psychology to be so hellbent on one-night stands? And why, clearly in opposition to this notion, are many men often so devotedly paternal?
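The counting argument can be made concrete with a toy simulation (the group sizes and number of partnerships here are made up purely for illustration): if each heterosexual partnership is recorded as a pair linking one man and one woman, then however the partnerships are distributed, both sexes' partner totals are the same, so with equal group sizes the averages are forced to match.

```python
import random
from collections import Counter

# Toy model: each heterosexual partnership is one (man, woman) pair.
# Individuals are just indices; all numbers are hypothetical.
random.seed(0)
n_men = n_women = 1000
partnerships = [(random.randrange(n_men), random.randrange(n_women))
                for _ in range(5000)]

# Per-individual partner counts for each sex.
men = Counter(m for m, w in partnerships)
women = Counter(w for m, w in partnerships)

# Each pair contributes exactly once to each sex's total,
# so both totals are 5000 and both averages are 5000/1000.
mean_men = sum(men.values()) / n_men
mean_women = sum(women.values()) / n_women
print(mean_men, mean_women)  # -> 5.0 5.0
```

However the simulated partnerships are skewed across individuals, the two means can never diverge; only an unequal sex ratio (or partners from outside the group) could make them differ.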
Here’s where an established body of literature in sociology and demography – called mating market theory (MMT) – can help. According to MMT, relationship preferences do not follow simply from fixed biological propensities; they are also heavily influenced by partner availability.
When considering extreme environments, it is easy to make assumptions about personality that do not stand up to scrutiny on closer examination. Take, for example, one of the best-researched personality dimensions: introversion-extraversion. Extraversion as a trait appears in all established psychological models of personality, and there is considerable evidence that it has a biological basis. The concepts of introversion and extraversion long ago escaped the confines of academic psychology and are widely used in everyday conversation, albeit in ways that do not always reflect the psychological definitions.
Broadly speaking, individuals who score highly on measures of extraversion tend to seek stimulation, whereas those who score low tend to avoid it. When asked to describe a typical extravert, most people tend to think of the lively ‘party animal,’ equating extraversion with a preference for social interactions. However, individuals who score highly for extraversion seek more than just social stimulation: they also tend to gravitate toward other stimulating situations, including active leisure and work pursuits, travel, sex, and even celebrity. Introverts, on the other hand, have a generally lower affinity for stimulation.
They find too much stimulation, of whatever type, draining rather than energizing. Contrary to popular belief, introverts are not necessarily shy or fearful about social situations, unless they also score highly on measures of social anxiety and neuroticism.
On this basis, one might assume that extraverts would be drawn to extreme environments, where they could satisfy their desire for stimulating situations, whereas introverts would find them unattractive. And yet, extreme environments may also expose people to monotony and solitude — experiences that extraverts would find aversive, but which are tolerated or even enjoyed by well-balanced introverts. The point here is that simple assumptions about broad personality traits are unlikely to provide good explanations of why people engage in extreme activities.
This article was originally published on The Conversation.
International airports are busy places. Nearly 140,000 passengers pass through New York’s JFK Airport every day, and national security depends in part on effective airport checks.
All departing passengers pass through a series of security procedures before boarding their plane. One such procedure is a short, scripted interview in which security personnel must make decisions about passenger risk by looking for behavioral indicators of deception.
These are referred to as “suspicious signs”: signs of nervousness, aggression and an unusual interest in security procedures, for example. However, this approach has never been empirically validated and its continued use is criticized for being based on outdated, unreliable perceptions of how people behave when being deceptive.
Despite these concerns, the suspicious signs approach continues to dominate security screening: the US government spends $200 million yearly on behavior-detection officers, who are tasked with spotting suspicious signs. This is a waste of money.
This article was originally published on The Conversation.
We’re getting more stupid. That’s one point made in a recent article in the New Scientist, reporting on a gradual decline in IQs in developed countries such as the UK, Australia and the Netherlands. Such research feeds into a long-held fascination with testing human intelligence. Yet such debates are too focused on IQ as a lifelong trait that can’t be changed. Other research is beginning to show the opposite.
The concept of testing intelligence was first successfully devised by French psychologists in the early 1900s to help describe differences in how well and quickly children learn at school. But it is now frequently used to explain that difference – that we all have a fixed and inherent level of intelligence that limits how fast we can learn.
Defined loosely, intelligence refers to our ability to learn quickly and adapt to new situations. IQ tests measure our vocabulary, our ability to problem-solve, reason logically and so on.
But what many people fail to understand is that if IQ tests measured only our skills at these particular tasks, no one would be interested in our score. The score is interesting only because it is thought to be fixed for life.
At the university where I teach, fewer and fewer new books are available from the library in their physical, printed form. And yet, the company that just published my textbook tells me that about 90 percent of students who buy my book choose to lug around the four-pound paper version rather than purchase the weightless e-book.
The information is exactly the same, so why would students opt for the pricier and more cumbersome version? Is the library missing something important about the nature of printed versus electronic books?
Some studies do show that information becomes more securely fixed in people’s minds when they read it from paper than when they read it from the screen (as summarized in this recent blog post). Findings like these may resonate with our subjective experience of reading, and yet still seem puzzling at an intellectual level. This is because we’re used to thinking about reading—or information processing more generally—as the metaphorical equivalent of consuming food. We talk about devouring novels, digesting a report, and absorbing information. If we’re ingesting the same material, whether it’s presented in print or electronically, how can the results be so different?
We tend to think of medicine as being all about pills and potions recommended to us by another person—a doctor. But science is starting to reveal that for many conditions another ingredient could be critical to the success of these drugs, or perhaps even replace them. That ingredient is nothing more than your own mind.
Here are six ways to raid your built-in medicine cabinet.
“I talk to my pills,” says Dan Moerman, an anthropologist at the University of Michigan-Dearborn. “I say, ‘Hey guys, I know you’re going to do a terrific job.’”
That might sound eccentric, but based on what we’ve learned about the placebo effect, there is good reason to think that talking to your pills really can make them do a terrific job. The way we think and feel about medical treatments can dramatically influence how our bodies respond.
Simply believing that a treatment will work may trigger the desired effect even if the treatment is inert—a sugar pill, say, or a saline injection. For a wide range of conditions, from depression to Parkinson’s, osteoarthritis and multiple sclerosis, it is clear that the placebo response is far from imaginary. Trials have shown measurable changes such as the release of natural painkillers, altered neuronal firing patterns, lowered blood pressure or heart rate and boosted immune response, all depending on the beliefs of the patient.
It has always been assumed that the placebo effect only works if people are conned into believing that they are getting an actual active drug. But now it seems this may not be true. Belief in the placebo effect itself—rather than a particular drug—might be enough to encourage our bodies to heal.
In the South Pacific there is a place so remote that few people have ever heard of it, let alone seen it: the Trobriand Islands. The Trobriands are located off the east coast of Papua New Guinea, and no white man had set foot there until the late 1700s. During World War I, however, the islands were visited by a man who would one day become a legend in the field of anthropology, Bronislaw Malinowski. Malinowski was a stork of a man—thin, pale, and balding—often seen wearing a pith helmet and socks up to his knees. He had terrible eyesight, was a hypochondriac, an insomniac, and on top of it all had a strong fear of the tropics—in particular, an abhorrence of the heat and the sultriness; to cope, he gave himself injections of arsenic.
Malinowski was, nonetheless, a keen observer of humankind. And as he watched the Trobriand Islanders go about their lives, he noticed something odd. When the islanders went fishing their behavior changed, depending on where they fished. When they fished close to shore—where the waters were calm, the fishing was consistent, and the risk of disaster was low—superstitious behavior among them was nearly nonexistent.
But when the fishermen sailed for open seas—where they were far more vulnerable and their prospects far less certain—their behavior shifted. They became very superstitious, often engaging in elaborate rituals to ensure success. In other words, a low sense of control had produced a high need for superstition. One, in effect, substituted for the other.
This post was originally published at The American Scholar.
Gestures are simple enough. Right? A spontaneous but well-timed wave can emphasize an idea, brush aside a compliment, or point out a barely obscured bird’s nest to an obtuse friend. We use gestures to help our listeners follow along, and we make ourselves look warm and magnanimous in the process.
But on the other hand—and when you’re talking about hands, the puns come fast and furious—sometimes gestures seem to have nothing to do with edifying or impressing. We gesture on the phone, and in the dark, and when we talk to ourselves. Blind people gesture to other blind people. A growing body of research suggests that we gesture in part for the cognitive benefits it affords us. Tell us not to use the letter r, or challenge us to adopt a more obscure vocabulary, and our gesture use jumps.
Some people call left-handers southpaws. Others call them mollydookers or corky dobbers. Scientists still often call lefties sinister, which in Latin originally just meant “left” but later came to be associated with evil.
Wondering about the medical implications of being born a corky dobber? It may surprise you that left-handed women have been found to be at least twice as likely as right-handers to develop premenopausal breast cancer. A few researchers believe this effect may be linked to exposure to certain chemicals in utero, which affect your genes and set the stage for both left-handedness and cancer susceptibility, thus opening up another possibility of nurture changing nature.
When it comes to our hands, feet, and even our eyes, most human beings are right-side dominant. Now, you might think that footedness and handedness are always aligned, but as it turns out that’s not always the case for right-handed people, and the alignment is even less common among left-handed people. Lots of people simply aren’t congruent.
In board sports, being left-foot dominant is termed goofy – a goofy-footed surfer stands with her left foot on the back of the board instead of her right. There are an amazing number of theories as to why some of us are goofy-footed. But the term itself is often said to have originated with an eight-minute-long Walt Disney animated short, called Hawaiian Holiday, that was first released to theaters in 1937. The color cartoon stars the usual suspects: Mickey and Minnie, Pluto and Donald, and, of course, Goofy. During the gang’s vacation in Hawaii, Goofy attempts to surf, and when he finally catches a wave and heads back to shore atop its short-lived crest, he’s standing with his right foot forward and his left foot back.
If you’re wondering if you might be goofy and would like to find out before hitting the beach, then imagine yourself at the bottom of a staircase that you’re about to ascend. Which foot moves first? If you’re taking that first imaginary step with your left foot, then it’s likely that you’re a member of the goofy-footed club. And if you find out that you aren’t goofy, then you’re in the majority.
A version of this article originally appeared at The Conversation.
There could be a way of predicting which children will go on to have low intelligence – and preventing it – according to the findings of a study that researchers at Cardiff University presented on Monday. They discovered that children who carry two copies of a common gene variant (Thr92Ala) and who also have low levels of thyroid hormone are four times more likely to have a low IQ. This combination occurs in about 4% of the UK population.
Importantly, children with just one of these factors, but not both, did not appear to be at increased risk of low intelligence. These are early results, but they suggest that it might be possible to treat children early with thyroid hormone supplementation to enhance their intelligence. This raises many ethical issues.
A common objection is that being smarter does not make your life better. In this study, the researchers were concerned with those with an IQ between 70 and 85. An IQ below 70 is classified as intellectual disability, and an IQ of 70 to 75 is comparable to mild intellectual disability.
Even for individuals with an IQ between 75 and 90, there are still significant disadvantages. Their job opportunities tend to be the least desirable and least financially rewarding, often requiring significant oversight. More than half the people with this IQ level fail to reach the minimum recruitment standards for the US military. Individuals with this lower level of intelligence are at significant risk of living in poverty (16%), being a chronic welfare dependent (17%) and dropping out of school (35%) compared to individuals with average intelligence. Studies show that they also face an increased risk of incarceration and being murdered.
Linda Gottfredson, who’s undertaken much of this research, concludes that at the very least, “an IQ of 75 is perhaps the most important threshold in modern life”. So it is clear that those of low-normal intelligence, although not classified as disabled, are significantly disadvantaged.
If we could enhance their intelligence, say with thyroid hormone supplementation, we should.