This post was originally published at The American Scholar.
Gestures are simple enough. Right? A spontaneous but well-timed wave can emphasize an idea, brush aside a compliment, or point out a barely obscured bird’s nest to an obtuse friend. We use gestures to help our listeners follow along, and we make ourselves look warm and magnanimous in the process.
But on the other hand—and when you’re talking about hands, the puns come furiously—sometimes gestures seem to have nothing to do with edifying or impressing. We gesture on the phone, and in the dark, and when we talk to ourselves. Blind people gesture to other blind people. A growing body of research suggests that we gesture in part for the cognitive benefits that it affords us. Tell us not to use the letter r, or challenge us to adopt a more obscure vocabulary, and our gesture use jumps.
Some people call left-handers southpaws. Others call them mollydookers or corky dobbers. Scientists still often call lefties sinister, which in Latin originally just meant “left” but later came to be associated with evil.
Wondering about the medical implications of being born a corky dobber? It may surprise you that left-handed women have been found to be at least twice as likely as right-handers to develop premenopausal breast cancer. A few researchers believe this effect may be linked to exposure to certain chemicals in utero, which could affect gene expression and set the stage for both left-handedness and cancer susceptibility, opening up another possibility of nurture changing nature.
When it comes to our hands, feet, and even our eyes, most human beings are right-side dominant. Now, you might think that footedness and handedness always line up, but it turns out they don’t for every right-handed person, and the mismatch is even more common among left-handed people. Lots of people aren’t congruent.
In board sports, being left-foot dominant is termed goofy – a goofy-footed surfer stands with her left foot on the back of the board instead of her right. There are an amazing number of theories as to why some of us are goofy-footed. But the term itself is often said to have originated with an eight-minute-long Walt Disney animated short, called Hawaiian Holiday, that was first released to theaters in 1937. The color cartoon stars the usual suspects: Mickey and Minnie, Pluto and Donald, and, of course, Goofy. During the gang’s vacation in Hawaii, Goofy attempts to surf, and when he finally catches a wave and heads back to shore atop its short-lived crest, he’s standing with his right foot forward and his left foot back.
If you’re wondering if you might be goofy and would like to find out before hitting the beach, then imagine yourself at the bottom of a staircase that you’re about to ascend. Which foot moves first? If you’re taking that first imaginary step with your left foot, then it’s likely that you’re a member of the goofy-footed club. And if you find out that you aren’t goofy, then you’re in the majority.
A version of this article originally appeared at The Conversation.
There could be a way of predicting which children will go on to have low intelligence – and preventing it – according to the findings of a study that researchers at Cardiff University presented on Monday. They discovered that children who carry two copies of a common gene variant (Thr92Ala), together with low levels of thyroid hormone, are four times more likely to have a low IQ. This combination occurs in about 4% of the UK population.
Importantly, if you had just one of these factors, but not both, there did not appear to be an increased risk of low intelligence. These are early results, but suggest that it might be possible to treat children early with thyroid hormone supplementation to enhance their intelligence. This raises many ethical issues.
A common objection is that being smarter does not make your life better. In this study, the researchers were concerned with those with an IQ between 70 and 85. An IQ below 70 is classified as intellectual disability, but an IQ of 70 to 75 is similar to mild intellectual disability.
Even for individuals with an IQ between 75 and 90 there are still significant disadvantages. Job opportunities tend to be the least desirable and least financially rewarding, requiring significant oversight. More than half the people with this IQ level fail to reach the minimum recruitment standards for the US military. Individuals with this lower level of intelligence are at significant risk of living in poverty (16%), being a chronic welfare dependent (17%) and dropping out of school (35%) compared to individuals with average intelligence. Studies show that they also face an increased risk of incarceration and being murdered.
Linda Gottfredson, who’s undertaken much of this research, concludes that at the very least, “an IQ of 75 is perhaps the most important threshold in modern life”. So it is clear that those of low-normal intelligence, although not classified as disabled, are significantly disadvantaged.
If we could enhance their intelligence, say with thyroid hormone supplementation, we should.
This article was originally published on The Conversation.
Most office workers send dozens of electronic communications to colleagues in any given working day, through email, instant messaging and intranet systems. So many, in fact, that you might not notice subtle changes in the language your fellow employees use.
Instead of ending their email with “See ya!”, they might suddenly offer you “Kind regards.” Instead of talking about “us,” they might refer to themselves more. Would you pick up on it if they did?
These changes are important and could hint at a disgruntled employee about to go rogue. Our findings demonstrate how language may provide an indirect way of identifying employees who are undertaking an insider attack.
My team has tested whether it’s possible to detect insider threats within a company just by looking at how employees communicate with each other. If a person is planning to act maliciously to damage their employer or sneak out commercially sensitive material, the way they interact with their co-workers changes.
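One of the linguistic shifts described above is a move from group-focused language (“we,” “us”) toward self-focused language (“I,” “me”). As a toy illustration only – not the researchers’ actual method, and using entirely made-up messages – here is a minimal sketch of how such a shift could be quantified by comparing pronoun use before and after some point in time:

```python
# Toy sketch: measure how self-focused a set of messages is by the share of
# first-person pronouns that are singular ("I", "me") rather than plural
# ("we", "us"). A rising ratio over time is the kind of subtle linguistic
# shift the research describes. Word lists and messages are illustrative.
from collections import Counter
import re

SELF_WORDS = {"i", "me", "my", "mine", "myself"}
GROUP_WORDS = {"we", "us", "our", "ours", "ourselves"}

def self_focus_ratio(messages):
    """Fraction of first-person pronoun uses that are singular (0.0 to 1.0)."""
    words = [w for msg in messages for w in re.findall(r"[a-z']+", msg.lower())]
    counts = Counter(words)
    self_n = sum(counts[w] for w in SELF_WORDS)
    group_n = sum(counts[w] for w in GROUP_WORDS)
    total = self_n + group_n
    return self_n / total if total else 0.0

# Hypothetical messages from the same (fictional) employee, earlier vs. later.
before = ["We should sync on our roadmap.", "Can we share our notes?"]
after = ["I need my access restored.", "Send me my files."]

print(self_focus_ratio(before))  # 0.0 -- all group pronouns
print(self_focus_ratio(after))   # 1.0 -- all self pronouns
```

A real system would of course track many more signals (sign-off phrases, sentiment, message timing) and compare each employee against their own baseline rather than a fixed threshold.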
By Samantha Joel, University of Toronto
People tend to see their own lifestyle as being the ideal lifestyle. A single person may question why anyone would choose to shackle themselves to one partner rather than live it up with the single life. Then there is that smug married couple who pushes for other couples to also tie the knot, so they can similarly bask in wedded bliss.
This phenomenon is called “normative idealization”, which is the tendency to idealize one’s own lifestyle and believe others would benefit from it too.
Where does such insufferable behavior come from? It has been suggested that people might idealize their own relationship status not because they are actually confident that it is ideal, but rather because they are trying to feel better about their own lives.
Psychologists at Stanford University and the University of Waterloo tested whether people were more judgmental of others’ lifestyles when they felt threatened regarding their own. Their results are published in the journal Psychological Science.
On July 11th 1998, my life was ominously transformed by an encounter with the once-familiar subjects of my research. Having been hired by the University of Wyoming a decade earlier to study the ecology and management of rangeland grasshoppers, I thought that I pretty much knew these insects.
I had spent that fateful morning gathering data from research plots. A week earlier, my field crew reported that to the north, where deep draws were etched into the prairie, the grasshoppers were reaching biblical proportions. I decided to see for myself.
The earthen banks rose above my head as I descended into the gulch, where the insects had massed into a bristling carpet of wings and legs. My arrival incited pandemonium. Grasshoppers ricocheted off my face, tangled their spiny legs into my hair, and began to crawl into the gaps between shirt buttons.
By Wind Goodfriend
This article originally appeared on Dr. Goodfriend’s blog “A Psychologist at the Movies.”
I’m completely obsessed with The Hunger Games. I’m not sure why. Maybe it’s because I have visited North Korea, a real country where millions of people really are dying of hunger. Maybe it’s the ironic meta-experience of watching the movie’s violence on a huge screen, when the movie’s point is that people shouldn’t watch violence on a huge screen. Regardless, The Hunger Games is chock-full of possible psychological analysis. Today I’m focusing on the fascinatingly weird emotions that spark between The Hunger Games’ two main protagonists, Peeta and Katniss.
At home, Katniss has a boyfriend, a young man named Gale. He has rugged good looks, he’s brave, and they are perfectly matched in many ways. Both Katniss and Gale fight against the system in their own way (which is increasingly seen as the trilogy continues), and he is always successful at making Katniss feel comforted in a world with no comforts.
So why does Katniss later fall for Peeta? Peeta certainly has lovable qualities – he’s smart, nurturing, and can frost a cake like nobody’s business – but he and Katniss are not exactly a natural pair. Their personalities clash, their goals in life are different, and Katniss really isn’t interested in any kind of frivolous romance. Sure, in the first movie she is ambivalent about her feelings for Peeta, the kind-hearted boy with a sexy baby-faced look. But psychology would have predicted their blossoming feelings for each other due to their experiences together in the Hunger Games. It’s all because of a phenomenon called misattribution of arousal.
Most of what I know isn’t in my head. It’s out there in my books. I know how to do a lot of integrals in calculus, for example. But, really, what I mean by that is that I know where my book of integrals is, and I know where in the book any particular method is. I know all that stuff in all those books in my house because I can find my way there.
Books in a bookshelf possess lots of visual cues, so I can quickly find my way to the right book — “Oh, it’s on the bottom left of the shelf by the window in the living room, just below that big blue art book.”
And once I find the book, when I open it up I can use visual cues within it to find my way to the right page. After all, it’s not as if I remember the page number. No, I remember roughly where it is in the book, roughly what the page looks like, and roughly what the surrounding pages might look like. Pages in a book might not initially seem to have a look, but they very often do. There are often figures, or tables, or unique and recognizable features to the way the paragraphs are aligned. These visuo-spatial cues guide me further and further along to the goal, the piece of my knowledge out there in my library.
Mess with my library and books, and you mess with my brain.
By Ben Thomas
The first rat pressed a lever, anticipating the tasty reward it’d been trained to expect. An implant in the rat’s brain converted its neural activity into an electronic signal and beamed the impulse to the brain of the second rat, which leaped forward and pressed a lever in its own cage. But rat #2 had never been trained to press the lever. Its movement impulse came not from its own brain, but directly from the brain of rat #1 – despite the fact that the two were separated by thousands of miles.
What we have created, said lead researcher Miguel Nicolelis, is “a new central nervous system made of two brains.”
That advance happened in 2012, and other labs were quick to one-up Nicolelis and his team. In the summer of 2013, a team of Harvard University researchers engineered a brain-to-brain interface between a rat and a human, enabling the human to control the rat’s tail movements simply by willing them to happen.
Finally, in August 2013, University of Washington scientists Rajesh Rao and Andrea Stocco made the leap everyone was waiting for: a human-to-human brain-to-brain interface. By strapping one person into a non-invasive EEG helmet, and the second into a transcranial magnetic stimulation (TMS) helmet, the researchers mind-melded themselves – for the sake of science.
By Matthew D. Lieberman
Comedian Jerry Seinfeld used to tell the following joke: “According to most studies, people’s number one fear is public speaking. Death is number two. Does this sound right? This means to the average person, if you go to a funeral, you’re better off in the casket than doing the eulogy.”
The joke is a riff based on a privately conducted survey of 2,500 people in 1973 in which 41 percent of respondents indicated that they feared public speaking and only 19 percent indicated that they feared death. While this improbable ordering has not been replicated in most other surveys, public speaking is typically high on the list of our deepest fears. “Top ten” lists of our fears usually fall into three categories: things associated with great physical harm or death, the death or loss of loved ones, and speaking in public.
What is curious is that the person speaking probably doesn’t know or care about most of the people there. So why does it matter so much what they think? The answer is that it hurts to be rejected.
Ask yourself what have been the one or two most painful experiences of your life. Did you think of the physical pain of a broken leg or a really bad fall? My guess is that at least one of your most painful experiences involved what we might call social pain—pain of a loved one’s dying, of being dumped by someone you loved, or of experiencing some kind of public humiliation in front of others.
Why do we associate such events with the word pain? When human beings experience threats or damage to their social bonds, the brain responds in much the same way it responds to physical pain.