We tend to think of medicine as being all about pills and potions recommended to us by another person—a doctor. But science is starting to reveal that for many conditions another ingredient could be critical to the success of these drugs, or perhaps even replace them. That ingredient is nothing more than your own mind.
Here are six ways to raid your built-in medicine cabinet.
“I talk to my pills,” says Dan Moerman, an anthropologist at the University of Michigan-Dearborn. “I say, ‘Hey guys, I know you’re going to do a terrific job.’”
That might sound eccentric, but based on what we’ve learned about the placebo effect, there is good reason to think that talking to your pills really can make them do a terrific job. The way we think and feel about medical treatments can dramatically influence how our bodies respond.
Simply believing that a treatment will work may trigger the desired effect even if the treatment is inert—a sugar pill, say, or a saline injection. For a wide range of conditions, from depression and Parkinson’s disease to osteoarthritis and multiple sclerosis, it is clear that the placebo response is far from imaginary. Trials have shown measurable changes such as the release of natural painkillers, altered neuronal firing patterns, lowered blood pressure or heart rate, and a boosted immune response, all depending on the beliefs of the patient.
It has always been assumed that the placebo effect only works if people are conned into believing that they are getting an actual active drug. But now it seems this may not be true. Belief in the placebo effect itself—rather than a particular drug—might be enough to encourage our bodies to heal.
In the South Pacific there is a place so remote that few people have ever heard of it, let alone seen it: the Trobriand Islands. The Trobriands are located off the east coast of Papua New Guinea, and no white man had set foot there until the late 1700s. During World War I, however, the islands were visited by a man who would one day become a legend in the field of anthropology, Bronislaw Malinowski. Malinowski was a stork of a man—thin, pale, and balding—often seen wearing a pith helmet and socks up to his knees. He had terrible eyesight, was a hypochondriac and an insomniac, and on top of it all had a strong fear of the tropics—in particular, an abhorrence of the heat and the sultriness. To cope, he gave himself injections of arsenic.
Malinowski was, nonetheless, a keen observer of humankind. And as he watched the Trobriand Islanders go about their lives, he noticed something odd. When the islanders went fishing, their behavior changed depending on where they fished. When they fished close to shore—where the waters were calm, the fishing was consistent, and the risk of disaster was low—superstitious behavior among them was nearly nonexistent.
But when the fishermen sailed for open seas—where they were far more vulnerable and their prospects far less certain—their behavior shifted. They became very superstitious, often engaging in elaborate rituals to ensure success. In other words, a low sense of control had produced a high need for superstition. One, in effect, substituted for the other.
This post was originally published at The American Scholar.
Gestures are simple enough. Right? A spontaneous but well-timed wave can emphasize an idea, brush aside a compliment, or point out a barely obscured bird’s nest to an obtuse friend. We use gestures to help our listeners follow along, and we make ourselves look warm and magnanimous in the process.
But on the other hand—and when you’re talking about hands, the puns come furiously—sometimes gestures seem to have nothing to do with edifying or impressing. We gesture on the phone, and in the dark, and when we talk to ourselves. Blind people gesture to other blind people. A growing body of research suggests that we gesture in part for the cognitive benefits that it affords us. Tell us not to use the letter r, or challenge us to adopt a more obscure vocabulary, and our gesture use jumps.
This article was originally published on The Conversation.
Most office workers send dozens of electronic communications to colleagues in any given working day, through email, instant messaging and intranet systems. So many, in fact, that you might not notice subtle changes in the language your fellow employees use.
Instead of ending their email with “See ya!”, they might suddenly offer you “Kind regards.” Instead of talking about “us,” they might refer to themselves more. Would you pick up on it if they did?
These changes are important and could hint at a disgruntled employee about to go rogue. Our findings demonstrate how language may provide an indirect way of identifying employees who are undertaking an insider attack.
My team has tested whether it’s possible to detect insider threats within a company just by looking at how employees communicate with each other. If a person is planning to act maliciously to damage their employer or sneak out commercially sensitive material, the way they interact with their co-workers changes.
By Samantha Joel, University of Toronto
People tend to see their own lifestyle as being the ideal lifestyle. A single person may question why anyone would choose to shackle themselves to one partner rather than live it up with the single life. Then there is that smug married couple who pushes for other couples to also tie the knot, so they can similarly bask in wedded bliss.
This phenomenon is called “normative idealization”: the tendency to idealize one’s own lifestyle and to believe that others would benefit from it too.
Where does such insufferable behavior come from? It has been suggested that people might idealize their own relationship status not because they are actually confident that it is ideal, but rather because they are trying to feel better about their own lives.
Psychologists at Stanford University and the University of Waterloo tested whether people were more judgmental of others’ lifestyles when they felt threatened regarding their own. Their results are published in the journal Psychological Science.
By Wind Goodfriend
This article originally appeared on Dr. Goodfriend’s blog “A Psychologist at the Movies.”
I’m completely obsessed with The Hunger Games. I’m not sure why. Maybe it’s because I have visited North Korea, a real country where millions of people really are dying of hunger. Maybe it’s the ironic meta-experience of watching the movie’s violence on a huge screen, when the movie’s point is that people shouldn’t watch violence on a huge screen. Regardless, The Hunger Games is chock-full of possible psychological analysis. Today I’m focusing on the fascinatingly weird emotions that spark between The Hunger Games’ two main protagonists, Peeta and Katniss.
At home, Katniss has a boyfriend, a young man named Gale. He has rugged good looks, he’s brave, and they are perfectly matched in many ways. Both Katniss and Gale fight against the system in their own way (something that becomes increasingly apparent as the trilogy continues), and he always manages to make Katniss feel comforted in a world with no comforts.
So why does Katniss later fall for Peeta? Peeta certainly has lovable qualities – he’s smart, nurturing, and can frost a cake like nobody’s business – but he and Katniss are not exactly a natural pair. Their personalities clash, their goals in life are different, and Katniss really isn’t interested in any kind of frivolous romance. Sure, in the first movie she is ambivalent about her feelings for Peeta, the kind-hearted boy with a sexy baby-faced look. But psychology would have predicted their blossoming feelings for each other due to their experiences together in the Hunger Games. It’s all because of a phenomenon called misattribution of arousal.
By Matthew D. Lieberman
Comedian Jerry Seinfeld used to tell the following joke: “According to most studies, people’s number one fear is public speaking. Death is number two. Does this sound right? This means to the average person, if you go to a funeral, you’re better off in the casket than doing the eulogy.”
The joke is a riff based on a privately conducted survey of 2,500 people in 1973 in which 41 percent of respondents indicated that they feared public speaking and only 19 percent indicated that they feared death. While this improbable ordering has not been replicated in most other surveys, public speaking is typically high on the list of our deepest fears. “Top ten” lists of our fears usually fall into three categories: things associated with great physical harm or death, the death or loss of loved ones, and speaking in public.
What is curious is that the person speaking probably doesn’t know or care about most of the people there. So why does it matter so much what they think? The answer is that it hurts to be rejected.
Ask yourself: what have been the one or two most painful experiences of your life? Did you think of the physical pain of a broken leg or a really bad fall? My guess is that at least one of your most painful experiences involved what we might call social pain—the pain of a loved one’s death, of being dumped by someone you loved, or of some kind of public humiliation in front of others.
Why do we associate such events with the word pain? When human beings experience threats or damage to their social bonds, the brain responds in much the same way it responds to physical pain.
By Gina Perry
It’s one of the best-known psychology experiments in history – the 1961 tests in which social psychologist Stanley Milgram invited volunteers to take part in a study about memory and learning. Its actual aim, though, was to investigate obedience to authority – and Milgram reported that fully 65 percent of volunteers had gone on administering increasingly powerful electric shocks, up to the maximum voltage, to a man they believed to be in severe pain.
In the decades since, the results have been held up as proof of the depths of ordinary people’s depravity in service to an authority figure. At the time, this had deep and resonant connections to the Holocaust and Nazi Germany – so resonant, in fact, that they might have led Milgram to dramatically misrepresent his hallmark findings.
Stanley Milgram framed his research from the get-go as both inspired by and an explanation of Nazi behavior. He mentioned the gas chambers in the opening paragraph of his first published article, in 1963; he strengthened the link and made it more explicit eleven years later in his book Obedience to Authority.
At the time Milgram’s research was first published, the trial of the high-profile Nazi Adolf Eichmann was still fresh in the public mind. Eichmann had been captured in Buenos Aires and smuggled out of the country to stand trial in Israel. The trial was the first of its kind to be televised.
Andrew Grant is an associate editor at DISCOVER. His latest feature, “William Borucki: Planet Hunter,” appears in the December issue of the magazine.
Last night Major League Baseball announced the winners of the Cy Young Award, given to the year’s best pitchers in the American and National leagues. The National League victor was New York Mets pitcher R.A. Dickey. That he won the award is remarkable, and not just because he is a relatively ancient 38 years old or because he plays for the perennial punch line Mets. Dickey is the first Cy Young winner whose repertoire consists primarily of the knuckleball, a baffling pitch whose intricacies scientists are only now beginning to understand.
Most pitchers, including the other Cy Young finalists, try to overwhelm hitters with a combination of speed and movement. They throw the ball hard—the average major league fastball zooms in at around 91 miles per hour—and generate spin (up to 50 rotations a second) that makes the ball break, or deviate from a straight-line trajectory. Dickey does neither of those things. Rather than cock his arm back and fire, he pushes the ball like a dart so that it floats toward the plate between 55 and 80 mph. The ball barely spins at all—perhaps a quarter- or half-turn before reaching the hitter.
Keith Kloor is a freelance journalist whose stories have appeared in a range of publications, from Science to Smithsonian. Since 2004, he’s been an adjunct professor of journalism at New York University. You can find him on Twitter @KeithKloor.
Last month, a group of Massachusetts residents filed an official complaint claiming that the wind turbine in their town is making them sick. According to the article in the Patriot Ledger, the residents “said they’ve lost sleep and suffered headaches, dizziness and nausea as a result of the turbine’s noise and shadow flicker [flashing caused by shadows from moving turbine blades].” A few weeks later, a story from Wisconsin highlighted similar complaints of health problems associated with wind turbines there.
Anecdotal claims like these are on the rise and not just in the United States. A recent story in the UK’s Daily Mail catalogs a litany of health ailments supposedly caused by wind turbines—everything from memory loss and dizziness to tinnitus and depression.
Will these claims keep spreading? I expect so. For one thing, the alleged health problem has been adopted by demagogues and parroted on popular climate-skeptic websites. But the bigger problem is that “wind turbine syndrome” is what is known as a “communicated” disease, says Simon Chapman, a professor of public health at the University of Sydney. The disease, which has reached epidemic proportions in Australia, “spreads via the nocebo effect by being talked about, and is thereby a strong candidate for being defined as a psychogenic condition,” Chapman wrote several months ago in The Conversation.
What Chapman is describing is a phenomenon akin to mass hysteria—an outbreak of apparent health problems that has a psychological rather than a physical basis. Such episodes have occurred throughout human history. Earlier this year, for instance, a cluster of teenagers at an upstate New York high school was suddenly afflicted with symptoms resembling Tourette syndrome, and some observers speculated that the mystery outbreak was caused by environmental contaminants.
But a doctor treating many of the students instead diagnosed them with a psychological condition called “conversion disorder,” as described by psychologist Vaughan Bell on The Crux: