By Wind Goodfriend
This article originally appeared on Dr. Goodfriend’s blog “A Psychologist at the Movies.”
I’m completely obsessed with The Hunger Games. I’m not sure why. Maybe it’s because I have visited North Korea, a real country where millions of people really are dying of hunger. Maybe it’s the ironic meta-experience of watching the movie’s violence on a huge screen, when the movie’s point is that people shouldn’t watch violence on a huge screen. Regardless, The Hunger Games is chock-full of possible psychological analysis. Today I’m focusing on the fascinatingly weird emotions that spark between The Hunger Games’ two main protagonists, Peeta and Katniss.
At home, Katniss has a boyfriend, a young man named Gale. He has rugged good looks, he’s brave, and they are perfectly matched in many ways. Both Katniss and Gale fight against the system in their own way (which is increasingly seen as the trilogy continues), and he is always successful at making Katniss feel comforted in a world with no comforts.
So why does Katniss later fall for Peeta? Peeta certainly has lovable qualities – he’s smart, nurturing, and can frost a cake like nobody’s business – but he and Katniss are not exactly a natural pair. Their personalities clash, their goals in life are different, and Katniss really isn’t interested in any kind of frivolous romance. Sure, in the first movie she is ambivalent about her feelings for Peeta, the kind-hearted boy with a sexy baby-faced look. But psychology would have predicted their blossoming feelings for each other due to their experiences together in the Hunger Games. It’s all because of a phenomenon called misattribution of arousal.
Most of what I know isn’t in my head. It’s out there in my books. I know how to do a lot of integrals in calculus, for example. But, really, what I mean by that is that I know where my book of integrals is, and I know where in the book any particular method is. I know all that stuff in all those books in my house because I can find my way there.
Books in a bookshelf possess lots of visual cues, so I can quickly find my way to the right book — “Oh, it’s on the bottom left of the shelf by the window in the living room, just below that big blue art book.”
And once I find the book, when I open it up I can use visual cues within it to find my way to the right page. After all, it’s not as if I remember the page number. No, I remember roughly where it is in the book, roughly what the page looks like, and roughly what the surrounding pages might look like. Pages in a book might not initially seem to have a look, but they very often do. There are often figures, or tables, or unique and recognizable features to the way the paragraphs are aligned. These visuo-spatial cues guide me further and further along to the goal, the piece of my knowledge out there in my library.
Mess with my library and books, and you mess with my brain.
By Ben Thomas
The first rat pressed a lever, anticipating the tasty reward it’d been trained to expect. An implant in the rat’s brain converted its neural activity into an electronic signal and beamed the impulse to the brain of the second rat, which leaped forward and pressed a lever in its own cage. But rat #2 had never been trained to press the lever. Its movement impulse came not from its own brain, but directly from the brain of rat #1 – despite the fact that the two were separated by thousands of miles.
What we have created, said lead researcher Miguel Nicolelis, is “a new central nervous system made of two brains.”
That advance happened in 2012, and other labs were quick to one-up Nicolelis and his team. In the summer of 2013, a team of Harvard University researchers engineered a brain-to-brain interface between a rat and a human, enabling the human to control the rat’s tail movements simply by willing them to happen.
Finally, in August 2013, University of Washington scientists Rajesh Rao and Andrea Stocco succeeded in making the leap everyone was waiting for: a human-to-human brain-to-brain interface. By strapping one person into a non-invasive EEG helmet, and strapping the second into a transcranial magnetic stimulation (TMS) helmet, the researchers mind-melded themselves – for the sake of science.
By Matthew D. Lieberman
Comedian Jerry Seinfeld used to tell the following joke: “According to most studies, people’s number one fear is public speaking. Death is number two. Does this sound right? This means to the average person, if you go to a funeral, you’re better off in the casket than doing the eulogy.”
The joke is a riff based on a privately conducted survey of 2,500 people in 1973 in which 41 percent of respondents indicated that they feared public speaking and only 19 percent indicated that they feared death. While this improbable ordering has not been replicated in most other surveys, public speaking is typically high on the list of our deepest fears. “Top ten” lists of our fears usually fall into three categories: things associated with great physical harm or death, the death or loss of loved ones, and speaking in public.
What is curious is that the person speaking probably doesn’t know or care about most of the people there. So why does it matter so much what they think? The answer is that it hurts to be rejected.
Ask yourself what have been the one or two most painful experiences of your life. Did you think of the physical pain of a broken leg or a really bad fall? My guess is that at least one of your most painful experiences involved what we might call social pain—pain of a loved one’s dying, of being dumped by someone you loved, or of experiencing some kind of public humiliation in front of others.
Why do we associate such events with the word pain? When human beings experience threats or damage to their social bonds, the brain responds in much the same way it responds to physical pain.
By Gina Perry
It’s one of the most well-known psychology experiments in history – the 1961 tests in which social psychologist Stanley Milgram invited volunteers to take part in a study about memory and learning. Its actual aim, though, was to investigate obedience to authority – and Milgram reported that fully 65 percent of volunteers had repeatedly administered increasing electric shocks to a man they believed to be in severe pain.
In the decades since, the results have been held up as proof of the depths of ordinary people’s depravity in service to an authority figure. At the time, this had deep and resonant connections to the Holocaust and Nazi Germany – so resonant, in fact, that they might have led Milgram to dramatically misrepresent his hallmark findings.
Stanley Milgram framed his research from the get-go as both inspired by and an explanation of Nazi behavior. He mentioned the gas chambers in the opening paragraph of his first published article; he strengthened the link and made it more explicit twelve years later in his book, Obedience to Authority.
At the time Milgram’s research was first published, the trial of the high-profile Nazi Adolf Eichmann was still fresh in the public mind. Eichmann had been captured in Buenos Aires and smuggled out of the country to stand trial in Israel. The trial was the first of its kind to be televised.
By Richard H. Smith
Excerpted from THE JOY OF PAIN: Schadenfreude and the Dark Side of Human Nature
The editors of popular tabloid magazines such as The National Enquirer would appreciate the observations of Edmund Burke, the 18th-century philosopher and statesman. He suggested that theatergoers anticipating a tragic performance on the stage would quickly lose interest and empty themselves from the theater if they heard that a criminal was just about to be executed outside in a nearby square. Burke believed that people have “a degree of delight, and that no small one, in the real misfortunes and pains of others.” Moreover, in his view, real misfortune probably trumps the “imitative arts” every time.
Some have taken this way of thinking even further. In their recent biography of Mao Tse-tung, Mao: The Unknown Story, Jung Chang and Jon Halliday make a persuasive case that Mao was someone who took a special joy “in upheaval and destruction.” But Mao also believed that he was not alone in this preference. For instance, he claimed that most people would choose war over perpetual harmony:
Long-lasting peace is unendurable to human beings, and tidal waves of disturbance have to be created in this state of peace…When we look at history, we adore the times of [war] when dramas happened one after another…which make reading about them great fun. When we get to the periods of peace and prosperity, we are bored.
Still others, such as Walker Percy, have also claimed that people have a pleasure-linked fascination with disasters and calamity, at least when these things are happening to other people. The appeal of the tabloid press and the heavy coverage of crime, accidents, and natural disasters in the media testify to the validity of such claims.
By Ben Thomas
Introversion, it seems, is the Internet’s current meme du jour. Articles on introverts are nothing new, of course—The Atlantic’s 2003 classic “Caring for Your Introvert” still gets passed around Facebook on a regular basis—but the topic has gained some sort of strange critical mass in the past few weeks, and has been popping up everywhere from Gawker to Forbes.
This latest swarm of articles ranges from glorified personality quizzes (“31 Unmistakable Signs That You’re An Introvert”) to history lessons (“16 Outrageously Successful Introverts”) to business essays (“Why Introverts Can Make Excellent Executives”) to silly, self-aware send-ups of the trend itself (“15 Unmistakable, Outrageously Secret Signs You’re an Extrovert”). The vast majority of them also come packaged with the assumption that the reader understands the basic concept of introversion, and already has a pretty clear idea of whether he or she is an introvert or an extrovert.
Scroll through the comments sections, though, and you’ll find that quite a few readers—even introverted ones—don’t appreciate being put in a labeled box. For every grateful response from a self-professed introvert, you’ll find several responses along the lines of, “No one is always extroverted and no one is always introverted,” and, “I consider myself an extrovert but a lot of these introvert traits apply to me.”
What does neuroscience have to say about all this? Do the brains of introverted people really look and behave differently from those of extroverts? And if so, what might those differences mean?
By Julian De Freitas
Imagine what it would be like to take in everything about this moment. Not only would you be aware of these words on the screen before you, but also of the location of nearby objects with centimeter precision, of the feeling of your toes in your shoes, of every creak, crack and squeak. You would be able to focus on thousands of memories at once, and track every one of your changing emotions no matter what else you were busy with.
Needless to say, your mind would overload with all this information.
So why are you still functioning? Because you have the ability to selectively process only a subset of the information-rich world at a time—that is, the ability to pay attention. Attention is the gatekeeper that “decides” what the mind processes and what it ignores. Yet we typically pay attention so naturally that we seldom pause to consider exactly how our minds accomplish this feat.
by Ben Thomas
When’s the last time you forgot your cell phone? What about your anniversary? We’ve all wished for a better memory at some point. And in those moments, what we’re generally referring to are our fact-based memory systems—the ones that store experiences and learned knowledge, the ones that can be strengthened by flash cards and mnemonic devices.
But there’s another type of memory that’s equally important, though less talked about: Working memory. Working memory holds items in the current moment, like digits in a phone number or lyrics to a song. It’s the mental capability that lets you keep multiple things in mind—say, competing goals for a project at work—and navigate a solution.
And it turns out working memory is fundamental to some of our most important cognitive abilities. People with a spacious working memory are, on average, better test-takers and better students, and they have more executive control and better problem-solving skills. A strong working memory is even correlated with greater lifetime earning potential.
Until the late 1990s, many scientists thought working memory was a static quality like IQ. But in recent years, neuroscience has revealed that this core aspect of cognition can actually be bolstered with behavioral training—and the means might be as simple as an exercise in counting cards.
By Julie Sedivy
Is the film industry guilty of lowballing the intelligence of its audience? It’s not hard to find bloggers, critics and movie insiders (including actor Colin Firth) who think so. A common criticism is that Hollywood seems to believe that viewers are bereft of any creative thought or imagination, and simply want to ingest a pasty mush of cozy clichés, simplistic story lines and cartoon characters. Audiences, the complaint goes, simply aren’t being asked to do any work. This criticism implies that being made to do some mental work is a vital part of what makes a movie rewarding and pleasurable.
Film critic Katherine Monk clearly buys into this view, but offers an original slant: in a recent article for the Vancouver Sun, she blames sophisticated visual effects technology for what she argues is the growing trend to treat viewers as passive sets of eyeballs detached from human imaginations. The problem, she writes, is that current technology has gotten too good at depicting reality, robbing us of the opportunity to construct our own with whatever materials the movie is able to offer.