By Gina Perry
It’s one of the most well-known psychology experiments in history – the 1961 tests in which social psychologist Stanley Milgram invited volunteers to take part in a study about memory and learning. Its actual aim, though, was to investigate obedience to authority – and Milgram reported that fully 65 percent of volunteers had repeatedly administered increasing electric shocks to a man they believed to be in severe pain.
In the decades since, the results have been held up as proof of the depths of ordinary people’s depravity in service to an authority figure. At the time, this had deep and resonant connections to the Holocaust and Nazi Germany – so resonant, in fact, that they might have led Milgram to dramatically misrepresent his hallmark findings.
Stanley Milgram framed his research from the get-go as both inspired by and an explanation of Nazi behavior. He mentioned the gas chambers in the opening paragraph of his first published article; he strengthened the link and made it more explicit twelve years later in his book, Obedience to Authority.
At the time Milgram’s research was first published, the trial of the high-profile Nazi Adolf Eichmann was still fresh in the public mind. Eichmann had been captured in Buenos Aires and smuggled out of the country to stand trial in Israel. The trial was the first of its kind to be televised.
By Richard H. Smith
Excerpted from THE JOY OF PAIN: Schadenfreude and the Dark Side of Human Nature
The editors of popular tabloid magazines such as The National Enquirer would appreciate the observations of Edmund Burke, the 18th-century philosopher and statesman. He suggested that theatergoers anticipating a tragic performance on the stage would quickly lose interest and empty out of the theater if they heard that a criminal was about to be executed in a nearby square. Burke believed that people have “a degree of delight, and that no small one, in the real misfortunes and pains of others.” Moreover, in his view, real misfortune probably trumps the “imitative arts” every time.
Some have taken this way of thinking even further. In their recent biography of Mao Tse-tung, Mao: The Unknown Story, Jung Chang and Jon Halliday make a persuasive case that Mao was someone who took a special joy “in upheaval and destruction.” But Mao also believed that he was not alone in this preference. For instance, he claimed that most people would choose war over perpetual harmony:
Long-lasting peace is unendurable to human beings, and tidal waves of disturbance have to be created in this state of peace…When we look at history, we adore the times of [war] when dramas happened one after another…which make reading about them great fun. When we get to the periods of peace and prosperity, we are bored.
Still others, such as Walker Percy, have also claimed that people have a pleasure-linked fascination with disasters and calamity, at least when these things are happening to other people. The appeal of the tabloid press and the heavy coverage of crime, accidents, and natural disasters in the media testify to the validity of such claims.
By Ben Thomas
Introversion, it seems, is the Internet’s current meme du jour. Articles on introverts are nothing new, of course—The Atlantic’s 2003 classic “Caring for Your Introvert” still gets passed around Facebook on a regular basis—but the topic has gained some sort of strange critical mass in the past few weeks, and has been popping up everywhere from Gawker to Forbes.
This latest swarm of articles ranges from glorified personality quizzes (“31 Unmistakable Signs That You’re An Introvert”) to history lessons (“16 Outrageously Successful Introverts”) to business essays (“Why Introverts Can Make Excellent Executives”) to silly, self-aware send-ups of the trend itself (“15 Unmistakable, Outrageously Secret Signs You’re an Extrovert”). The vast majority of them also come packaged with the assumption that the reader understands the basic concept of introversion and already has a pretty clear idea of whether he or she is an introvert or an extrovert.
Scroll through the comments sections, though, and you’ll find that quite a few readers—even introverted ones—don’t appreciate being put in a labeled box. For every grateful response from a self-professed introvert, you’ll find several responses along the lines of, “No one is always extroverted and no one is always introverted,” and, “I consider myself an extrovert but a lot of these introvert traits apply to me.”
What does neuroscience have to say about all this? Do the brains of introverted people really look and behave differently from those of extroverts? And if so, what might those differences mean?
By Julian De Freitas
Imagine what it would be like to take in everything about this moment. Not only would you be aware of these words on the screen before you, but also of the location of nearby objects with centimeter precision, of the feeling of your toes in your shoes, of every creak, crack and squeak. You would be able to focus on thousands of memories at once, and track every one of your changing emotions no matter what else you were busy with.
Needless to say, your mind would overload with all this information.
So why are you still functioning? Because you have the ability to selectively process only a subset of the information-rich world at a time—that is, the ability to pay attention. Attention is the gatekeeper that “decides” what the mind processes and what it ignores. Yet we typically pay attention so naturally that we seldom pause to consider exactly how our minds accomplish this feat.
by Ben Thomas
When’s the last time you forgot your cell phone? What about your anniversary? We’ve all wished for a better memory at some point. And in those moments, what we’re generally referring to are our fact-based memory systems—the ones that store experiences and learned knowledge, the ones that can be strengthened by flash cards and mnemonic devices.
But there’s another type of memory that’s equally important, though less talked about: working memory. Working memory holds items in the current moment, like digits in a phone number or lyrics to a song. It’s the mental capability that lets you keep multiple things in mind—say, competing goals for a project at work—and navigate a solution.
And it turns out working memory is fundamental to some of our most important cognitive abilities. People with a spacious working memory are, on average, better test-takers and better students, and they have more executive control and better problem-solving skills. A strong working memory is even correlated with greater lifetime earning potential.
Until the late 1990s, many scientists thought working memory was a static quality like IQ. But in recent years, neuroscience has revealed that this core aspect of cognition can actually be bolstered with behavioral training—and the means might be as simple as an exercise in counting cards.
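The article doesn’t spell out the card exercise, but one plausible version is the classic Hi-Lo running count, which forces you to hold and continuously update a single number in mind as cards go by—exactly the kind of load that taxes working memory. Here is a minimal sketch, assuming the standard Hi-Lo scheme (low cards count +1, middle cards 0, high cards −1); the function names `running_count` and `drill` are illustrative, not from the article or any study it cites.

```python
import random

# Hi-Lo card-counting values: low cards (2-6) raise the count,
# high cards (10, J, Q, K, A) lower it, middle cards (7-9) are neutral.
HI_LO = {r: +1 for r in ["2", "3", "4", "5", "6"]}
HI_LO.update({r: 0 for r in ["7", "8", "9"]})
HI_LO.update({r: -1 for r in ["10", "J", "Q", "K", "A"]})

def running_count(cards):
    """Return the Hi-Lo running count after seeing a sequence of ranks."""
    return sum(HI_LO[c] for c in cards)

def drill(n_cards=10, seed=None):
    """Deal n random ranks; the trainee tracks the count in their head,
    then checks their answer against the true running count."""
    rng = random.Random(seed)
    cards = [rng.choice(list(HI_LO)) for _ in range(n_cards)]
    return cards, running_count(cards)
```

The training pressure comes from the update step: each new card demands that the old count be retrieved, revised, and re-stored before the next card appears, which is the same maintain-and-manipulate cycle that working-memory tasks in the lab are built around.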
By Julie Sedivy
Is the film industry guilty of lowballing the intelligence of its audience? It’s not hard to find bloggers, critics and movie insiders (including actor Colin Firth) who think so. A common criticism is that Hollywood seems to believe that viewers are bereft of any creative thought or imagination, and simply want to ingest a pasty mush of cozy clichés, simplistic story lines and cartoon characters. Audiences, the complaint goes, simply aren’t being asked to do any work. This criticism implies that being made to do some mental work is a vital part of what makes a movie rewarding and pleasurable.
Film critic Katherine Monk clearly buys into this view, but offers an original slant: in a recent article for the Vancouver Sun, she blames sophisticated visual effects technology for what she argues is the growing trend to treat viewers as passive sets of eyeballs detached from human imaginations. The problem, she writes, is that current technology has gotten too good at depicting reality, robbing us of the opportunity to construct our own with whatever materials the movie is able to offer.
Asked to describe the five senses, most of us can rattle them off without hesitation: sight, sound, smell, taste and touch. But what do those words mean, and do they mean the same thing to every person?
Take sound, for example. Randall Poster, who has worked as a music supervisor on movies ranging from School of Rock and Velvet Goldmine to Moonrise Kingdom, believes the inherent audience experience of a score or soundtrack has changed.
While researching music for the HBO series Boardwalk Empire, Poster unearthed a treasure trove of “photoplay music,” sheet music written for the musicians performing live at local nickelodeons in the silent film era. The titles of photoplay compositions—“In a Merry Mood” and “Agitato Mysterioso,” for example—reveal the emotional response the music was meant to provoke in the audience. It’s a response, however, that makes assumptions about its audience’s culture.
“Music renders the collective psychology of the moment, of the human condition,” said Poster, speaking at a seminar Monday at the TED2013 conference in Long Beach, Calif. “But what sounded suspenseful in 1920 may not sound suspenseful today.”
By Pete Etchells
The Internet is a wonderful, terrifying thing. On the one hand, it gives us instant access to literally all of humanity’s collected knowledge, and connects us to those that we know and love. On the other, it all too often exposes an awful side to people who shroud themselves in anonymity in order to hurt others. When it comes to mental health, this darker reflection of the Internet can cause lots of serious problems. Thankfully though, an increasing number of people are exploiting the positive potential of the web, using digital tools in innovative ways to help both patients and professionals.
One such person is Kathy Griffiths. In the treatment of depression, “historically, peer-to-peer support has not been taken very seriously by professionals,” Griffiths says. She set out to change that in a recent study published in PLOS ONE, in which she and colleagues at the Australian National University in Canberra conducted a randomized controlled trial looking at the use of online support groups for treating depression.
Depressed individuals were assigned to one of four groups: an Internet Support Group, an online self-help program, both, or neither. The researchers found that the virtual support group on its own, and in combination with the web-based training tool, significantly reduced depression symptoms in participants up to a year after they’d taken part.
Julie Sedivy is the lead author of Sold on Language: How Advertisers Talk to You And What This Says About You. She contributes regularly to Psychology Today and Language Log. She is an adjunct professor at the University of Calgary, and can be found at juliesedivy.com and on Twitter/soldonlanguage.
These days, I can’t seem to keep straight which of his friends my son is hanging out with on any given day—was it Jason, Jaden, Hayden, or Aidan? Their names all have a way of blurring together. My confusion reflects a growing trend for American boys’ names to sound more and more alike, according to a recent New York Times piece reporting on data gathered by Laura Wattenberg of BabyNameWizard.com.
It’s not as if the pool of available names is shrinking, though. Quite the opposite. A couple of generations ago, parents mostly stuck with a handful of tried-and-true classics (James, Richard, William); the ten most common names were shared by more than a third of boys in 1950. These days, only nine percent of boys sport the ten most common names. But this recent burst of innovation in names shows more restraint than variety when it comes to their sounds. For example, 36 percent of newborn American boys have names that end in “n”, as compared to just 14 percent in 1950.
This might seem paradoxical, but in fact, it’s a fairly typical aspect of name invention (as my co-author Greg Carlson and I have discussed in our book Sold on Language). When creating a new word of any sort, whether it’s a common noun, verb, baby name or even a brand name, there’s a tendency to gravitate toward known sound patterns. Truly original names, like Quatergork or Ponveen, haven’t yet made it into my son’s social circle. Novelty, it seems, thrives best when it’s a variation on the familiar.
Mark Changizi is an evolutionary neurobiologist and director of human cognition at 2AI Labs. He is the author of The Brain from 25,000 Feet, The Vision Revolution, and his newest book, Harnessed: How Language and Music Mimicked Nature and Transformed Ape to Man.
The human fascination with color never ceases to amaze me. Our perceptual experience is filled with shapes and pitches and textures and timbres and depths and on and on, yet color seems to get the lion’s share of our excitement and philosophical attention. Color seems somehow more artistic than our other perceptual dimensions; it’s simply wonderful to behold, as evinced by the double rainbow guy; and we can’t resist wondering what it would be like to see dimensions of color beyond our own. In fact, RadioLab recently put out a great show on color that nicely conveys the romance we all have toward it.
Question is: Why do we find color so enthralling? One of the reasons may be that the world can seem arbitrarily labeled in color, as if a painter dabbed over everything in order to make it beautiful… and that naturally makes us wonder what a different artist might do. What sort of splendor is a bird—who has an extra dimension of color beyond ours—treated to, for example?
While I, too, feel the wonder of color, I don’t share this above intuition about color and its arbitrariness. It’s an unfortunate intuition, one that seeps its way not only into the minds of laymen, but into our “enhancement” products and even the hallowed halls of philosophy. In trying to explain what’s wrong with the intuition, let me begin with a thought experiment concerning a product that gives the wearer “shape enhancement” vision.