By Ben Thomas
The first rat pressed a lever, anticipating the tasty reward it’d been trained to expect. An implant in the rat’s brain converted its neural activity into an electronic signal and beamed the impulse to the brain of the second rat, which leaped forward and pressed a lever in its own cage. But rat #2 had never been trained to press the lever. Its movement impulse came not from its own brain, but directly from the brain of rat #1 – despite the fact that the two were separated by thousands of miles.
What we have created, said lead researcher Miguel Nicolelis, is “a new central nervous system made of two brains.”
That advance happened in 2012, and other labs were quick to one-up Nicolelis and his team. In the summer of 2013, a team of Harvard University researchers engineered a brain-to-brain interface between a rat and a human, enabling the human to control the rat’s tail movements simply by willing them to happen.
Finally, in August 2013, University of Washington scientists Rajesh Rao and Andrea Stocco made the leap everyone was waiting for: a human-to-human brain-to-brain interface. By strapping one person into a non-invasive EEG helmet, and the second into a transcranial magnetic stimulation (TMS) helmet, the researchers mind-melded themselves – for the sake of science.
By Matthew D. Lieberman
Comedian Jerry Seinfeld used to tell the following joke: “According to most studies, people’s number one fear is public speaking. Death is number two. Does this sound right? This means to the average person, if you go to a funeral, you’re better off in the casket than doing the eulogy.”
The joke is a riff based on a privately conducted survey of 2,500 people in 1973 in which 41 percent of respondents indicated that they feared public speaking and only 19 percent indicated that they feared death. While this improbable ordering has not been replicated in most other surveys, public speaking is typically high on the list of our deepest fears. “Top ten” lists of our fears usually fall into three categories: things associated with great physical harm or death, the death or loss of loved ones, and speaking in public.
What is curious is that the person speaking probably doesn’t know or care about most of the people there. So why does it matter so much what they think? The answer is that it hurts to be rejected.
Ask yourself what have been the one or two most painful experiences of your life. Did you think of the physical pain of a broken leg or a really bad fall? My guess is that at least one of your most painful experiences involved what we might call social pain—pain of a loved one’s dying, of being dumped by someone you loved, or of experiencing some kind of public humiliation in front of others.
Why do we associate such events with the word pain? When human beings experience threats or damage to their social bonds, the brain responds in much the same way it responds to physical pain.
By Gina Perry
It’s one of the best-known psychology experiments in history – the 1961 tests in which social psychologist Stanley Milgram invited volunteers to take part in a study about memory and learning. Its actual aim, though, was to investigate obedience to authority – and Milgram reported that fully 65 percent of volunteers had repeatedly administered increasingly powerful electric shocks to a man they believed to be in severe pain.
In the decades since, the results have been held up as proof of the depths of ordinary people’s depravity in service to an authority figure. At the time, this had deep and resonant connections to the Holocaust and Nazi Germany – so resonant, in fact, that they might have led Milgram to dramatically misrepresent his hallmark findings.
Stanley Milgram framed his research from the get-go as both inspired by and an explanation of Nazi behavior. He mentioned the gas chambers in the opening paragraph of his first published article; he strengthened the link and made it more explicit twelve years later in his book, Obedience to Authority.
At the time Milgram’s research was first published, the trial of the high-profile Nazi Adolf Eichmann was still fresh in the public mind. Eichmann had been captured in Buenos Aires and smuggled out of the country to stand trial in Israel. The trial was the first of its kind to be televised.
By Richard H. Smith
Excerpted from THE JOY OF PAIN: Schadenfreude and the Dark Side of Human Nature
The editors of popular tabloid magazines such as The National Enquirer would appreciate the observations of Edmund Burke, the 18th-century philosopher and statesman. He suggested that theatergoers anticipating a tragic performance on the stage would quickly lose interest and stream out of the theater if they heard that a criminal was about to be executed in a nearby square. Burke believed that people have “a degree of delight, and that no small one, in the real misfortunes and pains of others.” Moreover, in his view, real misfortune probably trumps the “imitative arts” every time.
Some have taken this way of thinking even further. In their recent biography of Mao Tse-tung, Mao: The Unknown Story, Jung Chang and Jon Halliday make a persuasive case that Mao was someone who took a special joy “in upheaval and destruction.” But Mao also believed that he was not alone in this preference. For instance, he claimed that most people would choose war over perpetual harmony:
Long-lasting peace is unendurable to human beings, and tidal waves of disturbance have to be created in this state of peace…When we look at history, we adore the times of [war] when dramas happened one after another…which make reading about them great fun. When we get to the periods of peace and prosperity, we are bored.
Still others, such as Walker Percy, have also claimed that people have a pleasure-linked fascination with disasters and calamity, at least when these things are happening to other people. The appeal of the tabloid press and the heavy coverage of crime, accidents, and natural disasters in the media testify to the validity of such claims.
By Ben Thomas
Introversion, it seems, is the Internet’s current meme du jour. Articles on introverts are nothing new, of course—The Atlantic’s 2003 classic “Caring for Your Introvert” still gets passed around Facebook on a regular basis—but the topic has gained some sort of strange critical mass in the past few weeks, and has been popping up everywhere from Gawker to Forbes.
This latest swarm of articles ranges from glorified personality quizzes (“31 Unmistakable Signs That You’re an Introvert”) to history lessons (“16 Outrageously Successful Introverts”) to business essays (“Why Introverts Can Make Excellent Executives”) to silly, self-aware send-ups of the trend itself (“15 Unmistakable, Outrageously Secret Signs You’re an Extrovert”). The vast majority of them also come packaged with the assumption that the reader understands the basic concept of introversion, and already has a pretty clear idea of whether he or she is an introvert or an extrovert.
Scroll through the comments sections, though, and you’ll find that quite a few readers—even introverted ones—don’t appreciate being put in a labeled box. For every grateful response from a self-professed introvert, you’ll find several responses along the lines of, “No one is always extroverted and no one is always introverted,” and, “I consider myself an extrovert but a lot of these introvert traits apply to me.”
What does neuroscience have to say about all this? Do the brains of introverted people really look and behave differently from those of extroverts? And if so, what might those differences mean?
By Julian De Freitas
Imagine what it would be like to take in everything about this moment. Not only would you be aware of these words on the screen before you, but also of the location of nearby objects with centimeter precision, of the feeling of your toes in your shoes, of every creak, crack and squeak. You would be able to focus on thousands of memories at once, and track every one of your changing emotions no matter what else you were busy with.
Needless to say, your mind would overload with all this information.
So why are you still functioning? Because you have the ability to selectively process only a subset of the information-rich world at a time—that is, the ability to pay attention. Attention is the gatekeeper that “decides” what the mind processes and what it ignores. Yet we typically pay attention so naturally that we seldom pause to consider exactly how our minds accomplish this feat.
By Ben Thomas
When’s the last time you forgot your cell phone? What about your anniversary? We’ve all wished for a better memory at some point. And in those moments, what we’re generally referring to are our fact-based memory systems—the ones that store experiences and learned knowledge, the ones that can be strengthened by flash cards and mnemonic devices.
But there’s another type of memory that’s equally important, though less talked about: Working memory. Working memory holds items in the current moment, like digits in a phone number or lyrics to a song. It’s the mental capability that lets you keep multiple things in mind—say, competing goals for a project at work—and navigate a solution.
And it turns out working memory is fundamental to some of our most important cognitive abilities. People with a spacious working memory are, on average, better test-takers and better students, and they have more executive control and better problem-solving skills. A strong working memory is even correlated with greater lifetime earning potential.
Until the late 1990s, many scientists thought working memory was a static quality like IQ. But in recent years, neuroscience has revealed that this core aspect of cognition can actually be bolstered with behavioral training—and the means might be as simple as an exercise in counting cards.
By Julie Sedivy
Is the film industry guilty of lowballing the intelligence of its audience? It’s not hard to find bloggers, critics and movie insiders (including actor Colin Firth) who think so. A common criticism is that Hollywood seems to believe that viewers are bereft of any creative thought or imagination, and simply want to ingest a pasty mush of cozy clichés, simplistic story lines and cartoon characters. Audiences, the complaint goes, simply aren’t being asked to do any work. This criticism implies that being made to do some mental work is a vital part of what makes a movie rewarding and pleasurable.
Film critic Katherine Monk clearly buys into this view, but offers an original slant: in a recent article for the Vancouver Sun, she blames sophisticated visual effects technology for what she argues is the growing trend to treat viewers as passive sets of eyeballs detached from human imaginations. The problem, she writes, is that current technology has gotten too good at depicting reality, robbing us of the opportunity to construct our own with whatever materials the movie is able to offer.
Asked to describe the five senses, most of us can rattle them off without hesitation: sight, sound, smell, taste and touch. But what do those words mean, and do they mean the same thing to every person?
Take sound, for example. Randall Poster, who has worked as a music supervisor on movies ranging from School of Rock and Velvet Goldmine to Moonrise Kingdom, believes the inherent audience experience of a score or soundtrack has changed.
While researching music for the HBO series Boardwalk Empire, Poster unearthed a treasure trove of “photoplay music,” sheet music written for the musicians performing live at local nickelodeons in the silent film era. The titles of photoplay compositions—“In a Merry Mood” and “Agitato Mysterioso,” for example—reveal the emotional response the music was meant to provoke in its audience. It’s a response, however, that rests on assumptions about that audience’s culture.
“Music renders the collective psychology of the moment, of the human condition,” said Poster, speaking at a seminar Monday at the TED2013 conference in Long Beach, Calif. “But what sounded suspenseful in 1920 may not sound suspenseful today.”
By Pete Etchells
The Internet is a wonderful, terrifying thing. On the one hand, it gives us instant access to humanity’s collected knowledge, and connects us to those that we know and love. On the other, it all too often exposes an awful side to people who shroud themselves in anonymity in order to hurt others. When it comes to mental health, this darker reflection of the Internet can cause lots of serious problems. Thankfully though, an increasing number of people are exploiting the positive potential of the web, using digital tools in innovative ways to help both patients and professionals.
One such person is Kathy Griffiths. In the treatment of depression, “historically, peer-to-peer support has not been taken very seriously by professionals,” Griffiths says. She set out to change that in a recent study published in PLOS ONE, in which she and colleagues at the Australian National University in Canberra conducted a randomized controlled trial looking at the use of online support groups for treating depression.
Depressed individuals were assigned to one of four groups: an Internet Support Group, an online self-help program, both, or neither. The researchers found that the virtual support group on its own, and in combination with the web-based training tool, significantly reduced depression symptoms in participants up to a year after they’d taken part.