In the South Pacific there is a place so remote that few people have ever heard of it, let alone seen it: the Trobriand Islands. The Trobriands are located off the east coast of Papua New Guinea, and no white man had set foot there until the late 1700s. During World War I, however, the islands were visited by a man who would one day become a legend in the field of anthropology, Bronislaw Malinowski. Malinowski was a stork of a man—thin, pale, and balding—often seen wearing a pith helmet and socks up to his knees. He had terrible eyesight, was a hypochondriac, an insomniac, and on top of it all had a strong fear of the tropics—in particular, an abhorrence of the heat and the sultriness; to cope, he gave himself injections of arsenic.
Malinowski was, nonetheless, a keen observer of humankind. And as he watched the Trobriand Islanders go about their lives, he noticed something odd. When the islanders went fishing their behavior changed, depending on where they fished. When they fished close to shore—where the waters were calm, the fishing was consistent, and the risk of disaster was low—superstitious behavior among them was nearly nonexistent.
But when the fishermen sailed for open seas—where they were far more vulnerable and their prospects far less certain—their behavior shifted. They became very superstitious, often engaging in elaborate rituals to ensure success. In other words, a low sense of control had produced a high need for superstition. One, in effect, substituted for the other.
When Linda May went in to see her obstetrician during her first pregnancy, he told her she probably shouldn’t jump, run, or even walk. But May, an exercise physiologist who studies pregnant women and their babies, knew a thing or two about the positive ways that being active can help a mom-to-be’s health. Women who exercise with baby on board have been known to have, among other things, lower risks of gestational diabetes and pregnancy-induced high blood pressure than those who don’t.
Since then, May and other researchers have discovered even more ways that prenatal exercise benefits not only an expectant mother, but her growing baby, too—sometimes for years into the future—as attendees learned at last week’s Experimental Biology 2014 meeting in San Diego.
Decades ago, many doctors gave advice similar to that of May’s obstetrician. Pregnancy was thought to be almost like an illness, a time when women needed to rest to protect themselves and their babies. In 1985, the American Congress of Obstetricians and Gynecologists came out with its first set of guidelines for exercise during pregnancy—guidelines, now considered conservative, that included suggestions like keeping strenuous activities to 15 minutes or less.
Since then, research has turned that idea on its head. Exercise is now thought to be—for most women with healthy pregnancies—a boon for the mother’s health, and for the baby she carries as well. Researchers are now starting to look even more closely at how exercise can influence a baby’s health in the womb and how these effects might translate into protection from future health problems.
Excerpted from You Are Here by Hiawatha Bray
These days new smartphone apps all seem to want the same thing from us—our latitude and longitude. According to a 2012 report from the Pew Research Center’s Internet and American Life Project, three-quarters of America’s smartphone owners use their devices to retrieve information related to their location—driving directions, dining suggestions, weather updates, the nearest ATM. Such location data is a boon to advertisers, who use information on our movements to discern our habits and interests, and then target ads to us.
With location-aware smartphones, advertisers can transcend the merely local. They can begin beaming us hyperlocal advertising, tailored not just to the city, but to a particular city block. The idea is called “geofencing,” an unfortunate name choice that evokes the ankle bracelets sometimes worn by accused criminals under constant surveillance. The earliest such devices fenced in the user by transmitting a radio signal to a box connected to his home telephone line. If the suspect left the building, the radio signal would fade, and the box would place an automated phone call to the cops.
With the addition of GPS and cellular technology, later versions of ankle bracelet technology allowed a greater measure of mobility. A judge might grant a criminal suspect permission to go to her job, her church, and her local supermarket, with each approved location plugged into the court’s computer system. Data from the ankle-strapped GPS could confirm that the suspect was staying out of mischief or send a warning to police when she paid an unauthorized visit to the local dive bar.
Geofencing also has uses for the law abiding. A company called Life360 uses it to help parents keep tabs on their kids. The service homes in on location data from a child’s phone and sends a digital message whenever the kid arrives at home or at school—and whenever he leaves. Stroll off campus at ten in the morning, and the parents instantly know. As of late 2012, Life360 had signed up about 25 million users.
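The mechanics behind a service like this are simple: compare the phone’s reported coordinates against a circular “fence” around a saved location. A minimal sketch in Python, using the haversine great-circle formula (the coordinates, fence radius, and function names here are illustrative, not taken from any real geofencing product):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ≈ 6371 km

def inside_geofence(lat, lon, fence_lat, fence_lon, radius_km):
    """True if the reported position falls within the circular fence."""
    return haversine_km(lat, lon, fence_lat, fence_lon) <= radius_km

# A hypothetical 200 m fence around a school at (40.7128, -74.0060):
print(inside_geofence(40.7129, -74.0061, 40.7128, -74.0060, 0.2))  # a few meters away: True
print(inside_geofence(40.7300, -74.0060, 40.7128, -74.0060, 0.2))  # ~2 km away: False
```

A real service would also debounce noisy GPS fixes before firing an “arrived” or “left” alert, but the core test is just this distance comparison.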
It’s long been known that blind people are able to compensate for their loss of sight by using other senses, relying on sound and touch to help them “see” the world. Neuroimaging studies have backed this up, showing that in blind people brain regions devoted to sight become rewired to process touch and sound as visual information.
Now, in the age of Google Glass, smartphones and self-driving cars, new technology offers ever more advanced ways of substituting one sensory experience for another. These exciting new devices can restore sight to the blind in ways never before thought possible.
One approach is to use sound as a stand-in for vision. In a study published in Current Biology, neuroscientists at the Hebrew University of Jerusalem used a “sensory substitution device” dubbed “the vOICe” (Oh, I See!) to enable congenitally blind patients to see using sound. The device translates visual images into brief bursts of music, which the participants then learn to decode.
Over a series of training sessions they learn, for example, that a short, loud synthesizer sound signifies a vertical line, while a longer burst equates to a horizontal one. Ascending and descending tones reflect the corresponding directions, and pitch and volume relay details about elevation and brightness. Layering these sound qualities and playing several in sequence (each burst lasts about one second) thus gradually builds an image as simple as a basic shape or as complex as a landscape.
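The mapping described above can be made concrete with a toy sketch: scan an image left to right, one column per time step, letting row position set pitch and brightness set volume. This is a simplified illustration of the general idea, not the vOICe’s actual algorithm, and all names and frequency ranges here are assumptions:

```python
import numpy as np

def image_to_soundscape(img, f_low=500.0, f_high=5000.0):
    """Map a grayscale image (rows x cols, values 0..1) to a list of tone frames.

    Columns are scanned left to right over the roughly one-second burst; each
    bright pixel contributes a component whose frequency encodes its row
    (higher in the image = higher pitch) and whose amplitude encodes brightness.
    """
    rows, cols = img.shape
    # Exponentially spaced frequencies; row 0 (top of image) gets the highest pitch.
    freqs = f_low * (f_high / f_low) ** np.linspace(1, 0, rows)
    soundscape = []
    for t in range(cols):  # one time step per image column
        col = img[:, t]
        components = [(float(freqs[r]), float(col[r])) for r in range(rows) if col[r] > 0]
        soundscape.append((t, components))
    return soundscape

# A vertical line lights one column: many pitches sound in a single brief frame.
vertical = np.zeros((4, 4)); vertical[:, 1] = 1.0
# A horizontal line lights one row: the same pitch sustained across every frame.
horizontal = np.zeros((4, 4)); horizontal[2, :] = 1.0
```

Run on these two test images, the vertical line yields one frame containing all four frequencies, while the horizontal line yields the same single frequency in every frame, matching the short-loud-burst versus sustained-tone distinction the trainees learn.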
The concept has tried-and-true analogs in the animal world, says Dr. Amir Amedi, the lead researcher on the study. “The idea is to replace information from a missing sense by using input from a different sense. It’s just like bats and dolphins use sounds and echolocation to ‘see’ using their ears.”
This article was originally published on The Conversation.
Last week, scientists announced the discovery of Kepler-186f, a planet 492 light years away in the Cygnus constellation. Kepler-186f is special because it marks the first planet almost exactly the same size as Earth orbiting in the “habitable zone” – the distance from a star in which we might expect liquid water, and perhaps life.
What did not make the news, however, is that this discovery also slightly increases how much credence we give to the possibility of near-term human extinction. This is because of a concept known as the Great Filter.
The Great Filter is an argument that attempts to resolve the Fermi Paradox: why have we not found aliens, despite the existence of hundreds of billions of solar systems in our galactic neighborhood in which life might evolve? As the namesake physicist Enrico Fermi noted, it seems rather extraordinary that not a single extraterrestrial signal or engineering project has been detected (UFO conspiracy theorists notwithstanding).
This apparent absence of thriving extraterrestrial civilizations suggests that at least one of the steps from humble planet to interstellar civilization is exceedingly unlikely. Either intelligent life is extremely rare, or it has a tendency to go extinct. This bottleneck for the emergence of alien civilizations from any one of the many billions of candidate planets is referred to as the Great Filter.
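The logic is easy to see with a toy calculation: the expected number of civilizations is the number of planets times the product of the per-step success probabilities, so a single very improbable step collapses the whole product. Every number below is an illustrative assumption, not a measured value:

```python
# Assumed probability that a planet clears each step on the path from
# barren rock to detectable interstellar civilization (all illustrative).
steps = {
    "suitable star and planet": 0.1,
    "abiogenesis":              1e-6,   # a candidate Great Filter
    "complex life":             0.01,
    "intelligence":             0.01,
    "avoids self-extinction":   0.1,
}

n_planets = 1e11  # order-of-magnitude planet count for one galaxy

p_civilization = 1.0
for p in steps.values():
    p_civilization *= p

expected_civilizations = n_planets * p_civilization
# Even with a hundred billion planets, one very improbable step leaves
# fewer than one expected civilization per galaxy.
print(f"P(civilization) = {p_civilization:.2e}")
print(f"Expected civilizations: {expected_civilizations:.2e}")
```

The unsettling corollary the article draws: if the improbable step lies behind us, we were lucky; if it lies ahead (the “avoids self-extinction” factor), the silence of the galaxy is bad news for us.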
This post was originally published at The American Scholar.
Gestures are simple enough. Right? A spontaneous but well-timed wave can emphasize an idea, brush aside a compliment, or point out a barely obscured bird’s nest to an obtuse friend. We use gestures to help our listeners follow along, and we make ourselves look warm and magnanimous in the process.
But on the other hand—and when you’re talking about hands, the puns come furiously—sometimes gestures seem to have nothing to do with edifying or impressing. We gesture on the phone, and in the dark, and when we talk to ourselves. Blind people gesture to other blind people. A growing body of research suggests that we gesture in part for the cognitive benefits that it affords us. Tell us not to use the letter r, or challenge us to adopt a more obscure vocabulary, and our gesture use jumps.
Some people call left-handers southpaws. Others call them mollydookers or corky dobbers. Scientists still often call lefties sinister, which in Latin originally just meant “left” but later came to be associated with evil.
Wondering about the medical implications of being born a corky dobber? It may surprise you that left-handed women have been found to be at least twice as likely as right-handers to develop premenopausal breast cancer. And a few researchers believe this effect may be linked to exposure to certain chemicals in utero, which could affect your genes and set the stage for both left-handedness and cancer susceptibility, opening up another possibility of nurture changing nature.
When it comes to our hands, feet, and even our eyes, most human beings are right-side dominant. Now, you might think that footedness and handedness are always aligned, but as it turns out, that’s not always the case for right-handed people, and it’s even less common for left-handed people. Lots of people aren’t congruent.
In board sports, being left-foot dominant is termed goofy – a goofy-footed surfer stands with her left foot on the back of the board instead of her right. There are an amazing number of theories as to why some of us are goofy-footed. But the term itself is often said to have originated with an eight-minute-long Walt Disney animated short, called Hawaiian Holiday, that was first released to theaters in 1937. The color cartoon stars the usual suspects: Mickey and Minnie, Pluto and Donald, and, of course, Goofy. During the gang’s vacation in Hawaii, Goofy attempts to surf, and when he finally catches a wave and heads back to shore atop its short-lived crest, he’s standing with his right foot forward and his left foot back.
If you’re wondering if you might be goofy and would like to find out before hitting the beach, then imagine yourself at the bottom of a staircase that you’re about to ascend. Which foot moves first? If you’re taking that first imaginary step with your left foot, then it’s likely that you’re a member of the goofy-footed club. And if you find out that you aren’t goofy, then you’re in the majority.
“Interdisciplinary” is a huge buzzword in academia right now. But in science, the approach has a long history of success. Some of the best science happens when researchers cross-pollinate, applying knowledge from other fields to inform their research.
One of the best such examples in physics was the concept of a Higgs field, which led to the 2013 Nobel Prize in physics. Few people outside the physics community know that the insight into the behavior of the proposed Higgs particle actually came from solid-state physics, a branch of study that looks at the processes that take place inside condensed matter such as a superconductor.
Now cosmologists are trying to borrow some ideas of their own. The recent discovery of gravitational waves — the biggest news in cosmology this century — focuses fresh attention on a field in which progress has otherwise been slow. Cosmologists are now attempting to explore novel ways of understanding what happened in the Big Bang, and what, if anything, caused the gargantuan explosion believed to have launched our universe on its way. To do so they’ve turned their attention to areas of physics far removed from outer space: hydrodynamics and turbulence. The idea is pretty clever: to view the universe as an ocean.
I tried not to panic. I was floating effortlessly in a pitch-black tank filled with salty, skin-temperature water, wearing earplugs and nothing else. Within minutes I could no longer feel the sponge in my ears or smell the musty scent of water. There was no light, no smell, no touch and – save for the gasping of my breath and drumming of my heart – no sound.
I was trying out North America’s avant-garde drug: sensory deprivation. Across the continent, “float houses” are increasing in popularity, offering eager psychonauts a chance to explore this unique state of mind. Those who run these businesses are quick to list the health benefits of frequent “floats”, which range from the believable – relaxation, heightened senses, pain management – to the seemingly nonsensical (“deautomatization”, whatever that means). Are these proclaimed benefits backed by science, or are they simply new-age hogwash?
Why would anyone willingly subject themselves to sensory deprivation? You’ve probably heard the horror stories: the Chinese using restricted stimulation to “brainwash” prisoners of war during the Korean War; prisons employing solitary confinement as psychological torture. Initial research studies into the psychophysical effects of sensory deprivation, carried out in the 1950s at McGill University, further damaged its reputation, reporting slower cognitive processing, hallucinations, mood swings and anxiety attacks among the participants. Some researchers even considered sensory deprivation an experimental model of psychosis.
However, despite popular belief, sensory deprivation is not inherently unpleasant. According to Dr. Peter Suedfeld, a pioneering psychologist in the field, these stories are rubbish. “(The prisoners) were bombarded with overstimulation – loud group harangues, beatings and other physical tortures,” he explained. Similarly, the original studies at McGill University used constant noise and white light – that is, sensory overload – rather than deprivation.
In fact, an analysis in 1997 of well over 1,000 descriptions of sensory deprivation indicated that more than 90% of subjects found it deeply relaxing. To escape the provocative name of “sensory deprivation” and its negative connotations, in the late 1970s Suedfeld’s protégé, Dr. Roderick Borrie, redubbed the experience with a friendlier name: REST, or Restricted Environmental Stimulation Therapy.
Today, the two most frequently used REST methods are chamber REST, which involves the participant lying on a bed in a dark, soundproof room, and flotation REST, which involves floating in buoyant liquid in a light- and sound-proof tank. The latter, first developed by John Lilly in the 1970s and now widely commercialized, is what I decided to experience myself.
A version of this article originally appeared at The Conversation.
There could be a way of predicting which children will go on to have low intelligence – and perhaps of preventing it – according to the findings of a study that researchers at Cardiff University presented on Monday. They discovered that children with two copies of a common gene variant (Thr92Ala), together with low levels of thyroid hormone, are four times more likely to have a low IQ. This combination occurs in about 4% of the UK population.
Importantly, if you had just one of these factors, but not both, there did not appear to be an increased risk of low intelligence. These are early results, but they suggest that it might be possible to treat children early with thyroid hormone supplementation to enhance their intelligence. This raises many ethical issues.
A common objection is that being smarter does not make your life better. In this study, researchers were concerned with those with an IQ between 70 and 85. Below 70 is classified as intellectual disability, but an IQ of 70 to 75 is comparable to mild intellectual disability.
Even for individuals with an IQ between 75 and 90 there are still significant disadvantages. Job opportunities tend to be the least desirable and least financially rewarding, requiring significant oversight. More than half the people with this IQ level fail to reach the minimum recruitment standards for the US military. Individuals with this lower level of intelligence are at significant risk of living in poverty (16%), being a chronic welfare dependent (17%) and dropping out of school (35%) compared to individuals with average intelligence. Studies show that they also face an increased risk of incarceration and being murdered.
Linda Gottfredson, who’s undertaken much of this research, concludes that at the very least, “an IQ of 75 is perhaps the most important threshold in modern life”. So it is clear that those of low-normal intelligence, although not classified as disabled, are significantly disadvantaged.
If we could enhance their intelligence, say with thyroid hormone supplementation, we should.