Some people call left-handers southpaws. Others call them mollydookers or corky dobbers. Scientists still often call lefties sinister, which in Latin originally just meant “left” but later came to be associated with evil.
Wondering about the medical implications of being born a corky dobber? It may surprise you that left-handed women were found to be more than twice as likely as right-handers to develop premenopausal breast cancer. And a few researchers believe this effect may be linked to exposure to certain chemicals in utero, affecting your genes and setting the stage for both left-handedness and cancer susceptibility, thus opening up another possibility of nurture changing nature.
When it comes to our hands, feet, and even our eyes, most human beings are right-side dominant. Now, you might think that footedness and handedness always go together, but it turns out they don’t always match in right-handed people, and the match is even less common among left-handed people. Lots of people aren’t congruent.
In board sports, being left-foot dominant is termed goofy – a goofy-footed surfer stands with her left foot on the back of the board instead of her right. There are an amazing number of theories as to why some of us are goofy-footed. But the term itself is often said to have originated with an eight-minute-long Walt Disney animated short, called Hawaiian Holiday, that was first released to theaters in 1937. The color cartoon stars the usual suspects: Mickey and Minnie, Pluto and Donald, and, of course, Goofy. During the gang’s vacation in Hawaii, Goofy attempts to surf, and when he finally catches a wave and heads back to shore atop its short-lived crest, he’s standing with his right foot forward and his left foot back.
If you’re wondering if you might be goofy and would like to find out before hitting the beach, then imagine yourself at the bottom of a staircase that you’re about to ascend. Which foot moves first? If you’re taking that first imaginary step with your left foot, then it’s likely that you’re a member of the goofy-footed club. And if you find out that you aren’t goofy, then you’re in the majority.
“Interdisciplinary” is a huge buzzword in academia right now. But in science it has a long history of success. Some of the best science happens when researchers cross-pollinate, applying knowledge from other fields to inform their research.
One of the best such examples in physics was the concept of a Higgs field, which led to the 2013 Nobel Prize in physics. Few people outside the physics community know that the insight into the behavior of the proposed Higgs particle actually came from solid-state physics, a branch of study that looks at the processes that take place inside condensed matter such as a superconductor.
Now cosmologists are trying to borrow some ideas of their own. The recent discovery of gravitational waves – the biggest news in cosmology this century – has focused fresh attention on a field in which progress has otherwise been slow. Cosmologists are now exploring novel ways of understanding what happened in the Big Bang, and what, if anything, caused the gargantuan explosion believed to have launched our universe on its way. To do so they’ve turned their attention to areas of physics far removed from outer space: hydrodynamics and turbulence. The idea is pretty clever: to view the universe as an ocean.
I tried not to panic. I was floating effortlessly in a pitch-black tank filled with salty, skin-temperature water, wearing earplugs and nothing else. Within minutes I could no longer feel the sponge in my ears or smell the musty scent of water. There was no light, no smell, no touch and – save for the gasping of my breath and drumming of my heart – no sound.
I was trying out North America’s avant-garde drug: sensory deprivation. Across the continent, “float houses” are increasing in popularity, offering eager psychonauts a chance to explore this unique state of mind. Those running these businesses are quick to list the health benefits of frequent “floats”, which range from the believable – relaxation, heightened senses, pain management – to the seemingly nonsensical (“deautomatization”, whatever that means). Are these proclaimed benefits backed up by science, or are they simply new-age hogwash?
Why would anyone willingly subject themselves to sensory deprivation? You’ve probably heard the horror stories: the Chinese using restricted stimulation to “brainwash” prisoners of war during the Korean War; prisons employing solitary confinement as psychological torture. Early research into the psychophysical effects of sensory deprivation, carried out in the 1950s at McGill University, further damaged its reputation, reporting slower cognitive processing, hallucinations, mood swings and anxiety attacks among the participants. Some researchers even considered sensory deprivation an experimental model of psychosis.
However, despite popular belief, sensory deprivation is not inherently unpleasant. According to Dr. Peter Suedfeld, a pioneering psychologist in the field, these stories are rubbish. “(The prisoners) were bombarded with overstimulation – loud group harangues, beatings and other physical tortures,” he explained. Similarly, the original studies at McGill University used constant noise and white light – that is, sensory overload – rather than deprivation.
In fact, an analysis in 1997 of well over 1,000 descriptions of sensory deprivation indicated that more than 90% of subjects found it deeply relaxing. To escape the provocative name of “sensory deprivation” and its negative connotations, in the late 1970s Suedfeld’s protégé, Dr. Roderick Borrie, redubbed the experience with a friendlier name: REST, or Restricted Environmental Stimulation Therapy.
Today, the two most frequently used REST methods are chamber REST, which involves the participant lying on a bed in a dark, soundproof room, and flotation REST, which involves floating in buoyant liquid in a light- and sound-proof tank. The latter, first developed by John Lilly in the 1970s and now widely commercialized, is what I decided to experience myself.
A version of this article originally appeared at The Conversation.
There could be a way of predicting – and preventing – which children will go on to have low intelligence, according to the findings of a study researchers at Cardiff University presented on Monday. They discovered that children with two copies of a common gene variant (Thr92Ala), together with low levels of thyroid hormone, are four times more likely to have a low IQ. This combination occurs in about 4% of the UK population.
Importantly, children who had just one of these factors, but not both, did not appear to be at increased risk of low intelligence. These are early results, but they suggest that it might be possible to treat children early with thyroid hormone supplementation to enhance their intelligence. This raises many ethical issues.
A common objection is that being smarter does not make your life better. In this study, researchers were concerned with those with an IQ between 70 and 85. An IQ below 70 is classified as intellectual disability, while an IQ of 70 to 75 is comparable to mild intellectual disability.
Even for individuals with an IQ between 75 and 90 there are still significant disadvantages. Job opportunities tend to be the least desirable and least financially rewarding, requiring significant oversight. More than half the people with this IQ level fail to reach the minimum recruitment standards for the US military. Individuals with this lower level of intelligence are at significant risk of living in poverty (16%), being a chronic welfare dependent (17%) and dropping out of school (35%) compared to individuals with average intelligence. Studies show that they also face an increased risk of incarceration and being murdered.
Linda Gottfredson, who’s undertaken much of this research, concludes that at the very least, “an IQ of 75 is perhaps the most important threshold in modern life”. So it is clear that those of low-normal intelligence, although not classified as disabled, are significantly disadvantaged.
If we could enhance their intelligence, say with thyroid hormone supplementation, we should.
In 1917, a year after Albert Einstein’s general theory of relativity was published—but still two years before he would become the international celebrity we know—Einstein chose to tackle the entire universe. For anyone else, this might seem an exceedingly ambitious task—but this was Einstein.
Einstein began by applying his field equations of gravitation to what he considered to be the entire universe. The field equations were the mathematical essence of his general theory of relativity, which extended Newton’s theory of gravity to realms where speeds approach that of light and masses are very large. But his math was better than he wanted to believe—his equations told him that the universe could not stay static: it had to either expand or contract. Einstein chose to ignore what his mathematics was telling him.
The story of Einstein’s solution to this problem—the maligned “cosmological constant” (also called lambda)—is well known in the history of science. But this story, it turns out, has a different ending than everyone thought: Einstein late in life returned to considering his disgraced lambda. And his conversion foretold lambda’s use in an unexpected new setting, with immense relevance to a key conundrum in modern physics and cosmology: dark energy.
There’s something rejuvenating about escaping civilization for the quiet isolation of unadulterated wilderness. But could you leave it all behind — forever? That’s the fate that awaits the men and women still in contention for a one-way ticket to the Red Planet.
Blood samples are an invaluable tool, but often they’re just the tip of the diagnostic iceberg, something that determines whether additional, more sensitive tests and scans might be necessary. But new technology may make it possible to use individual cells in a patient’s blood sample to get far more specific and actionable information. A technique being developed by San Diego–based Epic Sciences can determine whether a cancer patient is an appropriate candidate for a drug, and even whether the drug is losing its efficacy.
In research presented last month at the Personalized Medicine World Conference in Palo Alto, CA, Epic described how their technology can be used to reliably pick out rare cells from a blood sample. In the case of cancer, these rare, circulating tumor cells could one day tell an oncologist not only whether a patient’s cancer has returned, but also whether it’s growing resistant to the current treatment regimen—something only expensive scans and invasive biopsies can do with any accuracy today.
This article was originally published on The Conversation.
Most office workers send dozens of electronic communications to colleagues in any given working day, through email, instant messaging and intranet systems. So many in fact that you might not notice subtle changes in the language your fellow employees use.
Instead of ending their email with “See ya!”, they might suddenly offer you “Kind regards.” Instead of talking about “us,” they might refer to themselves more. Would you pick up on it if they did?
These changes are important and could hint at a disgruntled employee about to go rogue. Our findings demonstrate how language may provide an indirect way of identifying employees who are undertaking an insider attack.
My team has tested whether it’s possible to detect insider threats within a company just by looking at how employees communicate with each other. If a person is planning to act maliciously to damage their employer or sneak out commercially sensitive material, the way they interact with their co-workers changes.
The Sochi Olympics are churning out dramatic victories – but athletes aren’t the only ones who fine-tuned their craft to get here. As U.S. bobsledders, skaters and lugers compete during these Games, they’re doing so with cutting-edge technology that’s gone through an equally exhaustive testing process.
These technological upgrades, which look to bolster their respective sports with faster times and improved features, will help athletes stand their best chance yet at scoring the gold this year. Here we take a look at three notable improvements.
With speed skating, the difference between scoring a gold medal and walking home empty-handed is determined by a fraction of a second. To help put U.S. Olympic speed skaters on the winning side of that difference, sporting goods manufacturer Under Armour and defense contractor Lockheed Martin created the Mach 39 speed skating suit to shave off those precious nanoseconds.
Whereas most suits try to be as slick and aerodynamic as possible, Under Armour went the opposite direction by installing “flow-molding” on the backside of the Mach 39 suits. These strategically placed dimples work like the bumps on a golf ball, cutting back drag that accumulates behind high-velocity objects. “We’re trying to disrupt that air flow before it bulks up behind a skater,” Chief of Innovation Kevin Haley said.
Along with reduced air drag, the suits also cut down on friction generated between the athlete’s thighs as they cross over one another for tight track turns. Dubbed “Armour Glide,” these textiles are strategically located on the athlete’s inner thighs, where the most friction – and energy waste – occurs. With the textiles, athletes see a 65% drop in the coefficient of friction between the legs, letting them redirect their strength onto the ice and “put more power into the skates,” Haley said.
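To make that 65% figure concrete: textbook kinetic friction obeys F = μN, so cutting the coefficient μ by 65% cuts the friction force by 65% for the same normal force. The sketch below illustrates the arithmetic only; the coefficient and normal-force values are made-up placeholders, not measurements from Under Armour or Lockheed Martin.

```python
# Illustrative sketch only: kinetic friction F = mu * N.
# All numeric inputs below are hypothetical, not measured values.

def friction_force(mu: float, normal_force: float) -> float:
    """Kinetic friction force in newtons: F = mu * N."""
    return mu * normal_force

mu_plain = 0.30                    # assumed baseline coefficient of friction
mu_glide = mu_plain * (1 - 0.65)   # after the claimed 65% reduction
normal = 10.0                      # assumed thigh-on-thigh normal force (N)

saving = friction_force(mu_plain, normal) - friction_force(mu_glide, normal)
```

With these placeholder numbers, the friction force drops from 3.0 N to about 1.05 N per crossover, and that saved effort is what the skater can redirect into the ice.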
Earlier this month, when a few high-traffic news websites reported a strange object or wedge-shaped craft on Google Moon, I was skeptical. Surprised, too, because when I opened the application, there it was, a distinct V-shape of bright lights inside a tiny crater on the moon’s far side. It did not look natural. I marked its location at 142 degrees and 34 minutes east and 22 degrees 42 minutes north, at the edge of Mare Moscoviense.
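For anyone who wants to check those coordinates against another lunar map, degrees and minutes convert to the decimal degrees most mapping tools expect via deg + min/60. A minimal sketch (the function name is my own):

```python
def dms_to_decimal(degrees: int, minutes: float) -> float:
    """Convert degrees and arcminutes to decimal degrees."""
    return degrees + minutes / 60.0

# The spot at the edge of Mare Moscoviense: 142 deg 34 min E, 22 deg 42 min N
lon = dms_to_decimal(142, 34)   # roughly 142.567 degrees east
lat = dms_to_decimal(22, 42)    # 22.7 degrees north
```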