For years, medical researchers have been talking about the day when babies will have their whole genomes sequenced at birth, the day when genomic analysis will allow every patient to be treated not just based on her condition but on which treatment is the best match for her genetic quirks. There will be a day, they say, when we will all carry our genomes around on a thumb drive. But the hurdles, fiscal and otherwise, have proven difficult to overcome.
The DNA of one set of human chromosomes contains 3 billion base pairs—most cells are diploid and have two sets of chromosomes, one from each parent. Sequencing those 6 billion base pairs is unquestionably faster and cheaper than it once was: Since its less-than-humble beginnings almost 15 years ago, the cost of sequencing a human genome has dropped from $100 million to around $1,000. Instead of years, the job can now be completed in a day or two.
Yet while that’s incredible progress, it’s not quite enough. Not only is it still too pricey for everyday use, but once that genome has been sequenced it also has to be mapped and analyzed—the process in which the sequenced base pairs are assigned to the correct chromosome and assessed for mutations, something that can take a couple of days or more. What to do with the resulting data is another problem: The genome and its resulting analysis typically occupy about 400GB. (For reference, the 2013 laptop I’m using to write this post has a storage capacity of 250GB—my genome wouldn’t come close to fitting on it.) Securely storing data from 500 or 5000 patients—at about $1 per gigabyte—typically costs hundreds of thousands of dollars per year.
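To see where those storage bills come from, here is a back-of-the-envelope sketch in Python. The 400 GB per genome and $1 per gigabyte-year figures are the rough estimates quoted above, not exact prices:

```python
# Back-of-the-envelope estimate of the yearly cost of storing
# sequenced genomes, using the rough figures quoted above.
GB_PER_GENOME = 400      # sequence plus analysis files, approximate
COST_PER_GB_YEAR = 1.00  # dollars per gigabyte per year, approximate

def annual_storage_cost(num_patients):
    """Estimated yearly storage bill, in dollars."""
    return num_patients * GB_PER_GENOME * COST_PER_GB_YEAR

for n in (500, 5000):
    print(f"{n} patients: ${annual_storage_cost(n):,.0f} per year")
```

At these rates, 500 patients already run to a couple of hundred thousand dollars a year, and the bill scales linearly with the number of genomes stored.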
Joan Bennett didn’t believe in sick building syndrome. As a specialist in mold toxins, she had even testified in trials in support of insurance companies denying claims to homeowners who claimed that they had been sickened by toxins from their moldy houses.
Then Hurricane Katrina struck, Bennett’s home was flooded, and she evacuated. “A month later, as a form of psychological sublimation, I decided to travel back and sample my home for mold,” she said. Her house smelled horrendous, worse than any mold she’d ever smelled. She donned a mask, gloves, and protective gear, but even so, she felt awful – dizziness, headache, malaise. She walked outside and felt better. Then it struck her: “I think there’s something in this terrible mold I’m smelling.”
But she still believed in her old arguments against the theory. She knew how much mold toxin we are ordinarily exposed to from mold in food, and she knew that amount was far greater than any we could breathe in from spores in the air.
But the smell of mold was another matter. Most things we can smell are volatile organic compounds (VOCs), and some VOCs are known to make people sick. “I knew that a minor theory was that sick building syndrome might be caused by the VOCs that make fungi smell moldy,” Bennett says. And then she thought, “Ta da! Maybe there is such a thing as sick building syndrome, and maybe it has nothing to do with the fungus toxins I’ve been studying all my life!”
That moment transformed her research career. Along with her house, she’d lost her entire frozen genetic stock of fungi in the storm, because the power had gone out and everything had defrosted. She had to mostly start over anyway, and now she wanted to prove her new theory.
Scientists have called the contraceptive pill one of the most important inventions of the twentieth century. Now, more than fifty years after the Pill was first released, contraception remains a woman’s world.
Sure, men can use condoms or have a vasectomy, but women have a much more dizzying array of options from which to choose. From pills to contraceptive vaginal rings to intrauterine devices and more, most scientists and pharmaceutical companies have focused their contraception efforts on women.
This isn’t necessarily a bad thing. Many reproductive health scientists say that we need more, not fewer, options for contraception. The problem is that virtually all contraception is being geared toward women. That’s largely because, historically, contraception was grouped in with the traditional female concerns of family and childbearing.
“There are a fair number of women who are dissatisfied with their current method of contraception,” said Michael O’Rand, a biologist and male contraception expert at the University of North Carolina at Chapel Hill.
By Jo Adetunji, The Conversation
This article was originally published at The Conversation, an online publication covering the latest research.
James Bond might have been more shaken than stirred, if his intake of alcoholic drinks is anything to go by.
Along with his love of women, Bond also had a keen taste for martinis. And researchers have scoured the books to calculate that the MI6 spy drank over four times the recommended limit each week.
They argue that contrary to helping performance under pressure, this amount would probably have affected his capacity to perform “in all aspects of his life”. As a high-risk category three drinker, in the longer term this would put him at risk of developing alcoholic liver disease, cirrhosis, impotence and alcohol-induced tremor – not great for womanizing or sniper missions – and potentially an early death.
The researchers, who published their paper in the British Medical Journal, also discovered that Bond was often drunk at the wheel.
In many areas of life, tall people seem to get all the benefits. On average, they earn more money. They are more successful at work. Taller people are just more, er, highly regarded than their shorter counterparts.
But research is showing that short people might win out in one big way: they might be less prone to cancer, and even have longer lives, than tall people. Although the jury is still out on how much height affects longevity, that uncertainty shows no sign of dampening our cultural preference for taller people.
The relationship between height and cancer risk is not especially new—scientists had proposed a link between height and breast cancer in women as early as 1975. Many studies, however, have focused specifically on breast cancer. Other studies have looked at how height affects cancer risk at numerous sites, but they have failed to adequately control for variables that could be affected by height, notes epidemiologist Geoffrey Kabat at the Albert Einstein College of Medicine in the Bronx.
Measuring the link between height and health variables, Kabat says, is much more complicated than determining someone’s height and seeing if they develop a particular disease. “You really want to make very sure that you have excluded the possibility that any association you find between height and cancer is due to the interference of some sort of other factor,” Kabat said.
For one, taller people tend to weigh more than shorter people, even if their BMI isn’t any higher. For another, poor nutrition and stress can stunt growth, and higher-calorie diets have been associated with increased height. And that doesn’t even begin to take into account psychosocial variables like income, education, and socioeconomic status.
By Jesse Bering
The new Showtime series Masters of Sex is shining light on two remarkable figures in the history of sexology, William Masters and Virginia Johnson. Although most of us may not be aware of their colorful backstory, we have, at least, heard of “Masters and Johnson” before. Along with the famous Alfred Kinsey, they were iconic American figures in 20th-century sex research, widely known for flouting the conservative conventions that kept our forebears in the closet of erotic ignorance.
The history of sexology runs far deeper than a few charismatic figures, however. Their names may not be as familiar to us, but there were many other fascinating early sex researchers who left their own interesting legacies, and not always entirely positive ones at that. Some of these forgotten scholars were, like Masters and Johnson, angels of sexual healing; yet others were, quite frankly, bastards of bigotry.
So without further ado, allow me to introduce you to five early sexologists that you’ve (probably) never heard of… at least, not like this.
By Deborah Blum
It’s been more than a decade since scientists first raised an alarm about arsenic levels in rice—an alarm based on the realization that rice plants have a natural ability to absorb the toxic element out of the soil.
Since then study after study has confirmed that rice products contain more arsenic than those of any other grain. In response, consumer health advocates have pushed for regulatory agencies to set a safety standard for rice (more on that in my forthcoming feature in the October 2013 issue of Discover).
China, a high rice-consumption country, has already moved to do so. The World Health Organization is currently taking comments on a proposed safety standard. And last year—in a somewhat grudging response to pressure from activist groups in this country—the U.S. Food and Drug Administration announced that it was also studying the issue.
And studying and studying, apparently. Although the FDA released some data on arsenic contamination of rice last fall—in direct response to a comprehensive report on the issue from Consumers Union researchers—the agency has yet to provide any further information or to set a deadline on when it might set a protective limit.
In frustration, public health researchers at Consumers Union and the attorney general of Illinois, Lisa Madigan, last month wrote to the FDA asking why the agency was moving so slowly to protect American consumers, underlining the point that the agency’s preliminary results found the taint of arsenic in pretty much every rice product tested.
By Linda Marsa
The following excerpt from Marsa’s forthcoming book, “Fevered: How a Hotter Planet Will Harm Our Health and How We Can Save Ourselves,” was originally published on PLOS Blogs as part of their series “The Science of Extinction and Survival: Conversations on Climate Change.”
The wild swings in weather that are expected to become commonplace as the planet gets warmer—more frequent and severe droughts, followed by drenching rains—change ecosystems in ways that awaken and expedite the transmission of once-dormant diseases.
Intriguingly, this type of weather pattern may be what led to the fall of the once-mighty Aztec Empire in the early 16th century—not, as is commonly held, the invasion of European colonists, who brought with them diseases like mumps, measles, and smallpox, to which the native populations lacked immunity.
By Erik Vance
A decade ago, Joe Slowinski of the California Academy of Sciences went into the jungles of Myanmar in search of new species of snakes and other vertebrates. One morning, he groggily reached into a bag of snakes he thought to be harmless, and when he pulled his hand out, attached to it was a banded krait – among the most venomous snakes on the planet.
He knew he was in trouble, and to make matters worse, the date was September 11, 2001. He held on for a little more than a day, but with all the chaos back in the US, an emergency evacuation never materialized. He eventually succumbed to the krait’s neurotoxin.
Last week, a team led by an Academy doctor reported a new method of delivery for a well-known hospital drug that might have saved Slowinski’s life, had it been available. But more importantly, the travel-friendly nasal spray might someday save some of the estimated 125,000 people across the world who die from snakebites every year.
Iodized salt is so commonplace in the U.S. today that you may never have given the additive a second thought. But new research finds that humble iodine has played a substantial role in cognitive improvements seen across the American population in the 20th century.
Iodine is a critical micronutrient in the human diet—that is, something our bodies can’t synthesize and must obtain from food—and it’s been added to salt (in the form of potassium iodide) since 1924. Originally, iodization was adopted to reduce the incidence of goiter, an enlargement of the thyroid gland. But research since then has found that iodine also plays a crucial role in brain development, especially during gestation.

Iodine deficiency today is the leading cause of preventable mental retardation in the world. It’s estimated that nearly one-third of the world’s population has a diet with too little iodine in it, and the problem isn’t limited to developing countries—perhaps one-fifth of those cases are in Europe, where iodized salt is still not the norm.