By Deborah Blum
It’s been more than a decade since scientists first raised an alarm about arsenic levels in rice—an alarm based on the realization that rice plants have a natural ability to absorb the toxic element from the soil.
Since then, study after study has confirmed that rice products contain more arsenic than those of any other grain. In response, consumer health advocates have pushed regulatory agencies to set a safety standard for rice (more on that in my forthcoming feature in the October 2013 issue of Discover).
China, a high rice-consumption country, has already moved to do so. The World Health Organization is currently taking comments on a proposed safety standard. And last year—in a somewhat grudging response to pressure from activist groups in this country—the U.S. Food and Drug Administration announced that it was also studying the issue.
And studying and studying, apparently. Although the FDA released some data on arsenic contamination of rice last fall—in direct response to a comprehensive report on the issue from Consumers Union researchers—the agency has yet to provide any further information or to say when it might set a protective limit.
In frustration, public health researchers at Consumers Union and the attorney general of Illinois, Lisa Madigan, last month wrote to the FDA asking why the agency was moving so slowly to protect American consumers, underlining the point that the agency’s own preliminary results found arsenic in virtually every rice product tested.
By Linda Marsa
The following excerpt from Marsa’s forthcoming book, “Fevered: How a Hotter Planet Will Harm Our Health and How We Can Save Ourselves,” was originally published on PLOS Blogs as part of their series “The Science of Extinction and Survival: Conversations on Climate Change.”
The wild swings in weather that are expected to become commonplace as the planet gets warmer—more frequent and severe droughts, followed by drenching rains—change ecosystems in ways that awaken and speed the transmission of once-dormant diseases.
Intriguingly, this type of weather pattern may be what led to the fall of the once-mighty Aztec Empire in the early 16th century—and not, as is commonly held, the invasion of European colonists, who brought with them diseases like mumps, measles and smallpox, for which the native populations lacked immunity.
By Erik Vance
A decade ago, Joe Slowinski of the California Academy of Sciences went into the jungles of Myanmar in search of new species of snakes and other vertebrates. One morning, he groggily reached into a specimen bag holding a snake he thought was harmless, and when he pulled his hand out, attached to it was a banded krait – among the most venomous snakes on the planet.
He knew he was in trouble, and to make matters worse, the date was September 11, 2001. He held on for a little more than a day, but with all the chaos back in the US, an emergency evacuation never materialized. He eventually succumbed to the krait’s neurotoxin.
Last week, a team led by an Academy doctor reported a new method of delivery for a well-known hospital drug that might have saved Slowinski’s life, had it been available. But more importantly, the travel-friendly nasal spray might someday save some of the estimated 125,000 people across the world who die from snakebites every year.
Iodized salt is so commonplace in the U.S. today that you may never have given the additive a second thought. But new research finds that humble iodine has played a substantial role in cognitive improvements seen across the American population in the 20th century.
Iodine is a critical micronutrient in the human diet—that is, something our bodies can’t synthesize and must obtain from food—and it’s been added to salt (in the form of potassium iodide) since 1924. Originally, iodization was adopted to reduce the incidence of goiter, an enlargement of the thyroid gland. But research since then has found that iodine also plays a crucial role in brain development, especially during gestation.
Iodine deficiency today is the leading cause of preventable mental retardation in the world. It’s estimated that nearly one-third of the world’s population has a diet with too little iodine in it, and the problem isn’t limited to developing countries—perhaps one-fifth of those cases are in Europe (pdf), where iodized salt is still not the norm.
By Carrie Arnold
Despite its name, the Paleo Diet is a new food trend, one that has become increasingly popular in recent years. The diet’s basic tenet is that our bodies haven’t yet evolved to cope with the changes to our food intake brought about by agriculture. Paleo Diet aficionados hold that grains like wheat are making us fat and unhealthy, and that we would be far better off if we ate as our ancient ancestors did, focusing on lean meats, fruits and vegetables.
What researchers haven’t been able to answer, however, is exactly what our ancestors ate. Early humans and our other hominin predecessors lived pretty much everywhere, in environments as diverse as the Arctic, tropical rainforests and deserts, so it’s likely that diet varied by region. Even within a given region, reconstructions of diet have had to rely on tooth analysis or bones found nearby.
A quartet of papers published today in Proceedings of the National Academy of Sciences has instead turned to stable isotope analysis, which reads the chemical signatures preserved in fossilized teeth, to determine the diets of a variety of ancient hominin species. The findings show that human ancestors started moving away from the traditional ape diet of fruit and leaves about 2.5 million years ago—much earlier than previously thought. Thus, even our “paleo” ancestors may never have eaten a paleo diet.
By Eliza Strickland
What can you learn from getting your genome sequenced? If you’re a relatively healthy person like me, the answer is, not much… at least not yet.
I embarked on a mission to get myself sequenced for my recent article “The Gene Machine and Me.” The article focused on the sequencing technology that will soon enable a full scan of a human genome for $1000, and to make the story come alive, I decided to go through the process myself. I got my DNA run through the hottest new sequencing machine, the Ion Proton, and had it analyzed by some of the top experts on genome sequencing, a team at Houston’s Baylor College of Medicine.
The Baylor team has been intimately involved in many of the most important advances of genome sequencing over the last decade. And their accomplishments reveal both the astoundingly rapid progress of the technology, and how far we have yet to go. Here’s a synopsis: the story of five genomes.
By Carrie Arnold
Those with a sensitive sniffer are treated to the pleasure of subtle differences between an ’84 Bordeaux and an ’87 Cabernet, or the ability to tell whether the diner down the street is having a special on onion rings or fries. Even the non-foodies among us can tell with a single sniff whether a carton of milk has expired. But new research hints that the functions of taste and smell receptors go far beyond our gourmand aspirations. Scientists have found that the proteins we use to detect certain tastes and scents are actually an important part of our immune system.
Of all the classes of taste (sweet, sour, salty, bitter and umami), humans are best at detecting bitter, and for good reason. Many of the toxins found in food are bitter, and being able to sense these in minute quantities was a great evolutionary boon to staying alive and healthy. Not surprisingly, bitter taste receptors are found in large quantities on the tongue. But a 2009 study in Science also found these receptors deep in the lungs. Otorhinolaryngologist and sinus surgeon Noam Cohen at the University of Pennsylvania went spelunking through the nose—his area of expertise—to see whether it might contain the same receptors, and found that it did.
By Pete Etchells
I was a gamer kid. Heck, I still am a gamer kid. And like any form of media, old or new, video games have had their fair share of negative airtime. Much like how comic books were vilified in the 1950s for corroding young and impressionable minds (although the research behind those claims is now in dispute), video games are similarly being scrutinized for their effects on development and behavior.
But a relatively new branch of science is focusing on the therapeutic aspects of video games. A new generation of researchers who grew up with video games is starting to use its unique mix of skills to explore how gaming might improve people’s lives. And there are three promising areas where games appear to have a unique leg up on traditional therapies.
Scott Firestone works as a researcher in evidence-based surgery, and recently started blogging about public health and environmental issues at His Science Is Too Tight, where this post originally appeared. You can find him on Twitter at @scottfirestone.
Kevin Drum from Mother Jones has a fascinating new article detailing the hypothesis that exposure to lead, particularly tetraethyl lead (TEL), explains the rise and fall of violent crime rates from the 1960s through the 1990s—by which point the compound had been phased out of gasoline worldwide. It’s a good bit of public health journalism compared to much of what you see, but I’d like to provide a little epidemiology background to the article. There are so many studies listed that it makes a really good intro to the types of study designs you’ll see in public health. It also illustrates the concept of confirmation bias, and why regulatory agencies seem to drag their feet even in the face of stories as compelling as this one.
Drum correctly notes that the correlation is insufficient to draw any conclusions regarding causality. The research (pdf) published by economist Rick Nevin was simply looking at associations, and found that the curves were strongly correlated. When you look at data involving large populations, such as violent crime rates, and compare them with an indirect measure of exposure to some environmental risk factor, such as levels of TEL in gasoline over that same period, the best you can say is that your alternative hypothesis of an association (the null hypothesis always being no association) deserves more investigation. This type of design is called a cross-sectional study, and it’s well documented that population-level associations do not always hold for the individuals within that population.
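To make that point concrete, here is a minimal sketch using made-up numbers (not Nevin’s actual data): two time series that track each other closely will show a Pearson correlation near 1, yet the statistic by itself says nothing about which, if either, causes the other—a confounder could be driving both.

```python
# Illustrative only: hypothetical exposure and crime figures, not real data.
# A strong Pearson correlation is evidence of association, not causation.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical lagged exposure index (think: TEL in gasoline, ~20 years prior)
exposure = [10, 30, 60, 90, 70, 40, 15]
# Hypothetical violent-crime rate over the same years
crime = [12, 33, 58, 95, 68, 42, 18]

r = pearson_r(exposure, crime)
print(round(r, 3))  # strong association -- but not, by itself, proof of cause
```

The curves here were constructed to move together, so the coefficient comes out close to 1 by design; that is exactly why an ecological correlation like Nevin’s can only motivate further study, not settle the causal question.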
Keith Kloor is a freelance journalist whose stories have appeared in a range of publications, from Science to Smithsonian. Since 2004, he’s been an adjunct professor of journalism at New York University. You can find him on Twitter @KeithKloor.
Last month, a group of Massachusetts residents filed an official complaint claiming that the wind turbine in their town is making them sick. According to the article in the Patriot Ledger, the residents “said they’ve lost sleep and suffered headaches, dizziness and nausea as a result of the turbine’s noise and shadow flicker [flashing caused by shadows from moving turbine blades].” A few weeks later, a story from Wisconsin highlighted similar complaints of health problems associated with wind turbines there.
Anecdotal claims like these are on the rise and not just in the United States. A recent story in the UK’s Daily Mail catalogs a litany of health ailments supposedly caused by wind turbines—everything from memory loss and dizziness to tinnitus and depression.
I expect such claims to keep spreading. For one thing, the alleged health problem has been adopted by demagogues and parroted on popular climate-skeptic websites. But the bigger problem is that “wind turbine syndrome” is what is known as a “communicated” disease, says Simon Chapman, a professor of public health at the University of Sydney. The disease, which has reached epidemic proportions in Australia, “spreads via the nocebo effect by being talked about, and is thereby a strong candidate for being defined as a psychogenic condition,” Chapman wrote several months ago in The Conversation.
What Chapman is describing is a phenomenon akin to mass hysteria—an outbreak of apparent health problems that has a psychological rather than physical basis. Such episodes have occurred throughout human history; earlier this year, a cluster of teenagers at an upstate New York high school were suddenly afflicted with Tourette syndrome-like symptoms. Some speculated that the mystery outbreak was caused by environmental contaminants.
But a doctor treating many of the students instead diagnosed them with a psychological condition called “conversion disorder,” as described by psychologist Vaughan Bell on The Crux: