Exposure to high levels of ionizing radiation is extremely bad for human health. Witness the effects of acute radiation sickness suffered by early scientists studying radioactive elements, or by survivors of atomic bomb blasts. Witness the complex procedures through which doctors must shield cancer patients from radiation therapy, and the long-term complications of adult survivors of cancer who were treated with earlier technology. In light of all this, it’s clear that high doses of ionizing radiation are dangerous.
But the science is less clear when it comes to low dose radiation (LDR). Medical science, the nuclear industry, and government regulatory agencies generally take a play-it-safe approach when considering LDR. In recent years, however, an increasing number of researchers (though still firmly in the minority) have questioned the assumption that all radiation is bad – and have begun studying whether low doses might in fact aid genetic repair, prevent tissue damage, and confer other benefits.
One day in October 2010, at a school in the Gaibandha district of northwest Bangladesh, a pupil noticed that the label on a packet of crackers she was eating had darkened. Fearing the crackers were contaminated – “the devil’s deed”, as she put it – she almost immediately fell ill, complaining of heartburn, headache and severe abdominal pain.
The condition quickly spread among her fellow pupils, and later to other schools in the area. Yet toxicologists could trace no contaminant, and all those affected were quickly discharged from the hospital after doctors found no trace of illness. The following week, investigators diagnosed “mass sociogenic illness,” otherwise known as mass hysteria. The children, it seemed, had developed their symptoms simply because they had seen their classmates succumb.
Mass hysteria is thought to be an extreme example of a phenomenon that affects us all day-to-day: emotional contagion. Short of living in hermitic isolation, it is hard to escape it; we are vulnerable to the moods and behaviors of others to an extraordinary degree.
Emotional contagion caused the failure of successive banks at the start of the Great Depression in the 1930s, when investors suffered a collective loss of faith in the ability of these institutions to pay out. It is the force behind fuel crises, health scares and the spread of public grief (for example in Britain after the death of Princess Diana in August 1997). It is the reason why you are more likely to be obese if you have obese friends, and depressed if you are living with a depressed roommate.
But emotional contagion is not all bad – far from it. The mechanism behind it – our tendency to mimic each other’s expressions and behaviors – is crucial to social interaction. Without it, anything beyond superficial communication would be impossible.
Human genetic engineering is not new; it has been going on for a long, long time — naturally. Ancient viruses are really good at inserting themselves into and modifying the human genome. Over millennia, constant infections added up: about 8 percent of the entire human genome is made up of inserted virus code. All this recoding of our bodies occurred under Darwin’s rules, natural selection and random mutation. But nonrandom, deliberate human genetic engineering is new, and it is a big deal.
Genetically modified humans have walked among us since 1990, and their numbers are growing. More and more gene therapies carry new instructions into our bodies and place them in the right spots; in so doing, they modify our most fundamental selves, our core, heretofore slow-evolving DNA. We are still in the very early stages of effectively hijacking viruses for human-driven purposes; just a few years ago it took a long time to identify and isolate a single faulty gene and figure out what was wrong, never mind finding a way to replace it with a properly functioning alternative. Early gene therapy focused on obscure, deadly orphan diseases like ADA-SCID (the immune disease that “Bubble Boy” had), adrenoleukodystrophy (say that five times fast), Wiskott-Aldrich syndrome, various leukemias, and hemophilia.
In theory the technique is relatively simple: Take a neutered virus, one that is engineered to not harm you but that readily infects human cells to ferry in new DNA instructions, write a new set of genetic instructions into the virus, and let it loose to infect a patient’s cells. And ta‑da! You have a genetically modified human. (Think of this as deliberately sneezing on someone but instead of giving them a cold, you give them a benign infection that enters their body, recodes their cells, and fixes a faulty gene.)
You might have heard that men are wimps when it comes to pain. It can make for lighthearted argument, but in fact it’s not true. Women have a lower pain threshold. Take a man and a woman, put a piece of ice on the backs of their hands, and wait. The woman will almost certainly complain about the pain first.
Not all pain is equal, but women are definitely worse off. In some quite macabre experiments, researchers have shown that women are much more sensitive to electric shocks, muscle pain, hot and cold, and chemical pain, such as the discomfort of eating a vindaloo curry.
If this comes as a surprise to you, you’re not alone. According to surveys, two-thirds of women still think that men feel more pain than they do. (Men are far less convinced of that; only one third think they are worse off when it comes to pain.)
And this isn’t some half-witted attempt to make out that men are the stronger sex. It’s a serious call for the medical system to improve the way it treats women’s pain.
Last fall as the Ebola epidemic continued unabated, experts started discussing something that had never before been bandied about: the idea of Ebola becoming endemic in parts of West Africa. Endemic diseases, like malaria and Lassa fever in that region of Africa, are constant presences. Instead of surfacing periodically, as it always has before now, Ebola in an endemic form would persist in the human population, at low levels of transmission, indefinitely.
The debate was stoked by a paper written by the World Health Organization (WHO) Ebola Response Team and published in October in the New England Journal of Medicine. The sentence that grabbed the world’s attention was saved till near the very end: “For the medium term, at least, we must therefore face the possibility that EVD [Ebola virus disease] will become endemic among the human population of West Africa, a prospect that has never previously been contemplated.”
What would it mean exactly for Ebola to become endemic, and how would it change things?
This article was originally published on The Conversation.
Food labels seem to provide all the information a thoughtful consumer needs, so counting calories should be simple. But things get tricky because food labels tell only half the story.
A calorie is a measure of usable energy. Food labels say how many calories a food contains. But what they don’t say is that how many calories you actually get out of your food depends on how highly processed it is.
Food processing includes cooking, blending and mashing, or using refined instead of unrefined flour. It can be done by the food industry before you buy, or in your home when you prepare a meal. Its effects can be big. If you eat your food raw, you will tend to lose weight. If you eat the same food cooked, you will tend to gain weight. Same calories, different outcome.
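The point above can be reduced to simple arithmetic: the energy you actually absorb is the label calories scaled by how digestible the preparation makes the food. The sketch below is a toy illustration only; the absorption fractions are invented for the example, not measured values.

```python
# Toy model: usable energy depends on preparation, not just the label.
# The absorption fractions here are hypothetical, for illustration only.
ABSORPTION = {
    "raw": 0.70,     # invented fraction of label calories absorbed
    "cooked": 0.95,  # invented fraction of label calories absorbed
}

def calories_absorbed(label_calories: float, preparation: str) -> float:
    """Estimate usable energy under the toy absorption model."""
    return label_calories * ABSORPTION[preparation]

# The same 200-calorie portion yields different usable energy
# depending on whether it is eaten raw or cooked.
raw_energy = calories_absorbed(200, "raw")
cooked_energy = calories_absorbed(200, "cooked")
```

Under any such model, cooking the same portion delivers more usable energy, which is the “same calories, different outcome” effect described above.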
For our ancestors, it could have meant the difference between life and death. Hundreds of thousands of years ago, when early humans learned to cook they were able to access more energy in whatever they ate. The extra energy allowed them to develop big brains, have babies faster and travel more efficiently. Without cooking, we would not be human.
While developing drugs to cure Ebola is crucial to end the current epidemic, a vaccine that prevents the infection altogether is the end-game for viral outbreaks – a way to protect healthcare workers on the front lines and to prevent future outbreaks.
It typically takes 10 to 20 years to develop and test a vaccine and get it to market. But in Ebola’s case, that time frame has been compressed into a matter of months, bringing pharmaceutical companies, scientists and regulators into uncharted territory as they strive for a vaccine to curb the still-escalating epidemic without compromising safety.
“Never before has there been a push to develop a vaccine for an emerging public health threat in this short a time frame,” said Dr. Mark Feinberg, vice president and chief public health and science officer of the drug company Merck’s vaccine division.
Dr. Ripley Ballou, head of Ebola vaccine research for GlaxoSmithKline, concurs. “I’ve been doing this kind of work for 30 years, and this is the first time I’ve encountered anything with the compressed timeline and sense of urgency,” he said.
In the mid-1800s, English chemist William Henry Perkin serendipitously synthesized the first non-natural dye: starting with coal tar, he was hoping to produce the malaria drug quinine but instead created mauve. His discovery revolutionized the textile industry and launched the petrochemical industry. No natural dye could match the staying power and vivid color of the one Perkin created.
Soon after, August Hofmann (Perkin’s chemistry professor) noticed that a dye he had derived from coal tar developed color when exposed to air. The molecule responsible was para-phenylenediamine, or PPD, the foundation of most permanent hair dyes today.
Although hair is a protein fiber, like wool, the dyeing process for textiles cannot be duplicated on the head. To get wool to take a dye, you must boil the wool in an acidic solution for an hour. The equivalent for hair is to bathe it in the chemical ammonia. Ammonia separates the protective protein layers, allowing dye compounds to penetrate the hair shaft and access the underlying pigment, melanin.
A defining feature of this Ebola epidemic has been the significant resistance of some of the affected communities to treatment and prevention measures by foreign aid workers and their own governments. Many local people, suspicious and fearful, have refused to go to treatment centers or turn over bodies for safe burial, and whole communities have prohibited the entry of doctors and health teams.
As the months have gone by, that resistance has been reported on less, and there are signs that it may be easing. In the Forest Region of Guinea, where the Ebola epidemic started, foreign staff previously faced roadblocks, stone-throwing and violent attacks. But in recent weeks, as the New York Times has reported, locals have taken down the literal and figurative barricades around their villages and sought outside help.
Still, the friction continues to shape the spread of the disease. Doctors Without Borders’ December briefing paper [pdf] calls the situation in Guinea “alarming,” with 25 percent more cases reported in November than October and many areas where there is “still a great deal of resistance towards Ebola response” and their teams are “not welcome.”
The solution, some say, is to reevaluate treatment and prevention tactics with the benefit of an anthropological perspective. That was the call delivered last week at a meeting of the American Anthropological Association in Washington, D.C. If international staff had approached the epidemic from day one with more understanding of cultural, historical and political context, attendees said, local traditions and community leaders could have become assets rather than obstacles in the fight against Ebola.
The American Anthropological Association is calling for anthropologists to become more involved in the global Ebola response. It has started the Ebola Emergency Response Initiative to connect anthropologists who already work in, or have experience with, West Africa, and to build structures and programs that help more of them become directly involved in the Ebola response on the ground.
“We’ve worked in these places and we’re watching our friends die,” said University of Florida professor Sharon Abramowitz, one of the founders of the initiative.
Abramowitz points out that the anthropologists involved in the initiative have a total of 300 years of ethnographic experience in the affected West African nations – experience which could help medical scientists both understand and respond to the epidemic.
The Ebola virus has consistently stayed several steps ahead of doctors, public officials and others trying to fight the epidemic. Throughout the first half of 2014, it spread quickly as international and even local leaders failed to recognize the severity of the situation. In recent weeks, with international response in high gear, the virus has thrown more curve balls.
The spread has significantly slowed in Liberia and beds for Ebola patients are empty even as the U.S. is building multiple treatment centers there. Meanwhile the epidemic has escalated greatly in Sierra Leone, which has a serious dearth of treatment centers. And in Mali, where an incursion was successfully contained in October, a rash of new cases has spread from an infected imam.
Predicting the trajectory of Ebola, rather than playing catch-up, could do much to help prevent and contain the disease. Some experts have called for prioritizing mobile treatment units that can be quickly relocated to where they are needed most. Figuring out where Ebola is likely to strike next, or spotting emerging hot spots early, would be key to placing these units well.
But such modeling requires data, and lots of it. And for stressed healthcare workers on the ground and government and non-profit agencies scrambling to combat a raging epidemic, collecting and disseminating data is often not a high priority.
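The kind of modeling discussed above can start from something as basic as a compartmental epidemic model. The sketch below is a generic SIR (susceptible–infected–removed) simulation with invented parameters; it is not a fitted Ebola model and not the method used by any of the experts mentioned, just an illustration of why case data feed directly into trajectory predictions.

```python
# Minimal SIR epidemic sketch using forward-Euler steps.
# beta and gamma below are hypothetical rates, not fitted Ebola values.

def simulate_sir(population, initial_infected, beta, gamma, days, dt=1.0):
    """Simulate a basic SIR model.

    beta  -- transmission rate per day (hypothetical)
    gamma -- removal (recovery/death) rate per day (hypothetical)
    Returns a list of (susceptible, infected, removed) tuples per step.
    """
    s = float(population - initial_infected)
    i = float(initial_infected)
    r = 0.0
    history = [(s, i, r)]
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / population * dt
        new_removals = gamma * i * dt
        s -= new_infections
        i += new_infections - new_removals
        r += new_removals
        history.append((s, i, r))
    return history

# With beta/gamma = 3 (a hypothetical reproduction number above 1),
# the infected count grows before burning out.
history = simulate_sir(population=1_000_000, initial_infected=10,
                       beta=0.3, gamma=0.1, days=120)
peak_infected = max(i for _, i, _ in history)
```

Even a toy like this needs timely counts of cases and locations to be calibrated, which is exactly the data-collection burden the paragraph above describes.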