The Nicoya peninsula in northwestern Costa Rica is one of the most beautiful places on the planet. This 75-mile sliver of land, just south of the Nicaraguan border, is covered with cattle pastures and tropical rain forests that stretch down to the crashing waves of the Pacific Ocean. The coastline is dotted with enclaves of expats who fill their time surfing, learning yoga and meditating on the beach.
For the locals, life is not so idyllic. They live in small, rural villages with limited access to basics such as electricity, linked by rough tracks that are dusty in the dry season and often impassable when it rains. The men earn a living by fishing and farming, or work as laborers or sabaneros (cowboys on huge cattle ranches), while the women cook on wood-burning stoves. Yet Nicoyans have a surprising claim to fame that is attracting the attention of scientists from around the world.
Their secret was uncovered in 2005 by Luis Rosero-Bixby, a demographer at the University of Costa Rica in San José. He used electoral records to work out how long Costa Ricans were living, and found that their life expectancy is surprisingly high. In general, people live longest in the world’s richest countries, where they have the most comfortable lives, the best health care and the lowest risk of infection. But that wasn’t the case here.
It might not just be expectant mothers who have to pay attention to their lifestyle. Now a new study published in Science could be relevant to a growing body of research looking at ways in which the lifestyle and environment of men before they become fathers could influence the lives of their children and grandchildren.
We know that many human traits, such as weight, height, susceptibility to disease, longevity or intelligence, can be partly inherited, but researchers have so far struggled to identify the precise genetic basis for this. This may partly be due to limitations in our understanding of how genetics works, but now there is growing interest in the potential for something called “epigenetics” to explain this heritability.
Human genetic engineering is not new; it has been going on, naturally, for a long, long time. Ancient viruses are really good at inserting themselves into and modifying the human genetic code. Over millennia, constant infections left their mark: about 8 percent of the entire human genome is made up of inserted viral code. All this recoding of our bodies occurred under Darwin’s rules of natural selection and random mutation. But nonrandom, deliberate human genetic engineering is new, and it is a big deal.
Since 1990, genetically modified humans have increasingly walked among us. More and more gene therapies carry new instructions into our bodies and place them in the right spots; in so doing, they modify our most fundamental selves, our core, heretofore slow-evolving DNA. We are still in the very early stages of effectively hijacking viruses for human-driven purposes; just a few years ago it took a long time to identify and isolate a single faulty gene and figure out what was wrong, never mind finding a way to replace it with a properly functioning alternative. Early gene therapy focused on obscure, deadly orphan diseases like ADA-SCID (the immune disease the “Bubble Boy” had), adrenoleukodystrophy (say that five times fast), Wiskott-Aldrich syndrome, various leukemias, and hemophilia.
In theory the technique is relatively simple: Take a neutered virus, one that is engineered to not harm you but that readily infects human cells to ferry in new DNA instructions, write a new set of genetic instructions into the virus, and let it loose to infect a patient’s cells. And ta‑da! You have a genetically modified human. (Think of this as deliberately sneezing on someone but instead of giving them a cold, you give them a benign infection that enters their body, recodes their cells, and fixes a faulty gene.)
Mixed breed. Mongrel. Roadside setter. A something-something. Dogs of uncertain provenance get called a lot of things. When the animal arrives at a shelter, staff usually can make only an educated guess about the dog’s parentage.
Most of the dogs at my local animal control are assessed as “pit mixes” upon arrival — including the three I’ve adopted over the past two years. But a pit bull isn’t a breed: it’s just a type of dog characterized by a short coat, muscular frame and broad, oversized head.
All three of my dogs clearly — at least to my eyes — showed signs of specific breeds somewhere in their heritage: Tall and snow white Pullo looks like the breed standard for an American Bulldog. Tyche’s body is svelte like a boxer’s and inky black like some Labs. And lanky, long-limbed Waldo sometimes bays like a hound, especially when treeing squirrels.
Guessing my dogs’ breeds was a fun parlor game, but I wanted more definitive answers. So I turned to science. And, well, let’s just say it’s a good thing I didn’t place any bets on what was in my dogs’ family trees.
A version of this article originally appeared at The Conversation.
There could be a way of predicting – and preventing – which children will go on to have low intelligence, according to the findings of a study researchers at Cardiff University presented on Monday. They discovered that children with two copies of a common gene variant (Thr92Ala), together with low levels of thyroid hormone, are four times more likely to have a low IQ. This combination occurs in about 4% of the UK population.
Importantly, if you had just one of these factors, but not both, there did not appear to be an increased risk of low intelligence. These are early results, but suggest that it might be possible to treat children early with thyroid hormone supplementation to enhance their intelligence. This raises many ethical issues.
A common objection is that being smarter does not make your life better. In this study, researchers were concerned with those with an IQ between 70 and 85. An IQ below 70 is classified as intellectual disability, but an IQ of 70 to 75 is functionally similar to mild intellectual disability.
Even for individuals with an IQ between 75 and 90 there are still significant disadvantages. Job opportunities tend to be the least desirable and least financially rewarding, requiring significant oversight. More than half the people with this IQ level fail to reach the minimum recruitment standards for the US military. Individuals with this lower level of intelligence are at significant risk of living in poverty (16%), being a chronic welfare dependent (17%) and dropping out of school (35%) compared to individuals with average intelligence. Studies show that they also face an increased risk of incarceration and being murdered.
Linda Gottfredson, who’s undertaken much of this research, concludes that at the very least, “an IQ of 75 is perhaps the most important threshold in modern life”. So it is clear that those of low-normal intelligence, although not classified as disabled, are significantly disadvantaged.
If we could enhance their intelligence, say with thyroid hormone supplementation, we should.
By Eliza Strickland
What can you learn from getting your genome sequenced? If you’re a relatively healthy person like me, the answer is, not much… at least not yet.
I embarked on a mission to get myself sequenced for my recent article “The Gene Machine and Me.” The article focused on the sequencing technology that will soon enable a full scan of a human genome for $1000, and to make the story come alive, I decided to go through the process myself. I got my DNA run through the hottest new sequencing machine, the Ion Proton, and had it analyzed by some of the top experts on genome sequencing, a team at Houston’s Baylor College of Medicine.
The Baylor team has been intimately involved in many of the most important advances of genome sequencing over the last decade. And their accomplishments reveal both the astoundingly rapid progress of the technology, and how far we have yet to go. Here’s a synopsis: the story of five genomes.
Carrie Arnold is a freelance science writer in Virginia. She blogs about the science of eating disorders at www.edbites.com, and frequently covers microbiology topics for national magazines.
Conservationists like to think large. Whether it’s identifying hundreds of square miles of Himalayan highlands as a tiger corridor or creating massive marine preserves, these scientists are definitely thinking on the macro scale.
However, a small but growing group of scientists is beginning to think smaller when it comes to conservation—much smaller. They have begun to study the microbes living in the soil, and their results are showing just how important microscopic life is in the macroscopic world. A healthy, diverse population of soil microbes results in a healthy, diverse ecosystem. Changing an ecosystem also changes its microbes, scientists have found, and this may permanently scar the environment.
“Soil is not sterile,” said Noah Fierer, a microbiologist at the University of Colorado at Boulder. “These microbes are crucial to maintaining soil fertility.”
A new toxicology study states that rats eating genetically modified food and the weedkiller Roundup develop huge tumors and die. But many scientists beg to differ, and a close look at the study shows why.
Genetically modified organisms (GMOs) have always been a controversial topic. On the one hand are the many benefits: the higher crop yields from pesticide- and insect-resistant crops, and the nutritional modifications that can make such a difference in malnourished populations. On the other side is the question that concerns many people: We are modifying the genes of our food, and what does that mean for our health? These are important questions, but the new study claiming to answer them misses the mark. It has many horrifying pictures of rats with tumors, but without knowledge about the control rats, what do those tumors mean? Possibly, nothing at all.
The recent study, published in Food and Chemical Toxicology, has fueled the worst fears of the GMO debate. The study, by Italian and French groups, evaluated groups of rats fed different concentrations of Roundup-tolerant maize (corn), or Roundup alone, over a two-year period, the longest type of toxicology study. The group looked at mortality rates in the aging rats, as well as causes of death, and took multiple samples to assess kidney, liver, and hormonal function.
The presented results look like a toxicologist’s nightmare. The authors reported high rates of tumor development in the rats fed Roundup and the Roundup-tolerant maize. There are figures of rats with visible tumors, and graphs showing deaths beginning early in the rats’ lifespan. The media, of course, picked up on it, and one site in particular has spawned reports that sound like mass hysteria. It was the first study to claim that genetically modified foods could produce tumors at all, let alone ones as drastic as those shown in the paper.
Sophie Bushwick (Twitter, Tumblr) is a science journalist and podcaster, and is currently an intern at DISCOVERmagazine.com. She has written for Scientific American, io9, and DISCOVER, and has produced podcasts for 60-Second Science and Physics Central.
[Image: Human chromosomes (grey) capped by telomeres (white). Credit: U.S. Department of Energy Human Genome Program]
Renowned biologist Elizabeth Blackburn has said that when she was a young post-doc, “Telomeres just grabbed me and kept leading me on.” And lead her on they did—all the way to the Nobel Prize in Medicine in 2009. Telomeres are DNA sequences that continue to fascinate researchers and the public, partially because people with longer telomeres tend to live longer. So the recent finding that older men father offspring with unusually lengthy telomeres sounds like great news. Men of advanced age will give their children the gift of longer lives—right? But as is so often the case in biology, things aren’t that simple, and having an old father may not be an easy route to a long and healthy life.
Every time a piece of DNA gets copied, it can end up with errors in its sequence, or mutations. One of the most frequent changes is losing scraps of information from each end of the strand. Luckily, these strands are capped with telomeres, repeating sequences that do not code for any proteins and serve only to protect the rest of the DNA. Each time the DNA makes a copy, its telomeres get shorter, until these protective ends wear away to nothing. Without telomeres, the DNA cannot make any more copies, and the cell containing it will die.
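The countdown described above can be sketched as a toy simulation; the starting telomere length and per-division loss below are illustrative placeholders, not measured values:

```python
# Toy model of telomere shortening: every division trims the cap, and once
# the cap is exhausted the cell can no longer copy its DNA.
# The starting length and per-division loss are illustrative, not measured.

def divisions_until_senescence(telomere_bp: int, loss_per_division_bp: int) -> int:
    """Count divisions a cell lineage can undergo before telomeres run out."""
    divisions = 0
    while telomere_bp > 0:
        telomere_bp -= loss_per_division_bp
        divisions += 1
    return divisions

# A longer starting cap buys more rounds of copying.
print(divisions_until_senescence(10_000, 100))  # 100 divisions
print(divisions_until_senescence(5_000, 100))   # 50 divisions
```

The point of the model is simply that the limit on copying scales with the length of the cap, which is why longer telomeres are associated with longer cellular lifespans.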
But sperm are not subject to this telomere-shortening effect. In fact, the telomeres in sperm-producing stem cells not only resist degrading, they actually grow. This may be thanks to a high concentration of the telomere-repairing enzyme telomerase in the testicles; researchers are still uncertain. All they know is that the older the man, the longer the telomeres in his sperm will be.
By Luke Jostins, a postgraduate student working on the genetic basis of complex autoimmune diseases. Jostins has a strong background in informatics and statistical genetics, and writes about genetic epidemiology and sequencing technology on his blog Genetic Inference. A different version of this post appeared on the group blog Genomes Unzipped.
One of the great hopes for genetic medicine is that we will be able to predict which people will develop certain diseases, and then focus preventive measures on those at risk. Scientists have long known that one of the wrinkles in this plan is that we will only rarely be able to say with certainty whether someone will develop a given disease based on their genetics; more often, we can only give an estimate of their disease risk.
This realization came mostly from twin studies, which look at the disease histories of identical and non-identical twins. Twin studies use established models of genetic risk among families and populations, along with the different levels of similarity of identical and non-identical twins, to estimate how much of disease risk comes from genetic factors and how much comes from environmental risk factors. (See this post for more details.) There are some complexities here, and the exact model used can change the results you get, but in general the overall message is the same: genetic risk prediction contains a lot of information, but not enough to give guaranteed predictions of who will and who won’t get certain diseases. This is not only true of genetics either: parallel studies of environmental risk factors usually reveal tendencies and probabilities, not guarantees.
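One classical way twin studies turn those different levels of twin similarity into a heritability estimate is Falconer's formula, which takes heritability to be roughly twice the gap between identical (MZ) and fraternal (DZ) twin correlations. The sketch below uses hypothetical correlations purely to illustrate the arithmetic:

```python
# Falconer's classic estimator: heritability is roughly twice the gap between
# identical (MZ) and fraternal (DZ) twin correlations for a trait.
# The correlations below are hypothetical, chosen only to illustrate the math.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """h2 = 2 * (r_MZ - r_DZ); the rest is shared and unique environment."""
    return 2.0 * (r_mz - r_dz)

r_mz, r_dz = 0.70, 0.40              # hypothetical twin correlations
h2 = falconer_heritability(r_mz, r_dz)
c2 = 2.0 * r_dz - r_mz               # shared-environment component
e2 = 1.0 - r_mz                      # unique environment (plus noise)
print(f"heritability ~ {h2:.2f}, shared env ~ {c2:.2f}, unique env ~ {e2:.2f}")
```

Modern twin analyses use more sophisticated models than this, which is exactly why, as noted above, the choice of model can change the numbers you get; but the partition into genetic, shared-environment, and unique-environment components is the same idea.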
This means that two people with exactly the same weight, height, sex, race, diet, childhood infection exposures, vaccination history, family history, and environmental toxin levels will usually not get the same disease, but they are far more likely to than two individuals who differ in all those respects. To take an extreme example, identical twins, despite sharing the same DNA, socioeconomic background, childhood environment, and (generally) placenta, usually do not die from the same thing—but they are far more likely to than two random individuals. This is a perfect analogy for how well (and badly) risk prediction can work: you will never have a better prediction than knowing the health outcomes of a genetic copy of you. The health outcomes of another version of you will be invaluable, and will help guide you, your doctor, and the health-care establishment, if they use this information properly. But it won’t let them know exactly what will happen to you, because identical twins usually do not die from the same thing.
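The "same risk, different fates" point can be made concrete with a quick simulation of pairs who share an identical lifetime risk; the 30 percent figure below is an arbitrary illustrative number, not a real disease statistic:

```python
import random

# Two people with identical risk still face independent rolls of the dice.
# Simulate pairs who share the same lifetime risk and count how often
# exactly one member of a pair gets the disease. The 30% risk is arbitrary.

def discordance_rate(risk: float, pairs: int = 100_000, seed: int = 1) -> float:
    """Fraction of same-risk pairs in which only one member falls ill."""
    rng = random.Random(seed)
    discordant = sum(
        (rng.random() < risk) != (rng.random() < risk) for _ in range(pairs)
    )
    return discordant / pairs

# Analytically this is 2 * p * (1 - p): about 42% for a 30% risk.
print(round(discordance_rate(0.30), 2))
```

Even with perfectly matched risk, a large fraction of pairs end up discordant, which is the random component the next paragraph describes.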
There is no health destiny: There is always a strong random component in anything that happens to your body. This does not mean that none of these things are important; being aware of your disease risks is one of the most important things you can do for your own future health. But risk is not destiny. And this central fact has been well known to scientists for a while now.
This was the context in which a recent paper in Science Translational Medicine by Bert Vogelstein and colleagues was published; it also used twin study data to ask how well genetics could predict disease. The take-home message from the study (or at least the message that many media outlets have taken home) is that DNA does not perfectly determine which disease or diseases you may get in the future. The paper was generally pretty flawed: many geneticists expressed annoyance at the paper, and Erika Check Hayden carried out a thorough investigation into it for the Nature News blog. In short, the study used a non-standard and arbitrary model of genetic risk, and failed to properly model the twin data, handling neither the many environmental confounders nor the large degree of uncertainty associated with studies of twins.
Many geneticists were annoyed that the authors seemed unaware of the existing literature on the subject, and that they presented their approach and results as novel and controversial at a well-attended press conference at the American Association for Cancer Research annual meeting. However, what came as more of a shock was how surprised the media as a whole seemed to be at the results, with headlines such as “DNA Testing Not So Potent for Prevention” and “Your DNA blueprint may disappoint.” No reporter (other than Erika) even mentioned the information we already had about the limits of genetic risk prediction. As Joe Pickrell pointed out on Twitter, we can’t really know whether this was genuine surprise or merely newspapers hyping the message to make it seem more like news, but having talked to a few journalists and members of the public, the surprise appears to be at least in part genuine. The gap between public perception and the established consensus on genetic risk prediction seemed to us both unexpected and worrying.