A different version of this story appears at The Scientist.
Honeybee workers spend their whole lives toiling for their hives, never ascending to the royal status of queens. But they can change careers. At first, they’re nurses, which stay in the hive and tend to their larval sisters. Later on, they transform into foragers, which venture into the outside world in search of flowers and food.
This isn’t just a case of flipping between tasks. Nurses and foragers are very distinct sub-castes that differ in their bodies, mental abilities, and behaviour – foragers, for example, are the ones that use the famous waggle dance. “[They’re] as different as being a scientist or journalist,” explains Gro Amdam, who studies bee behaviour. “It’s really amazing that they can sculpt themselves into those two roles that require very specialist skills.” The transformation between nurse and forager is significant, but it’s also reversible. If nurses go missing, foragers can revert back to their former selves to fill the employment gap.
Amdam likens them to the classic optical illusion (shown on the right) which depicts both a young debutante and an old crone. “The bee genome is like this drawing,” she says. “It has both ladies in it. How is the genome able to make one of them stand out and then the other?”
The answer lies in ‘epigenetic’ changes that alter how some of the bees’ genes are used, without changing the underlying DNA. Amdam and her colleague Andrew Feinberg found that the shift from nurse to forager involves a set of chemical marks, added to the DNA of a few dozen genes. These marks, known as methyl groups, are like Post-It notes that dictate how a piece of text should be read, without altering the actual words. And if the foragers change back into nurses, the methylation marks also revert.
Together, they form a toolkit for flexibility, a way of seeing both the crone and the debutante in the same picture, a way of eking out two very different and reversible skill-sets from the same genome.
This week, I’m filling in for one of the Guardian’s science correspondents (the excellent Alok Jha), and I’ve been asked to cover a few stories that I would otherwise do here. So, let me direct you to the Guardian website where you’ll find the two pieces I filed today.
Imagine trying to rewind the clock and start your life anew, perhaps by moving to a new country or starting a new career. You would still be constrained by your past experiences and your existing biases, skills and knowledge. History is difficult to shake off, and lost potential is not easily regained. This is a lesson that applies not just to our life choices, but to stem cell research too.
Over the last four years, scientists have made great advances in reprogramming specialised adult cells into stem-like ones, giving them the potential to produce any of the various cells in the human body. It’s the equivalent of erasing a person’s past and having them start life again.
But a large group of American scientists led by Kitai Kim have found a big catch. Working in mice, they showed that these reprogrammed cells, formally known as “induced pluripotent stem cells” or iPSCs, still retain a memory of their past specialities. A blood cell, for example, can be reverted back into a stem cell, but it carries a record of its history that constrains its future. It would be easier to turn this converted stem cell back into a blood cell than, say, a brain cell.
The history of an iPSC is written in molecular marks that annotate its DNA. These ‘epigenetic’ changes can alter the way a gene behaves even though its DNA sequence is still the same. It’s the equivalent of sticking Post-It notes in a book to tell a reader which parts to read or ignore, without actually editing the underlying text. Epigenetic marks separate different types of cells from one another, influencing which genes are switched on and which are inactivated. And according to Kim, they’re not easy to remove, even when the cell has apparently been reprogrammed into a stem-like state.
Pregnant women are generally advised to avoid drinking alcohol, and for good reason – exposing an unborn baby to alcohol can lead to a range of physical and mental problems, from hyperactivity and learning difficulties to stunted growth, abnormal development of the head, and mental retardation.
But alcohol also has much subtler effects on a foetus. Some scientists have suggested that people who get their first taste of alcohol through their mother’s placenta are more likely to develop a taste for it in later life. This sleeper effect is a long-lasting one – exposure to alcohol in the womb has been linked to a higher risk of alcohol abuse at the much later age of 21. In this way, mums could be inadvertently passing down a liking for booze to their children as a pre-birthday present.
Now, Steven Youngentob from SUNY Upstate Medical University and Jon Glendinning from Columbia University have found out why this happens. By looking at boozing rats, they have found that those first foetal sips of alcohol make the demon drink both taste and smell better.
The duo raised several pregnant rats on diets of chow, plain liquid, or liquid that had been spiked with alcohol. The third group eventually had a blood alcohol concentration of about 0.15%, a level that would cause a typical human to slur, stagger or become moody.
When the females eventually gave birth, month-old pups born to boozy mothers were more likely to lick an alcohol-coated feeding tube than those whose mothers were teetotal. These rats had been born with more of a taste for booze.
The trauma of child abuse can last a lifetime, leading to a higher risk of anxiety, depression and suicide further down the line. This link seems obvious, but a group of Canadian scientists have found that it has a molecular basis, written into the genome itself.
By studying the brains of suicide victims, Patrick McGowan from the Douglas Mental Health University Institute found that child abuse modifies a gene called NR3C1 that affects a person’s ability to deal with stress. The changes it wrought were “epigenetic”, meaning that the gene’s DNA sequence wasn’t altered but its structure was modified to make it less active. These types of changes are very long-lasting, which strongly suggests that the trauma of child abuse could be permanently inscribed onto a person’s genes.
Child abuse, from neglect to physical abuse, affects the workings of an important group of organs called the “hypothalamic-pituitary-adrenal axis” or HPA axis. This trinity consists of the hypothalamus, a funnel-shaped part of the brain; the pituitary gland, which sits beneath it; and the adrenal glands, which sit above the kidneys. All three organs secrete hormones. Through these chemicals, the HPA axis controls our reactions to stressful situations, triggering a number of physiological changes that prime our bodies for action.
The NR3C1 gene is part of this system. It produces a protein called the glucocorticoid receptor, which sticks to cortisol, the so-called “stress hormone”. Cortisol is produced by the adrenal glands in response to stress, and when it latches on to its receptor, it triggers a chain reaction that deactivates the HPA axis. In this way, our body automatically limits its own response to stressful situations.
Without enough glucocorticoid receptors, this self-control goes awry, which means that the HPA is active in normal situations, as well as stressful ones. No surprise then, that some scientists have found a link between low levels of this receptor and schizophrenia, mood disorders and suicide. So, childhood trauma alters the way the body reacts to stress, which affects a person’s risk of suicide or mental disorders later in life. Now, McGowan’s group have revealed part of the genetic (well, epigenetic) basis behind this link.
Many measures to curb the obesity epidemic are aimed at young children. It’s a sensible strategy – we know that overweight children have a good chance of becoming overweight adults. Family homes and schools have accordingly become critical arenas where the battle against the nation’s growing waistlines is fought. But there is another equally important environment that can severely affect a person’s chances of becoming overweight, but is more often overlooked – the womb.
Overweight parents tend to raise overweight children, but over the last few years, studies have confirmed that this tendency to span generations isn’t just the product of a shared home environment. Obesity-related genes are involved too, but even they aren’t the whole story. Research has shown that a mother’s bodyweight during and just before pregnancy has a large influence on the future weight of her children.
For example, children born to mothers who have gone through drastic weight-loss surgery (where most of the stomach and intestine are bypassed) are half as likely to be obese themselves. On the other hand, mothers who put on weight between two pregnancies are more likely to have an obese second child. In this way, the obesity epidemic has the potential to trickle down through the generations, like a snowball rolling its way into an avalanche.
Now, Robert Waterland from the Baylor College of Medicine has demonstrated how the snowball gains momentum by studying three generations of mice that have a genetic tendency to overeat. And using a special diet that was high in folate and other nutrients, he found that he could stop the snowball’s descent and spare future generations of mice from a heightened risk of obesity.