Christina Agapakis is a synthetic biologist and postdoctoral research fellow at UCLA who blogs about biology, engineering, biological engineering, and biologically inspired engineering at Oscillator.
When you factor in the fertilizer needed to grow animal feed and the sheer volume of methane expelled by cows (mostly, though not entirely, from their mouths), a carnivore driving a Prius can contribute more to global warming than a vegan in a Hummer. Given the environmental toll of factory farming, it’s easy to see why people get excited about the idea of meat grown in a lab, without fertilizer, feed corn, or burps.
In this vision of the future, our steaks are grown in vats rather than in cows, with layers of cow cells nurtured on complex machinery to create a cruelty-free, sustainable meat alternative. The technology involved is today used mainly to grow cells for pharmaceutical development, but that hasn’t stopped several groups from experimenting with “in vitro meat,” as it’s called, over the last decade. In fact, a team of tissue engineers led by professor Mark Post at Maastricht University in the Netherlands recently announced their goal to make the world’s first in vitro hamburger by October 2012. The price tag is expected to be €250,000 (over $330,000), but we’re assured that as the technology scales up to industrial levels over the next ten years, the cost will scale down to mass-market prices.
Whenever I hear about industrial scaling as a cure-all, my skeptic alarms start going off, because scaling is the deus ex machina of so many scientific proposals, often minimized by scientists (myself included) as simply an “engineering problem.” But when we’re talking about food and sustainability, that scaling is exactly what feeds a large and growing population. Scaling isn’t just an afterthought; it’s often the key factor that determines whether a laboratory-proven technology becomes an environmentally and economically sustainable reality. Looking beyond the hype of “sustainable” and “cruelty-free” meat to the details of how cell culture works exposes just how difficult this scaling would be.
By Gary Taubes, author of Nobel Dreams (1987), Bad Science (1993), Good Calories, Bad Calories (2007), and Why We Get Fat (2011). Taubes is a former staff member at DISCOVER. He has won the Science in Society Award of the National Association of Science Writers three times and was awarded an MIT Knight Science Journalism Fellowship for 1996-97. A modified version of this post appeared on Taubes’ blog.
The last couple of weeks have witnessed a slightly-greater-than-usual outbreak of extremely newsworthy nutrition stories that could be described as bad journalism feasting on bad science. The first was a report out of the Harvard School of Public Health that meat-eating apparently causes premature death and disease (here’s how the New York Times covered it), and the second out of UC San Diego suggesting that chocolate is a food we should all be eating to lose weight (the Times again).
Both of these studies were classic examples of what is known technically as observational epidemiology, a field of research I discussed at great length back in 2007 in a cover article for the New York Times Magazine. The article was called “Do We Really Know What Makes Us Healthy?” and I made the argument that this particular pursuit is closer to a pseudoscience than a real science.
As a case study, I used a collaboration of researchers from the Harvard School of Public Health, led by Walter Willett, who runs the Nurses’ Health Study. And I pointed out that every time these Harvard researchers had claimed that an association observed in their observational trials was a causal relationship—that food or drug X caused disease or health benefit Y—and that supposed causal relationship had then been tested in an experiment, the experiment had failed to confirm the causal interpretation—i.e., the folks from Harvard got it wrong. Not most times, but every time.
Now it’s these very same Harvard researchers—Walter Willett and his colleagues—who have authored the article from two weeks ago claiming that red meat and processed meat consumption is deadly; that eating it regularly raises our risk of dying prematurely and contracting a host of chronic diseases. Zoe Harcombe has done a wonderful job dissecting the paper at her site. I want to talk about the bigger picture (in a less concise way).
This is an issue about science itself and the quality of research done in nutrition. Science is ultimately about establishing cause and effect. It’s not about guessing. You come up with a hypothesis—force x causes observation y—and then you do your best to prove that it’s wrong. If you can’t, you tentatively accept the possibility that your hypothesis might be right. In the words of Karl Popper, a leading philosopher of science, “The method of science is the method of bold conjectures and ingenious and severe attempts to refute them.” The bold conjectures, the hypotheses, making the observations that lead to your conjectures… that’s the easy part. The ingenious and severe attempts to refute your conjectures is the hard part. Anyone can make a bold conjecture. (Here’s one: space aliens cause heart disease.) Testing hypotheses ingeniously and severely is the single most important part of doing science.
The problem with observational studies like the ones from Harvard and UCSD that gave us the bad news about meat and the good news about chocolate is that the researchers do little of this. The hard part of science is left out, and they skip straight to the endpoint, insisting that their causal interpretation of the association is the correct one and that we should probably all change our diets accordingly.