by John Hawks, an anthropologist at the University of Wisconsin–Madison who studies the genetic and environmental aspects of humanity’s 6-million-year evolution. This post ran in slightly different form on his own blog.
Philip Ball writes in The Guardian about another new initiative from NSF to fund “potentially transformative” research. He begins his essay with this:
The kind of idle pastime that might amuse physicists is to imagine drafting Einstein’s grant applications in 1905. “I propose to investigate the idea that light travels in little bits,” one might say. “I will explore the possibility that time slows down as things speed up,” goes another. Imagine what comments these would have elicited from reviewers for the German Science Funding Agency, had such a thing existed. Instead, Einstein just did the work anyway while drawing his wages as a technical expert third-class at the Bern patent office. And that is how he invented quantum physics and relativity.
The moral seems to be that really innovative ideas don’t get funded—that the system is set up to exclude them.
The system is set up to exclude really innovative ideas. But Einstein is a really misleading example. For one thing, Einstein didn’t need much grant funding for his research. Yes, if somebody had given the poor guy a postdoc, he might have had an easier time being productive in physics. But his theoretical work didn’t need expensive lab equipment, RA and postdoc salaries, and institutional overhead to fund secretarial support, building maintenance, and research opportunities for undergraduates.
A better question is whether we would have wanted Einstein to spend 1905 applying for grants instead of publishing. But even this is terribly misleading. Most scientists who are denied grants are not Einstein. Most ideas that appear to be transformative in the end turn out to be bunk. Someone who compares himself to Einstein is overwhelmingly likely to be a charlatan. There should probably be a “No Einsteins need apply” clause in every federal grant program.
Setting aside the misleading Einstein comparison, our current grant system still has some severe problems. Is it selecting against “transformative” research—the big breakthroughs? I would put the problem differently. “Transformative” is in the eye of the beholder. Our grant system does what it has been designed for: it picks winners and losers, with a minimum of accountability for the people who set funding priorities.
Fermilab’s Tevatron, the largest particle accelerator in the United States, was shut down on September 30 after a celebrated career of 28 years that provided some of the greatest discoveries in particle physics. This leaves the European lab CERN, with its Large Hadron Collider, to lead the way into future discoveries. This landmark in experimental physics is an opportunity to reexamine the theoretical model physicists have constructed and relied on in their search to understand the workings of the universe: the standard model of particle physics. The standard model is a comprehensive theory of nature’s elementary particles and the forces that govern their behavior, constructed over a half-century of intensive work by many theoretical physicists and experimentalists. The model has worked amazingly well, harmoniously combining theory and experiment and producing extremely accurate predictions about the behavior of particles and forces. But could the model now be beginning to show some cracks?
It all started on a wintry evening in 1928. While staring at the flames in the fireplace at St. John’s College, Cambridge, Paul Dirac made one of the most important discoveries in the history of science when he saw how to combine the Schrödinger equation of quantum mechanics with Einstein’s special (but not general) theory of relativity. This achievement launched relativistic quantum field theory—which forms the theoretical basis for the standard model—and produced two immediate consequences: an explanation of the spin of the electron, and Dirac’s stunning prediction of the existence of antimatter (confirmed a few years later with the discovery of the positron).
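The combination Dirac found is usually written today in compact covariant notation (a modern form, not Dirac’s original presentation):

```latex
% Dirac equation in natural units (\hbar = c = 1)
\left( i \gamma^{\mu} \partial_{\mu} - m \right) \psi = 0
```

The four-component wave function here automatically carries the electron’s spin, and the equation’s extra solutions are what forced the prediction of antimatter.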
In the late 1940s, Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga, all working independently, presented the first quantum field theory, quantum electrodynamics, which explained the electromagnetic interactions of electrons and photons. It forms the first part of the standard model, handling interactions governed by the electromagnetic field. The theory’s success inspired other theoretical physicists to construct similar quantum field theories for the weak and strong nuclear forces—together accounting for everything in particle physics except gravity, the subject of Einstein’s general theory of relativity. By the 1970s the result, the standard model, was complete: a quantum field theory of all the elementary particles—the leptons and quarks—and their interactions, mediated by particles called bosons (such as the photon).
In 1917, a year after his general theory of relativity was published, Einstein tried to extend his field equation of gravitation to the universe as a whole. The universe as known at the time was simply our galaxy—the neighboring Andromeda, visible to the naked eye from very dark locations, was thought to be a nebula within our own Milky Way home. Einstein’s equation told him that the universe was expanding, but astronomers assured him otherwise (even today, no expansion is evident within the 2-million-light-year range to Andromeda; in fact, that galaxy is moving toward us). So Einstein inserted into his equation a constant now known as “lambda,” for the Greek letter that denotes it. Lambda, also called “the cosmological constant,” supplied a kind of force to keep the universe from expanding and hold it stable. Then in 1929, using the 100-inch telescope at Mount Wilson in California, Hubble, Humason, and Slipher made their monumental discovery that very distant galaxies are receding from us—implying that the universe was indeed expanding, just as Einstein’s original equation had indicated! When Einstein later visited California, Hubble showed him his findings, and Einstein famously exclaimed “Then away with the cosmological constant!” and never mentioned it again, considering lambda his greatest “blunder”—it had, after all, prevented him from theoretically predicting the expansion of the universe.
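In modern notation (not in the original post), the field equation with Einstein’s added constant reads:

```latex
% Einstein's field equation with the cosmological constant \Lambda
R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu} + \Lambda \, g_{\mu\nu}
  = \frac{8 \pi G}{c^{4}} \, T_{\mu\nu}
```

Setting the lambda term to zero recovers the original 1915 equation; a small positive lambda supplies the balancing force against gravitational attraction that Einstein wanted for a static universe.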
Fast forward six decades to the 1990s. Saul Perlmutter, a young astrophysicist at the Lawrence Berkeley Laboratory in California, had a brilliant idea. He knew that Hubble’s results were derived using the Doppler shift in light. Light from a galaxy that is receding from us is shifted toward the red end of the visible spectrum, while light from a galaxy that is approaching us is shifted toward the blue end, from our vantage point. The degree of the shift is measured by a quantity astronomers call Z, which is then used to determine a galaxy’s speed of recession away from us (when Z is positive and the shift is to the red).
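The arithmetic behind Z is simple enough to sketch. The following is an illustrative example (not from the original post, and with made-up wavelength numbers): Z is the fractional shift in a spectral line’s wavelength, and for small Z the recession speed is approximately the speed of light times Z, with a relativistic formula taking over at larger shifts.

```python
# Illustrative sketch: turning a measured wavelength shift into a
# recession velocity. The wavelengths below are example values only.

C = 299_792.458  # speed of light in km/s

def redshift(lambda_observed, lambda_emitted):
    """Z: fractional wavelength shift. Positive Z means a redshift,
    i.e. the source is receding from us."""
    return (lambda_observed - lambda_emitted) / lambda_emitted

def recession_velocity(z):
    """Recession speed in km/s from the relativistic Doppler formula.
    For small z this reduces to the familiar approximation v ~ C * z."""
    factor = (1.0 + z) ** 2
    return C * (factor - 1.0) / (factor + 1.0)

# Example: a hydrogen line emitted at 656.3 nm but observed at 662.9 nm
z = redshift(662.9, 656.3)   # a small positive Z, about 0.01
v = recession_velocity(z)    # close to C * z at this small a shift
```

For the small shifts Hubble worked with, the simple product C * Z is an excellent approximation; the relativistic form matters only for the very distant, high-Z objects that later surveys reached.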