Andy Albrecht gave a very nice colloquium at Chicago last Wednesday about the Dark Energy Task Force report, the final draft of which was supposed to be available on Friday — such a nice colloquium that I actually think I caught a good fraction of what he talked about and felt like I should pass it on. The basic charge of the DETF was to report back to NSF, NASA & DOE with a summary of the proposed approaches for studying dark energy, to characterize their relative merits, to identify the steps required to get there, and to evaluate how well proposed projects and approaches will do in sorting this mess out.
Here’s a summary of the report based on what I gleaned from Andy’s talk — but you should all go read the thing yourselves if you’re interested in such things; I make no promise of accuracy or completeness. There’s been lots of talk before on this blog about dark energy and why it’s interesting and why it’s perplexing and what it might teach us about fundamental physics, so I won’t get into that here. But I think cosmologists and particle physicists generally agree that the fact that we have so little understanding of the primary constituent of the energy density of the Universe is one of the most important questions in all of science.
The panel, chaired by Rocky Kolb, was composed of a pretty distinguished crew of theorists, experimentalists, and observers who specialize in the various techniques that have been proposed to measure something about the equation of state of the Universe (Albrecht, Gary Bernstein, Bob Cahn, Wendy Freedman, Jackie Hewitt, Wayne Hu, John Huth, Lloyd Knox, Marc Kamionkowski, John Mather, Suzanne Staggs, Nick Suntzeff). They clearly took their job pretty seriously. Over a period of many months, they had weekly phonecons, several meetings, and seem to have actually done lots of calculations. Now, I’m young enough that I don’t know the varied history of task force reports, but I was certainly impressed to see that in addition to ruminating on various things, they clearly actually wrote some code, and produced a lot of interesting numbers from it 😉
As Andy stated it (and I think this is pretty much right), the field was full of conflicting claims about what various projects could do and about the relative merits of various approaches, and it lacked any standard way to compare them. This really made it hard to see clearly or to evaluate what the best way forward was. So the first thing the panel tried to do was remedy this.
Here’s how the panel framed the issues:
The eventual goal should be to understand the nature of dark energy, but of course that goal is likely a long way away. In the meantime, we can hope to make progress in a few stages. The first thing to note is that the effect of dark energy is characterized by an equation of state for the Universe, w(a) = P(a)/rho(a) (“a” refers to the scale factor of the Universe, P to the pressure, and rho to the energy density). There is now pretty conclusive evidence that the Universe is accelerating, which requires w < -1/3. In this framework, one can then ask:
- is w(a) constant?
- what are the equation of state parameters w_0 and w_a, as specified by w(a) = w_0 + w_a(1 - a)?
- is there evidence for changes in the nature of general relativity?
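To make the second question concrete, here is a minimal Python sketch of the w_0, w_a parameterization (my own illustration — the parameter values below are made up, not numbers from the report):

```python
# The parameterization w(a) = w0 + wa * (1 - a), with scale factor
# a = 1/(1+z) (a = 1 today). Parameter values here are illustrative.

def a_of_z(z):
    """Scale factor corresponding to redshift z."""
    return 1.0 / (1.0 + z)

def w_of_a(a, w0=-1.0, wa=0.0):
    """Dark energy equation-of-state parameter at scale factor a."""
    return w0 + wa * (1.0 - a)

# A cosmological constant (w0 = -1, wa = 0) has w = -1 at every epoch:
print(w_of_a(a_of_z(0.0)))  # today: -1.0
print(w_of_a(a_of_z(1.0)))  # z = 1: -1.0

# A mildly evolving toy model: w0 = -0.9, wa = 0.3 gives
# w = -0.9 + 0.3 * (1 - 0.5) = -0.75 at z = 1
print(w_of_a(a_of_z(1.0), w0=-0.9, wa=0.3))
```

Note that both toy models keep w < -1/3 (i.e., acceleration) over this whole range — distinguishing them is exactly the kind of thing the surveys below are after.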
They then evaluated what we might learn about these issues in 4 stages:
1) what is known now
2) what will be known upon completion of existing projects
3) medium-term projects, on a ~5 year time scale, costing tens of millions of dollars (e.g., the Dark Energy Survey)
4) long-term projects, on a ~10 year time scale, costing ~0.3-1 billion dollars (e.g., LSST, JDEM, SKA)
They then chose a single figure of merit for evaluating the worth of a given project or method: the inverse of the area of the error ellipse enclosing 95% confidence in the w_0-w_a plane. This particular choice certainly isn’t unambiguous, but it does seem like one of the most natural ways to make the evaluation.
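As a hedged illustration of this figure of merit (my sketch, not the task force’s actual code, and the covariance numbers are invented), here is how one might compute the inverse area of the 95% ellipse from a 2x2 parameter covariance matrix for (w_0, w_a):

```python
import math

# Figure-of-merit sketch: inverse area of the 95%-confidence ellipse in
# the (w0, wa) plane, given a 2x2 parameter covariance matrix.
# Normalization conventions for this quantity vary in the literature.

CHI2_95_2DOF = 5.991  # 95% quantile of chi-squared with 2 degrees of freedom

def ellipse_area_95(cov):
    """Area of the 95% confidence ellipse for a 2x2 covariance matrix."""
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    return math.pi * CHI2_95_2DOF * math.sqrt(det)

def figure_of_merit(cov):
    """Larger figure of merit = tighter constraints on (w0, wa)."""
    return 1.0 / ellipse_area_95(cov)

# Hypothetical survey: sigma(w0) = 0.1, sigma(wa) = 0.5, correlation -0.8
cov = [[0.01, -0.04], [-0.04, 0.25]]
print(figure_of_merit(cov))
```

One nice property of this choice: since the ellipse area scales with the product of the two errors, halving both errors quadruples the figure of merit, so per-parameter improvements compound.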
The primary observables in the Universe that can say something about the equation of state are the distance-redshift relation, D(z), which has a one-to-one relation with w(a), and the growth factor, g(z), which has a one-to-one relation with D(z) if GR is correct. So if one can measure both of them independently, one not only gets independent measures of w(a), but also a test of GR and our standard cosmological model on very large scales. Note that by “large” I mean really large, roughly half the “length” of the Universe; the baseline over which most of these methods operate is about z~0 to z~1, which spans roughly half the age of the Universe.
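To show what the link from w(a) to D(z) looks like in practice, here is a hedged sketch for a flat universe with the w_0, w_a parameterization, using a simple trapezoid integration; the parameter values (Omega_m = 0.3, H0 = 70) are typical but illustrative, and this is my toy version, not the DETF’s machinery:

```python
import math

C_KM_S = 299792.458  # speed of light in km/s

def E(z, om=0.3, w0=-1.0, wa=0.0):
    """Dimensionless Hubble rate H(z)/H0 for a flat universe with
    dark energy equation of state w(a) = w0 + wa * (1 - a)."""
    ode = 1.0 - om
    # Closed-form dark-energy density evolution for the w0, wa parameterization
    rho_de = (1.0 + z) ** (3.0 * (1.0 + w0 + wa)) * math.exp(-3.0 * wa * z / (1.0 + z))
    return math.sqrt(om * (1.0 + z) ** 3 + ode * rho_de)

def comoving_distance(z, h0=70.0, n=2000, **params):
    """Comoving distance to redshift z in Mpc, by trapezoid rule on c/H(z)."""
    dz = z / n
    f = [1.0 / E(i * dz, **params) for i in range(n + 1)]
    integral = dz * (sum(f) - 0.5 * (f[0] + f[-1]))
    return (C_KM_S / h0) * integral

# Distance to z = 1 for a cosmological-constant model:
print(comoving_distance(1.0))
# The same distance in a w0 = -0.9, wa = 0.3 toy model differs by only a
# few percent -- which is why percent-level control of systematics matters.
print(comoving_distance(1.0, w0=-0.9, wa=0.3))
```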
There are four main probes of DE that the panel considered, and on which most of the large project work towards constraining DE is focused:
- Type Ia Supernovae, which provide a measure of D(z).
- Baryon acoustic oscillations, a standing wave pattern on ~ 140 Mpc scales, which depends only on timescales in the early Universe and the speed of sound in the baryon-photon plasma, and provides a standard ruler and thus a measure of D(z).
- The abundances of galaxy clusters, which probe the mass function of dark matter halos. The evolution of this abundance probes both g(z) and D(z), but is sensitive to uncertainties in the mass-observable relation.
- Weak lensing, which uses measurements of the distortion of galaxy shapes to probe the mass distribution.
The panel assessed these various methods in light of the four stages previously discussed. A few of their basic conclusions about the relative merits of the various methods and how well one will be able to do with them:
- Measuring D(z) with SN is a relatively mature method, limited primarily by the accuracy of photometric redshifts.
- The baryon oscillation method is relatively young, but is less affected by systematics than the other methods — because the length scale of the wiggles is not much affected by the bias of the galaxy population that it’s measured with.
- The panel concluded that cluster abundances had in principle higher potential than the previous two, but the primary uncertainty here is in understanding the relation between observables and halo mass, and it is not yet clear how well this will be able to be done. (This is one of the main things I’ve been working on with the Sloan Digital Sky Survey and the Dark Energy Survey, a “stage 3” project in the terms of the DETF, and I can certainly say that most of the assumptions in the literature and probably in various white papers that were contributed to the DETF are still pretty naive on this point).
- Weak lensing is a relatively new method, and has suffered from a lack of standardization that has made different measurements difficult to compare. The biggest source of noise here is the intrinsic shapes of galaxies. If the systematics can be controlled and understood, this method has the most potential standing on its own.
- From stage 3 projects (~5 year timescale), one can hope for a factor of ~2 improvement over current surveys from individual methods, and a factor of 3-5 improvement if all four methods are combined.
- From stage 4 projects, one can hope for an order of magnitude improvement over current surveys using all of the methods combined.
An essential conclusion the task force came to was that combining all four methods gets you a lot that none of the methods can deliver on its own. This is true partly because of the intrinsic value added by each method and their differing systematics, but also because the combination of growth and distance tests can provide a fundamental test of general relativity. A second important conclusion of the task force was that stage 3 projects should really target an improved understanding of systematics. Because most of the stage 3 and 4 projects depend on large photometric surveys without full spectroscopic followup, understanding and minimizing systematic uncertainties in the photo-zs is one of the most essential tasks for dark energy constraints. I’m currently in Barcelona at a Dark Energy Survey collaboration meeting, and I can tell you that how best to do this is one of the primary things being discussed.
My understanding is that the final version of the report came out last Friday, but I’m not sure what “out” means here, and I haven’t had a chance to look. I’m sure one of our commenters can point us to a copy.