The week before last, I spent several delightful days at the Causality, Analyticity, and Superluminal Propagation Workshop at the Michigan Center for Theoretical Physics in Ann Arbor. This very general title could, in principle, cover any of the many reasons that physicists have considered the possibility of faster-than-light travel, and its implications, during the century since the development of special relativity. In reality, though, the workshop focused on a specific subset of such questions.
Over the past decade or so, the quest to understand the class of models that can drive the accelerated expansion of the early universe – inflation – has been joined by the equally, if not more, complicated challenge of explaining our observed epoch of late-time acceleration. Approaches to these two problems range from phenomenological proposals for new mass-energy sources to drive acceleration (the inflaton, dark energy), to analogous attempts to modify the Einstein-Hilbert action of general relativity to allow for self-acceleration (modified gravity), to searches for an origin of either phenomenon within a complete theory of matter and gravity, such as string theory.
Some of these models have encountered rather serious theoretical problems, such as the existence of ghosts – fields in the theory that in effect have negative kinetic energy and typically signal catastrophic instabilities in the system. These arise, for example, in a number of modified gravity approaches to cosmic acceleration. But there is another theoretical issue that can appear which, while it raises an immediate warning flag, requires some careful work to understand the extent to which it constrains possible models. This was the main subject of the workshop.
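As a minimal illustration of what a ghost is (a textbook sketch, not tied to any particular model discussed at the workshop), compare a healthy scalar field to one whose kinetic term enters the Lagrangian with the wrong sign:

```latex
% Healthy scalar field (mostly-minus metric signature):
\mathcal{L}_{\rm healthy} \;=\; \tfrac{1}{2}\,\partial_\mu\phi\,\partial^\mu\phi \;-\; V(\phi)

% Ghost: the kinetic term has the wrong sign, so the
% Hamiltonian is unbounded from below:
\mathcal{L}_{\rm ghost} \;=\; -\tfrac{1}{2}\,\partial_\mu\phi\,\partial^\mu\phi \;-\; V(\phi)
```

Because the ghost's energy is unbounded below, coupling it to ordinary matter lets the vacuum lower its energy without limit by producing ghost and normal particles together – the catastrophic instability referred to above.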
Lorentz invariance is the symmetry on which special relativity is built, and it underlies both quantum field theory and general relativity. These theories are remarkably successful, and one tends to think of causality as built in and inviolable, thanks to the light-cone structure provided by Lorentz invariance. That is, one expects that information cannot be transmitted to a given event from any event separated from it by faster-than-light travel (i.e., from outside the past light cone of the event). Given the success of Lorentz invariant theories, we are loath to consider throwing them away (although if one has a good idea about how to do so sensibly, it would be fascinating).
However, the study of models of cosmic acceleration has focused attention on a class of field theories and modifications of gravity with Lorentz invariant actions, which nevertheless allow for superluminal propagation of perturbations on nontrivial backgrounds. The central example is provided by the so-called k-essence models, in which the Lagrangian density is a general function of a scalar field and its Lorentz invariant kinetic term. These Lagrangians typically involve higher powers of derivatives of the field, to which some of one's usual intuition does not apply. Interesting phenomena can occur when the background breaks Lorentz invariance and the kinetic terms have a specific structure, with consequences for cosmology, black holes and high-energy physics. A nice discussion of all this can be found in a paper from last year by
Eugeny Babichev, Viatcheslav Mukhanov, Alexander Vikman
The k-essence theories admit in general the superluminal propagation of the perturbations on classical backgrounds. We show that in spite of the superluminal propagation the causal paradoxes do not arise in these theories and in this respect they are not less safe than General Relativity.
the authors of which were all present at this workshop.
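Schematically, a k-essence model is specified as follows (sign and normalization conventions for the kinetic term vary from paper to paper, so take this as a sketch rather than the precise form used by these authors):

```latex
% Lagrangian: a general function of the field and its kinetic term
\mathcal{L} \;=\; P(\phi, X),
\qquad
X \;\equiv\; \tfrac{1}{2}\, g^{\mu\nu}\,\partial_\mu\phi\,\partial_\nu\phi .

% Small perturbations about a background propagate with sound speed
c_s^2 \;=\; \frac{P_{,X}}{\,P_{,X} + 2X\,P_{,XX}\,},
\qquad P_{,X} \equiv \frac{\partial P}{\partial X}.
```

For a canonical scalar, $P = X - V(\phi)$ and $c_s^2 = 1$; but for suitable functions $P$ and backgrounds (roughly, $X\,P_{,XX} < 0$ with $P_{,X} > 0$), one finds $c_s^2 > 1$, which is the superluminal propagation at issue.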
As one might expect, something like superluminal propagation immediately raises the suspicion that the relevant theory may be fundamentally sick, and thus it is extremely important to investigate whether such theories are free of internal inconsistencies at both the classical and quantum level. This was the focus of the Michigan workshop.
Workshops are tricky things to get right. Too many people and too many talks, and it is really a conference, which can be great in its own right, but which typically lacks the intense group discussions and brainstorming environment for which one strives with a workshop. Too few people, or too few who are prepared to engage in open-ended and open-minded discussions, and the conversations stall. I’ve been to some real disasters on this last front in particular.
But last week’s workshop was just about right. There was a relatively small but very interactive group of invited participants, interested in aspects of the topic for a variety of different reasons, and with quite different approaches. The talks I saw (I wasn’t there for the whole workshop) were quite good (I particularly enjoyed a very clear and fun one from Andrei Gruzinov), but they weren’t the highlight for me. What I really enjoyed were the long, wide-ranging discussions after the day’s formal program had ended. In particular, the two hours at the blackboard with Andrei, Cedric Deffayet, Ken Olum, Vitaly Vanchurin, Alex Vikman and Richard Woodard, spent discussing both constraints on specific theories and the possibility of proving some general theorems about such complicated systems, were incredibly stimulating.
The hallmark of a successful workshop is not that some fundamental problem is resolved during it. Rather, it is that one leaves full of ideas, feeling that one understands enough more about the topic to have made the trip worthwhile. Both were true of the Michigan workshop. In fact, I left with three or four new ideas that I now need to find time to work on, and perhaps the right Ph.D. students to include. Thanks to Ratindranath Akhoury and Alex Vikman for organizing such a stimulating meeting!