Over the last few months (and this will certainly continue over the next few years) I have been spending some time boning up on particle physics phenomenology and the associated model-building issues. Part of my research involves investigating the cosmological implications of such models, while at other times I am interested in how certain outstanding cosmological questions might be addressed by new particle physics beyond the standard model. These, plus the upcoming turn-on of the Large Hadron Collider (LHC), are some of the reasons that I have been spending time on phenomenology.
I’ve been thinking about this particularly today after a nice seminar by Ian Low from the Institute for Advanced Study (IAS) in Princeton. The content of Ian’s seminar isn’t really what I want to discuss here, but part of what he spoke about got me thinking about a question I’ve wanted to get into for a while.
Most models of physics Beyond the Standard Model (BSM) are motivated by one of the outstanding problems of particle physics – the hierarchy problem. This is the problem of reconciling two wildly disparate mass scales: the weak scale (~10^2 GeV) and the Planck scale (~10^19 GeV). This hierarchy is technically unnatural in particle physics since, in general, the effect of quantum corrections (encoded in renormalization) is to drag the observable values of such scales much closer together in size.
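Schematically (this is the textbook illustration, with couplings and numerical factors suppressed, not tied to any particular model), the problem is that the Higgs mass-squared receives quantum corrections that grow with the cutoff scale of the theory:

```latex
% One-loop correction to the Higgs mass-squared from a particle
% with coupling \lambda, in a theory with cutoff \Lambda:
\delta m_H^2 \sim \frac{\lambda}{16\pi^2}\,\Lambda^2 .
% If the cutoff is the Planck scale, \Lambda \sim 10^{19}\ \mathrm{GeV},
% then keeping the physical Higgs mass near the weak scale,
% m_H \sim 10^2\ \mathrm{GeV}, requires the bare mass and the correction
% to cancel to roughly one part in 10^{34} in m_H^2.
```

It is this apparently miraculous cancellation, rather than the bare existence of two scales, that is meant by "technically unnatural."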
One approach to this problem is to introduce a mechanism that cancels many of the quantum corrections, allowing the scales to remain widely separated even after quantum mechanics is taken into account. An example of such a mechanism (and the most popular one, for sure) is supersymmetry (SUSY) with TeV-scale SUSY breaking.
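As a rough sketch of how the cancellation works (schematic only; the precise couplings and multiplicity factors are convention-dependent): scalar loops and fermion loops contribute to the Higgs mass-squared with opposite signs, and supersymmetry ties their couplings together so that the dangerous pieces cancel:

```latex
% Scalar loop (coupling \lambda_s) plus fermion loop (coupling \lambda_f):
\delta m_H^2 \sim \frac{1}{16\pi^2}\left(\lambda_s - 2\lambda_f^2\right)\Lambda^2 + \cdots
% Unbroken SUSY enforces \lambda_s = 2\lambda_f^2, so the \Lambda^2 pieces cancel
% exactly. With SUSY broken at a scale m_{\rm SUSY}, the residual correction is
% \delta m_H^2 \sim \frac{\lambda_f^2}{16\pi^2}\, m_{\rm SUSY}^2 \,\log(\Lambda/m_{\rm SUSY}),
% which stays comfortably near the weak scale provided m_{\rm SUSY} is around a TeV.
```

This is why TeV-scale SUSY breaking, specifically, is the natural version of the idea: push the superpartners much heavier and the residual correction grows back.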
Another perspective is to view the hierarchy problem no longer as a disparity between mass scales, but rather as an issue of length scales, or volumes. The general hypothesis is that the universe as a whole is 3+1+d dimensional (so that there are d extra, spatial dimensions), with gravity propagating in all dimensions, but the standard model fields confined to a 3+1 dimensional submanifold that comprises our observable universe. This submanifold is called the brane (as in membrane). The volume of the extra dimensions can be large, and because gravitational flux spreads into that volume, gravity as measured on our brane appears weak (parameterized by the large Planck mass) even though the fundamental scale of gravity may be as low as the weak scale.
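In this large-extra-dimensions picture, the observed Planck mass is a derived quantity, fixed by the fundamental gravity scale (call it M_*) and the volume of the extra dimensions – a standard Gauss's-law matching of the 4-dimensional and (4+d)-dimensional gravitational couplings:

```latex
% Matching 4D gravity to (4+d)-dimensional gravity over an
% extra-dimensional volume V_d \sim R^d:
M_{\rm Pl}^2 \sim M_*^{2+d}\, V_d \sim M_*^{2+d}\, R^d .
% Taking the fundamental scale near the weak scale, M_* \sim 1\ \mathrm{TeV}:
% d = 1 would require R of roughly astronomical size (long excluded),
% while d = 2 gives R in the sub-millimeter range, near the edge of
% tabletop tests of the Newtonian inverse-square law.
```

The hierarchy between mass scales is thereby traded for a hierarchy between the size of the extra dimensions and the fundamental length scale, which then becomes the thing one has to explain.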
Beyond the Standard Model ideas such as these have the added bonus of a natural connection with dark matter, since the new particles and symmetries that are introduced at the TeV scale typically yield a natural Weakly Interacting Massive Particle (WIMP) candidate.
In the last couple of years, a number of authors have begun exploring models of BSM physics that are unconstrained by addressing naturalness issues, and instead are guided only by requiring gauge unification and a dark matter candidate. The motivation for such models arises from considerations of the string landscape, but I categorically do not want to get into that in this post, or in the comments thread, since it has been discussed to death in many, many other threads. Another motivation that is often mentioned is that current collider constraints are pushing even low-scale SUSY models to need some fine-tuning when addressing the hierarchy problem.
An example of this kind of model is given by Split Supersymmetry (see here and here). In these models, since naturalness is abandoned, SUSY is broken at a high scale and the scalar superparticles (and the Higgs) become extremely heavy. It is arranged, however, for the fermions to remain light, so that they help with unification and one of them can serve as a dark matter candidate.
There exists a considerable literature on the collider signatures of this model and a great deal of follow-up work exploring other consequences. Unfortunately I cannot pretend to have read more than a small fraction of these papers and so certainly can’t comment on them.
As part of my continuing phenomenology education, I thought it might be interesting to have a discussion on the various pros and cons of the two broad approaches to BSM model building. I must confess up front that, so far, I haven’t found the newer approach particularly compelling. Beyond the obvious issue of abandoning naturalness, I think I prefer to have dark matter emerge as an output of the particle physics model, rather than an input. Nevertheless, while I am obviously very close to a lot of this material, I am not one of the experts on these models, and I am sincere when I say that I would be interested in a constructive pedagogical discussion of the pros and cons of the approaches. I guarantee that there are subtleties (and perhaps big glaring issues) that I am missing.
I realize I can’t enforce this, but, as mentioned above, I’d like to suggest a ground rule for the discussion. I don’t think there is anything to gain by rehashing the string landscape issues here. It is not what I intend, and we really have gone over it again and again before.
So, with this one caveat, please have at it. What are the pros and cons of BSM models constructed with naturalness in mind and those constructed ignoring naturalness considerations?