Constraints and Signatures in Particle Cosmology

By Mark Trodden | June 26, 2007 7:09 am

If you are really lucky, then you may have a great new idea about particle physics. It may be a way to address the hierarchy problem (why is gravity so much weaker than the known particle physics forces?), or to generate mass for fermions (after all, we haven’t found the Higgs yet), or to understand the flavor hierarchy (why are there three repeated families of particles in the standard model, with increasing masses?), or perhaps to unify all the forces into one (Grand Unification). Obviously, your obligation is to begin systematically computing the consequences of this idea for existing and future particle physics experiments.

Thirty years or so ago, with a few notable exceptions, this would have been the end of the story. But it has become increasingly clear to most physicists that there exists a complementary list of consequences that should be worked out: those for cosmology. These days, this approach is basically second nature to any of us who might have new ideas about how the micro-world works, and reflects the modern thinking that particle physics and cosmology are not distinct disciplines, but are two sides of the same set of questions.

So, parallel to the cross-section and decay rate calculations, what are the most common cosmological areas in which one currently looks for further constraints on one’s new particle physics idea? What new questions do you need to ask yourself?

  1. Does your theory contain any new long-lived elementary particles? If it does then you better watch out and you better beware. You see, such particles may quit interacting with other species in the relatively early universe (if they are weakly coupled) and so maintain a rather high abundance as the universe cools. Because of this, a relatively straightforward calculation (a rough version of which is sketched just after this list) shows that they can rather quickly become the dominant contributor to the matter content of the universe. This can be a real disaster, given how much we know about the cosmic expansion history, and is to be avoided. The couplings, masses and lifetimes of such particles therefore need to be such that they either never dominate the energy budget of the universe, or make just the right contribution to be interesting (see my second list below).
  2. A related problem can arise if your theory contains long-lived particles that are too light. If too many of them are around when structure is trying to form, then because they are light they typically move at relativistic speeds and stream through overdense regions, smoothing them out and ruining structure formation.
  3. Does your theory contain any new topological defects, such as monopoles, domain walls or cosmic strings? If the vacuum structure of your particle physics theory is sufficiently topologically complex, then any symmetry breakings that occur may lead to trapped regions of false vacuum that cannot decay. If so, then many of the constraints mentioned for long-lived elementary particles may apply to these objects. In addition, some topological defects can form networks that redshift more slowly than matter, coming to dominate at a later time in the universe, or can generate a spectrum of gravitational radiation that is in conflict with our detailed measurements of millisecond pulsar timing. If this last constraint is a problem, then it is also possible that the defects unacceptably distort the spectrum of the Cosmic Microwave Background radiation (CMB).
  4. In the early universe, does your theory significantly alter either the matter content or the expansion rate of the universe during the formation of the light elements – Big Bang Nucleosynthesis (BBN)? This can be an immediate death blow, since the remarkable agreement between measurements of the abundances of the light elements and those predicted within the standard cosmology is one of our triumphs and our earliest direct test of the Big Bang model.
  5. Going further back in time, does any of the new physics in your model lead to new sources for density (or metric) perturbations? If so, when you process these through cosmic history, what does the resulting spectrum of the CMB look like, and how does it correlate with the related prediction for the spectrum of large scale structure? What about the expected results of weak lensing studies? How do all of these compare with the wonderful data that has poured in over recent years?
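
To make item 1 above a little more concrete, here is a minimal sketch of that “straightforward calculation” in Python. The 3 x 10^-27 cm^3/s benchmark is the standard approximate result for a stable particle that freezes out of thermal equilibrium while non-relativistic; the little helper function and the particular cross-sections plugged into it below are purely illustrative.

    # Approximate relic abundance of a stable thermal relic:
    #   Omega_X h^2  ~  3e-27 cm^3/s  /  <sigma v>
    # The more feebly the particle annihilates, the earlier it freezes out
    # and the more of it survives to dominate the matter budget.

    def omega_h2(sigma_v):
        """Relic density parameter for an annihilation cross-section <sigma v> in cm^3/s."""
        return 3e-27 / sigma_v

    # A weak-scale cross-section (about a picobarn times c, ~3e-26 cm^3/s)
    # lands near the observed dark matter density:
    print(omega_h2(3e-26))   # ~0.1: interesting (see the second list below)

    # A particle that annihilates a thousand times more feebly
    # overcloses the universe many times over:
    print(omega_h2(3e-29))   # ~100: the disaster described in item 1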

If your big new idea passes all these tests (and others I haven’t mentioned) then you may really have something. If this is all there is to it, then you can be happy that your new construction gives rise to novel particle physics phenomena, while remaining safe from cosmological constraints.

However, one might be able to do better. While our underlying model of cosmology is in remarkable agreement with our ever-increasing stream of data, there are a number of critical areas where we are, no pun intended, in the dark. It may be that your new idea can help with some of these genuine cosmological conundrums. What should you look for? While the list is increasingly long these days, here are some common ideas.

  1. Got WIMPs? There are lots of connections between new particle physics (particularly beyond-the-standard-model physics addressing the hierarchy problem) and dark matter. Perhaps you have a dark matter candidate in the theory. You’ll need to check to see if there is a long-lived (stable for all intents and purposes) particle with couplings of the appropriate strength (weak, or below) and mass in the right range. And it needn’t be a WIMP (Weakly Interacting Massive Particle). Maybe there’s an axion, or even a WIMPZilla.
  2. There are a number of hints that the highest energy cosmic rays may require exotic new physics for a complete understanding. Above a certain energy (the Greisen-Zatsepin-Kuzmin (GZK) cutoff), protons from cosmological distances shouldn’t reach us at all, because they would scatter off CMB photons (a rough estimate of the cutoff energy is sketched just after this list). This has led people to speculate that any ultra-high-energy cosmic rays (UHECRs) observed above the cutoff may be a signature of new particle physics. Does your theory contain any particles or phenomena that could allow this to happen, and what spectrum of UHECRs should we expect? Some of those topological defects I mentioned above may be an example.
  3. You don’t, by any chance, have any unnaturally weakly coupled heavy scalars out there do you? Because we’re looking for an inflaton to do all the early universe’s heavy lifting. Your candidate should be able to quasi-exponentially expand the universe, flatten out its spatial hypersurfaces, causally connect seemingly unconnected regions of the microwave sky, generate all the matter content of later epochs (reheating) and imprint upon it the density perturbations necessary to seed our observed large scale structure.
  4. Come to think of it, there isn’t an alternative mechanism to inflation in your theory, is there? It is fair to say that inflation is our best current idea about what happened in the early universe, but it is not without its problems, and an attractive competitor would be very welcome. Good luck though – that list of requirements is pretty hard to satisfy.
  5. Now, generating the matter is one thing, but you’ll typically create an equal amount of antimatter, which will annihilate with the matter and leave very little behind to form all that lovely structure, never mind us. What you really need is a way to create an asymmetry between matter and antimatter (in fact baryons and antibaryons are what we care about) – a baryogenesis mechanism. Perhaps your inflaton candidate is exotic enough to generate this as part of reheating. Perhaps there are asymmetric decays of heavy particles in your theory, or maybe a way to make nonperturbative baryon-number-violating transitions work. You should get on that right away!
  6. The 800 pound gorilla in the room these days is, of course, cosmic acceleration. Do you address the cosmological constant problem? If not, is there a dark energy candidate in your model? This one would be wonderful, but don’t stress if you don’t have anything to add here – you’re in good company.
  7. Actually, since we’re now on to things that seem incredibly difficult to explain, your theory isn’t able to tell us why there are 3+1 (space+time) dimensions, is it? That would be just great.
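
To put a number on item 2 above, here is a rough sketch of the famous GZK threshold for photo-pion production off a typical CMB photon. This is just the head-on kinematic threshold; the commonly quoted cutoff of around 5 x 10^19 eV comes out somewhat lower once you integrate over the full CMB spectrum and fold in the proton’s attenuation length.

    # Threshold for p + gamma_CMB -> Delta -> N + pi, for a head-on collision:
    #   4 * E_p * E_gamma >= m_pi * (2 * m_p + m_pi)   (ultra-relativistic proton)
    m_p = 0.938e9    # proton mass in eV
    m_pi = 0.135e9   # neutral pion mass in eV
    E_cmb = 6e-4     # typical CMB photon energy in eV (T ~ 2.7 K)

    E_threshold = m_pi * (2.0 * m_p + m_pi) / (4.0 * E_cmb)
    print("GZK threshold ~ %.1e eV" % E_threshold)   # roughly 1e20 eV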

As you can see, modern cosmology has yielded a lot of hurdles for any up-and-coming particle theory to clear. It’s a tough new world out there. On the other hand, look at all the macroscopic problems your new microscopic theory may be able to address. The above lists certainly aren’t exhaustive – I have definitely missed out important constraints – but, more importantly, there are probably other crucial connections between particle physics and cosmology out there with which to constrain theoretical ideas, just waiting to be discovered, perhaps by you!

CATEGORIZED UNDER: Science
  • http://theeternaluniverse.blogspot.com Joseph Smidt

    In the future, when I finally come up with “a new amazing theory” I’ll make sure to use this post as a checklist. :)

  • Paul Stankus

    Hi Mark –

    Here’s an example of a recently-discussed new possibility in particle physics that should bump up against cosmology constraints, but I’ve never heard that angle mentioned anywhere:

    Non-crazy people talk about the possibility that real, live micro-black holes will — soon! — be created in LHC collisions, _if_ there are in fact large extra dimensions (one approach to the hierarchy problem). If this were true, then wouldn’t the early universe have been thick with black holes whenever the temperature was above the TeV scale?

    Do these curled-up extra dimensions expand (or contract?) along with the overall expansion of the universe? In general, how would large extra dimensions affect the early universe? Can you give us a pointer to where this may have been discussed already?

    Thanks,

    Paul Stankus

  • http://www.doonboggle.blogspot.com DoonboggleFrog

    I have discovered a wonderfully elegant proof for my theory of everything that meets all those constraints and more. Unfortunately, I lack the space here, so I leave the derivation to the reader.

    (apologies to M. Fermat)

  • http://evolutionarydesign.blogspot.com/ island

    Einstein resolved all these problems long ago, only he didn’t know about the particle potential of the quantum vacuum. Simple. As. That.

    Try generating massive particles from the “dark” energy of his finite static model and see what happens to all of your problems, (stability, monopoles, the horizon and flatness problems, etc… etc… number 8), when this causes the universe to expand by way of vacuum rarefaction.

    Or at least tell him, (not me), why you think that this doesn’t work, because somebody already owes him an explanation that doesn’t require that I do a damned thing more than to point this out.

  • http://blogs.discovermagazine.com/cosmicvariance/mark/ Mark

    Please try not to kill threads with this stuff island.

  • http://evolutionarydesign.blogspot.com/ island

    Sorry Mark, I thought that Sean had made the post, and I thought that I was talking to he who should be able to recognize the point. Feel free to do whatever you want to fix your thread, Mark.

  • http://blogs.discovermagazine.com/cosmicvariance/mark/ Mark

    I don’t know what that means either island. As someone intimately acquainted with the Einstein static universe, I don’t see any relevance to the post at all, no matter who authored it. In any case, I’m not going to edit anything at this point, I just don’t want this thread hijacked as happens with many others.

  • Coin

    I shall now use this checklist to evaluate my revolutionary new theory: That the universe is made out of ice cream.

    1. Does your theory contain any new long-lived elementary particles?

    No, the ice cream melts.

    2. A related problem can arise if your theory contains long-lived particles that are too light

    We can define the ice cream to be very thick. Like say it is Wisconsin style ice cream.

    3. Does your theory contain any new topological defects, such as monopoles, domain walls or cosmic strings?

    Chocolate chips.

    Oh, dammit.

  • http://kea-monad.blogspot.com Kea

    Coin, you forgot to discuss ice cream flavours – is there Cookies’n Cream?

  • J

    I want to say that this sort of problem might not even exist in the so-called “new theory”. They are nothing more than false questions arising from today’s theories.
    Another thing: I think we need much more cosmological data, so that we will be able to prove or exclude those endless theoretical models.

  • Illirikim

    Suppose you were constructing a theory of the *very earliest* universe. What would the list of desiderata look like then? I’m sure Sean would say that you ought to be able to explain the low entropy, and Mark himself would demand an explanation of how inflation got started… any other examples?

  • Coin

    Coin, you forgot to discuss ice cream flavours – is there Cookies’n Cream?

    Ah-ha! Dark matter!

  • Elliot

    Mark,

    Thank you for a very cogent and interesting post. It is very illuminating to the novice to see what area has already been covered and thus can be safely bypassed by the existing body of knowledge. A “plankton” (hypothetical particle which forces the surface area of a domain to conform with the holographic principle by expansion/contraction based on total information content) could be an inflaton candidate. It also addresses dark energy in a very provocative way. But still extremely speculative. Still lots of work to do.

    Elliot

  • Torbjörn Larsson, OM

    Coin,

    Nice idea. And as the ice cream expands, it will cool to a preferable temperature.

    Btw, I used to like nuts in my ice cream. This thread makes me remember why I lost that particular taste.

  • Ryan

    Excellent post Mark. I agree with you, but I have questions:

    In light of your post I was wondering what you (or others) think about what Richard Lieu had to say in his astro-ph paper (arxiv:0705.2462v1): “LCDM cosmology: how much suppression of credible evidence, and does the model really lead its competitors, using all evidence?”

    The first sentence in the abstract is: “Astronomy [or cosmology in this context] can never be a hard core physics discipline, because the Universe offers no control experiment, i.e. with no independent checks it is bound to be highly ambiguous and degenerate.” His take on cosmology seems to be diametrically opposed to yours. For you, cosmology constrains possible particle theories; for Lieu, “cosmology is not even astrophysics”, meaning cosmology can’t be trusted as much as normal physics since it’s based on assumptions/theories/phenomena that can’t be well explored because of … wait for it … cosmic variance.

    So, what do you think? Should we trust our current understanding of cosmology so much as to limit the possible particle theories we even consider?

  • noname

    I agree with the comments above: excellent post, Mark.

    As for Lieu’s paper, I have only skimmed it, but I think it is way off base. Cosmology has already made contributions to fundamental science: for instance, cosmology was historically the most effective way of constraining the number of neutrino species, and remains the best way to constrain neutrino masses today. In fact, unless I am mistaken (I don’t know much about this subject), cosmology is the only way of measuring the absolute scale of neutrino masses, since neutrino oscillation experiments only measure differences between the squared masses.

    Now, however you slice it, the supernova results are absolutely fundamental in nature: either dark energy exists, or GR is not valid on horizon scales. Either option represents a whole area of hitherto unknown physics, and constitutes a fundamental contribution to science. Thus, the idea that cosmology is not a science and has little to contribute to fundamental physics is ridiculous.

    Then there is the additional point that Mark has already addressed: any new elementary theory has a whole set of cosmological observations that it needs to address before it can be taken seriously. These observations present as hard a requirement on particle physics as any collider experiment does.

    I could go on and on about this, but I just find the Lieu paper too ludicrous to be worth talking about. For instance, his “plausible alternative models” are Omega_m=1 models. The very fact that we can observe clusters of galaxies at redshifts z>1 is, I believe, enough to rule out these models. So all in all, Lieu’s paper reads a bit more like an incoherent rant than a serious objection.

  • http://deferentialgeometry.org/ Garrett

    Thank you Mark, I’ll keep these in mind.

  • alephnull

    If you are in two minds about folding your feeds, I’d like to vote against it.

  • Monte Davis

    …your theory isn’t able to tell us why there are 3+1 (space+time) dimensions, is it? That would be just great.

    Mm-yeah… I can’t help hearing Bill Lumbergh from “Office Space” there. (Intended?)

    This is a valuable companion to the assorted “how to recognize crackpot New Physics” essays around the Internet. Even some otherwise good science writers can get carried away with “we don’t even know the origin of mass” [or whatever], which can give laymen (and crackpots) the idea that physicists are flailing in total darkness at the foundations.

    Well, yeah, it’s dark down there — but these constraints, collectively, have a lot to tell us about where the yearned-for illumination isn’t going to come from.

  • http://evankeane.wordpress.com/ evankeane

    A great post – good work.
    :)

    Evan Keane

  • http://evankeane.wordpress.com/ evankeane

    shame about the ice cream theory …

  • Ryan

    noname,

    I agree, cosmology can contribute to fundamental science. I also think the tone of Lieu’s paper is abrasive and counterproductive. The only interesting point he brings up is the question of how much we can trust cosmology keeping in mind the problem that we can only observe one universe.

    I agree that the supernova result is fundamental, but isn’t it fraught with more uncertainties than data coming from Fermilab, for example? It seems to me that the dichotomy you set up – “either dark energy exists, or GR is not valid on horizon scales” – is false. It could be that our understanding of the complexities of Type Ia supernovae needs a little tweaking.

    Please understand that I don’t “believe” in some crazy alternative cosmology. I am aware that dark energy has more support than just the supernova data, I just think he brings up an interesting question:

    Should particle theorists put any effort into a theory that can’t successfully pass the “hurdles” of cosmology even though it may be very successful in other areas?

  • noname

    Hi Ryan.

    I think that if a particle physics theory fails one of the fundamental cosmology tests, then it’s automatically dead: the theory has made a prediction that was contradicted by data. If the theory is REALLY appealing for other reasons, and it passes the majority of tests, I could imagine working on it: maybe light element abundances are due to some out-of-equilibrium process or something. To be honest though, something like BBN is so fundamental and is so neatly explained today that I personally would doubt that radically different approaches will be as successful as the BBN picture. In fact, I think of it this way: if you come up with a great new theory that neatly explains dark matter, dark energy, and inflation, that would be incredible! If it also predicts protons are unstable, not so good. However, maybe you think that the successes of the theory are large enough that it merits still working on it, since maybe you’ll find a tweak that can make protons stable again. In my mind, that is pretty analogous to what I would think about a theory that has many successes, but ruins BBN or the CMB.

  • Matt

    I have three physics questions that I haven’t been able to clear up. I hope that one purpose of this blog is educational, and at least one of my questions seems tangentially related to the posting, so I figured I’d give it a try and ask.

    What is the status (if any) on entropic (not anthropic) arguments for the current value of the cosmological constant? De Sitter space appears to have a finite entropy that depends on the value of the cosmological constant, S~1/Lambda, and although we are not in pure de Sitter today, there is a cosmological constant, as well as lots of matter and radiation. Have people written papers addressing the question of whether the cosmological constant is somehow stabilized by the second law of thermodynamics, in that its current value is somehow related to maximizing the entropy or the rate at which entropy grows with time? Naively, one might think that this could be related to the cosmic coincidence problem, since presumably the cosmological constant, matter, and radiation all contribute entropy to the universe, and when they are all of order the same density their contributions are in some kind of competition.
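
    (For concreteness, the S ~ 1/Lambda scaling I have in mind is just the usual horizon-entropy formula applied to the de Sitter horizon radius; in units with hbar = c = k_B = 1,

    \[ S_{\mathrm{dS}} = \frac{A}{4G} = \frac{3\pi}{G\Lambda}, \qquad A = 4\pi r_H^2, \quad r_H = \sqrt{3/\Lambda}, \]

    so a smaller cosmological constant corresponds to a larger horizon entropy.)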

    And now for some really dumb questions that are less related to the current discussion, but, what the hay. If you’re too ashamed to ask a question, then you remain always ignorant, and I couldn’t find a better place to ask. Maybe there are others here who are equally perplexed, and this might be of service to them.

    I’m reasonably familiar with confidence intervals in probability theory and statistics, but I must confess that I am still unclear about what is meant when people say that they have experimental evidence for the existence of some particle “at three sigma.” What precisely do they mean by this statement? That they can see the resonance peak of the particle out to three standard deviations, or that they’ve repeated the experiments many times and the theoretical prediction lies within some number of standard deviations from the mean experimental result, or what?

    Also, I increasingly see in papers the statement that so-and-so quantity is “parametrically bigger” or “parametrically smaller” than something else, usually when the difference between the quantities is very large and usually with an exclamation point to indicate happy excitement about the fact. Do they just mean the prosaic statement that you have some parameter in your theory, alpha (as opposed to a fixed number, like sqrt(2)), that is small, and alpha determines the relative magnitude of the two quantities being compared? Am I missing something? (Clearly.)

    Thanks!

  • http://nige.wordpress.com/ nigel

    My research in a simple causal mechanism of gauge boson exchange addresses all these problems successfully: http://quantumfieldtheory.org/1.pdf

    The problem I’ve found is that once you hit on something that does agree with nature, how far should you then go in applying it yourself? Particularly if you have family and work commitments and have limited time? Newton allegedly spent 22 years (1665-87) working out the consequences of his initial idea. He published finally after a priority dispute with Hooke over the inverse-square law.

    If you get bigoted responses from egotists and “crackpot” abuse from the mainstream, probably the best thing to do is to go ahead and apply the basic concepts as far as you can. Boltzmann’s problem was getting depressed by other people’s ignorant reactions. If certain people aren’t interested in new ideas or are prejudiced against you, it’s only your problem if you are dependent on them. If those people are just a nuisance to everyone, then it’s better not to waste too much time in arguments (just enough to prove you made some effort and to document the ignorant hostility or abuse you receive in return). Not making any effort to communicate information is just as dangerous as making too much effort (hype) to do so, because potential opportunities for fruitful discussions will be lost. The first priority is applying the science.

  • http://arunsmusings.blogspot.com Arun

    When one applies the cosmology checklist to string theory, what interesting constraints emerge? (perhaps a topic for another post?)

  • http://blogs.discovermagazine.com/cosmicvariance/mark/ Mark

    It’s a good question, Arun. I’m giving a set of lectures related to this later in the year and will probably write a proper post on it then.

  • http://countiblis.blogspot.com Count Iblis

    Often, if some simple idea fails, you can revive it by including interactions. E.g. a straightforward explanation for the result of the PVLAS experiment other than experimental error is problematic because of null results from other experiments and observational constraints.

    But all of the constraints can be evaded by considering more complicated models. As a result, a steady stream of papers has been appearing on this topic for some time now :)

  • Thomas D

    Actually the PVLAS signal just disappeared:

    http://arxiv.org/abs/0706.3419

    so the models may have to become even more complicated to explain the disappearance of the effect that they were invented for.

  • Hag

    Should there be any constraint on the time of electroweak symmetry breaking? I ask this because I have heard some criticisms of the inflationary model in this respect, coming mainly from Penrose. The criticisms rest on the fact that the SU(2)xU(1) gauge symmetry is broken AFTER the period of inflation. So any non-smoothness (with respect to the gauge connection in the SU(2)xU(1) bundle over space-time) in the specific way it was broken over space-time (resulting from the randomness of the breaking) wouldn’t have had time to be smoothed out. So we could have non-equivalent notions of photons and W and Z bosons.

    To be more explicit, it would not explain why we see the same kind of photons coming from opposite sides of the Universe (since they would not have had causal contact after the inflationary period).

  • http://tyrannogenius.blogspot.com Neil B.

    I don’t have the idea, but someone should address the inherent chirality in the universe, as shown by the classic Co60 experiments in the 50s. That ties in loosely with the question of neutrino masses, since neutrinos have that chirality preference (and, I still don’t really get the explanations earlier about the two different kinds of spin/chirality and their different transformation properties: it seems to me, angular momentum is what it is.)

  • Van

    Neil,
    Only the left-handed states participate in the weak interactions. Thus, the weak interactions violate parity. Other than this, there is no fundamental difference between left-handed and right-handed chirality.

  • Paul Stankus

    Greetings,

    Before it gets buried too deeply I’d like to re-pose the question asked up in comment #2: What effect would large extra dimensions have on cosmology? and, so, what constraints can cosmology put on the existence of large extra dimensions?

    I asked this question just yesterday after a seminar on grand unification through extra dimensions, and was told a number of strange things; for example, that if large extra dimensions exist then post-inflation re-heating must have been at a very low temperature, not above a GeV (!). So it seems like there must be some strong connections there. Can anyone enlighten me further?

    Thanks,

    Paul

  • http://nige.wordpress.com/ nigel

    The chirality issue is addressed in the Standard Model just by the SU(2) charge, isospin, being zero for all right-handed Weyl spinors (see the table here). The right-handed particles simply can’t interact with the massive weak gauge bosons, although they still see the same electromagnetic force. Below the electroweak unification energy, the weak gauge bosons gain mass, and these massive weak gauge bosons can’t interact with right-handed particles. (The loss of the weak isospin charge for right-handed particles is compensated for by their increased weak hypercharges.)

    What’s interesting is that there are severe issues with the U(1) electromagnetic and weak hypercharge gauge field. Sheldon Glashow and Julian Schwinger in 1956 tried to use SU(2) to unify electromagnetism and the weak interaction by having the two charged vector bosons be the mediators of weak interactions and the neutral vector boson the mediator of electromagnetism. I.e., they tried to use SU(2) for electroweak unification! Glashow comments in his 1979 Nobel lecture:

    “Things had to be arranged so that the charged current, but not the neutral (electromagnetic) current, would violate parity and strangeness. Such a theory is technically possible to construct, but it is both ugly and experimentally false [H. Georgi and S. L. Glashow, Physical Review Letters, 28, 1494 (1972)]. We know now that neutral currents do exist and that the electroweak gauge group must be larger than SU(2).”

    SU(2)xU(1) gives four vector bosons, two charged and two neutral. However, it implies that all leptons are singlets (in fact they are only formed in lepton-antilepton pairs) and it doesn’t include a gravity vector boson which you’d expect to be found. An alternative would be a second SU(2) group, i.e., SU(2)xSU(2), which gives 6 vector bosons, i.e., the usual 3 weak gauge bosons and another 3 which can be always massless and thus long-range forces. Another option would be that the Higgs mechanism is wrong, and the correct electroweak group is just SU(2), in which some of the 3 massless gauge bosons (2 charged, 1 neutral) acquire mass and interact with left-handed particles.

    When you examine well known anomalies in electromagnetism carefully, it is easy to model electric fields as products of positive and negative charged exchange radiation (usual objections to massless charged bosons propagating are erased in the case of exchange radiation due to cancellation of the curls of the magnetic fields of electrically charged gauge bosons passing through one another in equilibrium), while gravity is electrically uncharged massless gauge bosons. This tells us that electromagnetism is ~10^40 times gravity in a simple way by taking a Brownian motion like a path integral (electric fields add up like diffusion if there are equal positive and negative charges scattered around) for the ~10^80 charges in the observable universe.

  • Van

    SU(2) x U(1) implies that all leptons are singlets? I don’t think so. The left-handed charged and neutral leptons form a doublet under the electroweak symmetry, just the same as the up- and down-type quarks.

  • http://tyrannogenius.blogspot.com Neil B.

    “well known anomalies in electromagnetism” – huh? I didn’t think there were anomalies in “electromagnetism” as such. Do you mean odd properties of particles that relate to EM, or do you mean problem issues like the field energy being more than the particle mass below a certain radius (and therefore not just a “quantum” issue!)?

    About chirality: it’s the discussion from Diogenes (#24) in “MiniBooNE Neutrino Result – Guest Blog from Heather Ray” that I didn’t get, but thanks for other background info:

    You are completely correct, and this effect is included in the theory of neutrino masses as appears in standard texts. The projection of the neutrino spin along its direction of motion is its “helicity”. There is another measure of the “handedness” of a relativistic spin one-half particle called “chirality” that is easily defined in terms of the matrices appearing in the Dirac equation, and this measure does NOT depend on the reference frame in which you observe the particle. For massless particles these measures of “handedness” agree, but for massive particles they can disagree for exactly the reason you describe, i.e. by moving past the particle you can reverse its direction of motion, hence its apparent helicity.

    I don’t get the difference here, and which corresponds to plain “angular momentum”?

  • Van

    Neil,
    The helicity is the direction of the particle’s spin as it moves. You can determine the helicity by using the right-hand rule: point your thumb in the direction of motion, and the direction (clockwise or counterclockwise) your fingers curl around is the helicity. The chirality is right-handed or left-handed depending on which hand you need to use for this to work. For a massless particle, which travels at the speed of light, you can never move past it and reverse the direction of motion and thus the helicity. However, for a massive particle you can do this, so massive particles have both helicities. Massless particles, such as the photon, or the neutrino in the old days when it was thought to be massless, will only have one helicity or the other. The Standard Model neutrinos are massless and purely left-handed (and the anti-neutrinos purely right-handed). The discovery that neutrinos have mass means that there exists a right-handed helicity state for the neutrino as well.
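
    (Stated a bit more formally, and just quoting the textbook definitions: helicity is the projection of the spin along the direction of motion, while chirality is the eigenvalue of gamma_5, picked out by the projectors

    \[ P_{L,R} = \tfrac{1}{2}\left(1 \mp \gamma_5\right), \qquad h = \frac{\vec{S}\cdot\vec{p}}{|\vec{p}|}, \]

    and the two notions coincide only for massless particles.)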

  • Van

    Nigel,
    Regarding your idea of replacing the U(1)_Y factor gauge group with an SU(2)_R group, perhaps you should have a look at the Pati-Salam and/or left-right symmetric models: http://en.wikipedia.org/wiki/Pati-Salam_model

  • http://nige.wordpress.com/ nigel

    Hi Van, glad you got my point. Thank you very much for referring to the Pati-Salam model, SU(4) x SU(2)_L x SU(2)_R. Yes, I am interested in something that looks nearly identical, like SU(3) x SU(2)_L x SU(2)_R. However, SU(4) x SU(2)_L x SU(2)_R is different in many ways. They chose that not due to experimental evidence or unique quantitative predictions it can make, but because it can undergo spontaneous symmetry breaking to produce exactly the existing Standard Model, so that the Higgs field at low energy causes SU(4) x SU(2)_L x SU(2)_R to produce SU(3)xSU(2)xU(1). At high energy where the symmetry is unbroken, it is a grand unification theory.

    This approach has many problems both in methodology and in checking it. 1) It assumes the Standard Model is totally correct at low energies and it assumes that forces do unify at very high energy. 2) It doesn’t make immediate predictions or post-dictions of the strength of gravity, cosmological effects, etc., that can validate the approach. 3) It doesn’t actually seem to make any long-term checkable predictions that are useful. 4) It doesn’t seem to make things simpler with regard to the Higgs field or the masses of different fundamental particles, which is the cause of most of the adjustable parameters in the existing Standard Model. 5) It doesn’t seem to help resolve existing problems in physics or to point in the direction of simple mechanisms to improve understanding. 6) It doesn’t get rid of U(1) at low energy, since U(1) emerges at low energy as a result of the symmetry breaking they are assuming. 7) It’s a theory built on speculation instead of on empirical observations.

    SU(2) does have several advantages in describing leptons as doublets: pair production produces lepton-antilepton pairs. A conversion of 100% of positrons into upquarks and 50% of electrons into downquarks in the big bang would explain the alleged lack of anti-matter in the universe: it’s locked up by quark confinement in nucleons (the universe is mainly hydrogen, an electron, downquark, and two upquarks). One simple mechanism based entirely on mainstream QFT is that the electric field of the core of a lepton is shielded by the polarization of pairs of virtual fermions around it. The virtual fermion pair production is, Schwinger showed, a result of the electric field of the electron core which extends out to about 1 fm radius where the electric field is above the threshold of 1.3*10^18 V/m required for pair-production. If at high energy in the big bang (very early times), N electrons were crowded together in a small space (against the Pauli exclusion principle), the polarization of the vacuum would be stronger, so the shielding factor due to the vacuum would be N times bigger. Thus, 3 electrons crowded together in a tiny space would still only give an overall electric charge of e; the contribution from each electron would be e/3 due to the extra shielding by the stronger polarized vacuum. This is just a simple heuristic mechanism for fractional charges. The energy conservation issue then comes to the fore: what happens to the 2/3rds of the electric charge energy (that is now being shielded by the stronger, shared vacuum polarization around the triplet)? Clearly, that energy is stopped at very short distances by the vacuum and used to produce loops of virtual particles which mediate short-range interactions. To avoid violating the Pauli exclusion principle (which would prevent a triplet of three identical quarks, since there are only two spin states available), colour charge must appear. This suggests that ‘unification’ of all forces doesn’t occur at very high energy: the colour charge is powered by short-range vacuum loop effects and decreases towards zero when you are close enough to the particle core that there is no room for the vacuum to polarize (i.e. no space for virtual fermion pairs to move apart along the lines of the radial electric field).

  • Van

    Nigel,
    It sounds to me like you are looking for a left-right symmetric model:

    http://en.wikipedia.org/wiki/Left-right_symmetry

    Regarding the rest of your post, it really doesn’t make much sense to me.

  • http://blogs.discovermagazine.com/cosmicvariance/mark/ Mark

    I’m rushing to get ready to leave on a trip so can only give a very brief answer to Paul (#33). One way in which extra dimensions are constrained by cosmology is that radiation from our brane into the extra dimensions provides another way for the 4d universe to cool (apart from the usual redshifting of photons due to the expansion). Since we have an extremely good handle on the rate of cooling as far back as nucleosynthesis, it better be that any modified cooling rate is only efficient above a few MeV or so. This puts bounds on the size and number of extra dimensions.

    There are many other things, and I’ll try to post more when I get a little time.
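
    (For orientation, the quantity such bounds feed into is the standard large-extra-dimension relation between the fundamental scale M_* and the observed Planck mass; very roughly, for n flat extra dimensions of common size R,

    \[ M_{\mathrm{Pl}}^2 \sim M_*^{\,n+2} R^{\,n}, \]

    so a limit on how efficiently energy can leak into the bulk translates into a limit on R and n for a given M_*.)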

  • spaceman

    Lieu’s paper is a semi-professional version of the dime-a-dozen anti-Big Bang tirades which, unfortunately, seem quite common. The paper stops short of the typical weak “what-if strategy” of these alternative cosmologies, e.g. what if the dominant force in determining how matter is distributed universally is electrical rather than gravitational, what if redshift does not indicate distance, what if… what if pigs could fly and write sonnets. However, it does raise the following question: Why is there so much aversion and hostility toward Big Bang cosmology, as only a third of US citizens believe in the Big Bang and the internet is full to the brim with “the Big Bang never happened” blogs?

  • Jim Graber

    Mark, I wonder if you would like to comment on these two recent weak gravitational lensing observational results, one of which seems in line with standard dark matter expectations, and one of which seems to raise questions?

    http://www.arxiv.org/PS_cache/arxiv/pdf/0705/0705.2171v1.pdf

    http://www.arxiv.org/PS_cache/arxiv/pdf/0706/0706.3048v1.pdf

  • http://backreaction.blogspot.com/ Stefan

    Thanks for this great overview!

    Concerning the GZK cutoff: it seems to be there as predicted – seen in recent data both by HiRes (arXiv:astro-ph/0703099v1) and AUGER (arXiv:0706.2096, see Fig. 6). There will probably be more news next week from the International Cosmic Ray Conference ICRC’07 in Mexico. In the meantime, Bee at backreaction explains the GZK cutoff in detail.

    It looks as if “boring” nuclear physics rules – data are consistent with interactions of cosmic ray protons with CMB photons as expected from boost invariance.

  • Pingback: Seed's Daily Zeitgeist: 6/27/2007 » Chymistry
