The LHC, the Tevatron, and the Higgs Boson

By John Conway | July 27, 2011 2:37 pm

A few weeks back I wrote about the remarkable milestones passed by the Tevatron and LHC, and prognosticated that if there was ever a time when new discoveries could come out rapidly, this was it, especially for the LHC experiments analyzing a data sample 30 times larger than the previous one.

The result? Nature is being coy – in basically every search for new particles and phenomena conducted by the CMS and ATLAS experiments at the LHC, we see naught but eerie agreement with the predictions for ordinary standard model background.

A huge raft of results has been presented at two large international conferences: the annual European Physical Society meeting on high energy physics in Grenoble, France, and the Particles and Nuclei International Conference (PANIC11) at MIT in Cambridge, MA. I presented the CMS results on the searches for the Higgs boson at the latter on Tuesday…more on that below.

There is a trove of material available online for these two conferences and at the ATLAS and CMS public physics result web sites. Let’s just look at a couple examples, though.

The theme of the PANIC conference was the centennial of Rutherford’s discovery of the nucleus, and what better way to celebrate it than to essentially perform his experiment with a million times more energy, and peer inside quarks to see if…well, to see if there is an inside, to see if they have substructure. Naturally, to do the experiment we smash quarks together and look for any hint that there is something smaller inside, which would manifest itself as an excess of particle jets coming out sideways to the beam, much as Rutherford’s students Geiger and Marsden saw alpha particles deflected from their gold foil at angles far too large for the prevailing model of the atom to explain.

But in the first graph here all we see is a smooth spectrum, agreeing exceedingly well with the predictions, extending out to huge energies…no bumps, no excess in the tails, and no excess of jets coming out sideways. In one fell swoop we’ve extended the limit on the size of quarks down by a factor of three or four. As far as we can tell, quarks are pointlike. It’s the subject of the first paper from CMS using 1 fb-1.

Another huge effort went into searching for evidence that there may be supersymmetric partners of the known fermions and bosons – a search that has been underway for the past three decades. If supersymmetry is present in nature it would help solve a theoretical riddle as to why the calculated mass of the much-touted Higgs boson can remain stable in the face of enormous quantum mechanical corrections. Supersymmetry would provide a mechanism to largely cancel these corrections.

Supersymmetry could show up in a variety of ways at the Tevatron and LHC, but with three and a half times more energy than the Tevatron, the LHC has a huge advantage in this search, and already with last year’s sample of data the LHC experiments blew past all of the Tevatron exclusion limits.

And now with thirty times more data, in search channel after search channel, the story from CMS and ATLAS is the same: no hint of supersymmetry is evident anywhere. You shouldn’t take this statement to mean that supersymmetry cannot exist. All we can say at this point is that if it does exist, in a generic, simple version of supersymmetry called mSUGRA, the masses of the partners of the quarks appear to be very heavy, over 1 TeV. The heavier they are, the less effectively they can cancel the corrections to the Higgs boson mass. And theorists are very inventive, and are thinking about supersymmetry models that might not show up so easily in our experiments [see the comment below by Matt Strassler].

And what of the Higgs boson? Here, I must say, the story is becoming very interesting. The LHC experiments have a big advantage over the Tevatron, and the bottom line is that for high mass Higgs bosons, in the mass range above 150 GeV or so, the LHC has totally eclipsed the Tevatron, basically ruling out a Higgs boson with mass anywhere from about 150 GeV to 450 GeV. This is directly a result of the huge increase in the size of the data sample, and combining a half dozen search channels. Both CMS and ATLAS obtain similar results in this mass range, and despite a slight excess around 230 GeV in the ATLAS experiment, I think I can say with confidence that the Higgs boson will not be discovered in that regime.

But I personally never thought that this was likely. The sum total of the world’s data on precise measurements of the W and Z boson masses and properties, and the mass of the top quark, when taken together, tend to suggest a very light Higgs boson, much nearer 100 GeV. In fact the best predicted value for the Higgs boson mass is a good deal less than 80 GeV, but the LEP 2 experiments excluded a standard model Higgs boson with mass less than 114.4 GeV. This defines the low end of the present search window, which now extends to 150 GeV or so, and the precision data favor the low end of this range.

In a nutshell, what is happening is that the Tevatron experiments are bringing down a curtain on the Higgs boson, but the curtain is lower on the low mass end. The LHC is bringing down the curtain, too, but from the high mass end. So the two machines are in a race to achieve sensitivity to a standard model Higgs boson in the low mass range near 120 GeV.

The Tevatron experiments will collect their last data in two months, but the LHC experiments will keep collecting data, probably quadrupling the present sample by the end of the year. So time is a factor here as well, and the Tevatron experiments now truly have one last chance to cover the interesting low mass range.

But what do we mean by “cover” the range? If there is simply no Higgs boson to be discovered, then my prediction is that the Tevatron experiments can exclude it with 95% confidence up to a mass of around 120 GeV with the final data sample. If the Higgs boson truly is in that mass range, however, the experiments should not be able to exclude it!
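To make “exclude with 95% confidence” concrete, here is a toy single-bin counting experiment (my own illustrative sketch – the real Tevatron and LHC analyses use the more refined CLs method, many channels, and systematic uncertainties): a signal of s events is excluded if, assuming background plus that much signal, the chance of seeing as few events as we actually did falls below 5%.

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def excluded_signal(n_obs, background, cl=0.95, step=0.01):
    """Smallest signal s such that P(N <= n_obs | background + s) < 1 - cl,
    i.e. the data are too low to be consistent with that much signal."""
    s = 0.0
    while poisson_cdf(n_obs, background + s) >= 1 - cl:
        s += step
    return s

# Hypothetical numbers: expect 10 background events, observe exactly 10.
# Any signal contributing more than roughly 7-8 events is then excluded
# at 95% confidence, even though no signal was "seen".
print(round(excluded_signal(10, 10.0), 1))
```

This is why exclusion is possible without discovery: an experiment that sees only the expected background can still rule out signals large enough that they should have shown up.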

The LHC will continue to press on, the experimenters will continue to improve and refine the analyses, and by the end of the year, I predict, if all goes well at the LHC, we will either exclude the Higgs’ existence all the way down to the Tevatron limit or begin to see an excess.

In fact, the LHC data from both CMS and ATLAS are showing an excess in a broad range at low masses. Now, this could be a systematic underestimate of the backgrounds, a statistical fluctuation in the observed spectra, or it might, just might, be due to the presence of a low mass Higgs boson. It is not surprising that the excess is in a broad mass range, because one of the most sensitive channels, in which the Higgs decays to WW, has little or no mass resolution. This excess is why the press has recently picked up on the excitement – it’s quite in line with what one would expect to see if there is a Higgs boson signal just beginning to show itself.

To truly discover the Higgs boson will take a LOT more data, which we will get from the LHC in 2012. Now, we usually reserve the word “discover” for the situation where we have a “5 sigma” excess, by which we mean that there is less than about one chance in 3.5 million that a statistical fluctuation of the background alone could give us what we observe, or more. This is a stringent criterion, and not at all easy to establish, taking into account all the various experimental uncertainties.
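For the curious, the number behind the 5 sigma threshold is just the one-sided tail of a Gaussian, which you can check in two lines using nothing beyond the standard error function:

```python
import math

# One-sided tail probability of a 5 sigma upward fluctuation of a
# Gaussian: p = P(Z >= 5) = erfc(5 / sqrt(2)) / 2.
p = 0.5 * math.erfc(5 / math.sqrt(2))
print(p)  # about 2.9e-7, i.e. roughly one chance in 3.5 million
```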

If the Higgs boson mass is near 120 GeV, can we get a 5 sigma discovery by the end of 2012? It may take combining the data from the LHC experiments with the data from the Tevatron, a radical concept at present I have to say, but technically possible.

So despite the coyness of Mother Nature as to the nature of any new physics beyond the standard model, it’s nevertheless a very exciting time in the field, and who knows: maybe if we hold our mouths just right, cock our heads and squint just so, we might soon see something we hadn’t quite thought of before.

  • John

    Characteristics of the Higgs boson:

    Extremely heavy, purple color, soft to the touch, shines mildly in the dark, mixes easily with most liquids – solids with the aid of heat, promotes atomic changes in other elements – combined or not, very sweet taste, have been created quite a few times for thousands of years with just a few dollars, not billions of dollars.

  • Gerald W.


    Be sure to check out the alt-text

  • John

    Believe it or not, this is the reason for all the fanfare:

  • Beep

    To sound-bite your piece – if the Higgs exists, it is between 114.4 and ~150 GeV?
    Also, at what point would limits on the masses of supersymmetric particles mean that they cannot stabilise the Higgs?

  • John

    Beep: second question first. I asked Jesse Thaler from MIT yesterday “what if the squarks (the supersymmetric partners of the quarks) are at the 2 TeV scale?” The answer to this question is in the form of a “fine tuning factor” – how “just so” does the Higgs mass parameter in the theory have to be to be stable against the higher-order corrections. In the standard model, the fine tuning is huge, 10 to the 100 or more. If squarks were light, the fine tuning would be small, less than about 10. But by 2 TeV, the fine tuning is about a thousand, enough to worry at least some theorists. I am agnostic on the question: if we see supersymmetry, great; if not, then I hope we see something else.

    But yes, if there is a standard-model-like Higgs boson, it will almost certainly be found in the range 114.4-150 GeV, and most likely at the low end of that range.

    Another note: even if we find what looks like a SM Higgs at the LHC/Tevatron, we won’t know for quite some time if it is the only Higgs boson, or the light one in the supersymmetric theory…more on that later. But the work will have just begun, for sure!
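The quadratic scaling behind the fine-tuning numbers John quotes can be sketched in a few lines (the 200 GeV calibration point below is my own assumption, chosen only to match the ballpark figures in the comment above, not a number anyone stated):

```python
# Toy sketch: quantum corrections to the Higgs mass-squared grow roughly
# quadratically with the squark mass scale, so a minimal model of the
# fine-tuning factor is Delta(m) = Delta0 * (m / m0)^2. Calibrating
# Delta ~ 10 for light squarks near 200 GeV (an assumed reference point)
# reproduces the quoted ballpark of ~1000 at 2 TeV.
def fine_tuning(m_squark_gev, m0_gev=200.0, delta0=10.0):
    return delta0 * (m_squark_gev / m0_gev) ** 2

print(fine_tuning(2000.0))  # -> 1000.0
```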

  • Jimbo

    As John Ellis pointed out in a review last July, the existing data exhibit a sharp delta-Chi^2 minimum near 120 GeV.
    This is in accord with my prediction of Mh = 2pi(alpha)^8 Mp = 122.8 GeV, where Mp is the reduced Planck mass and alpha is the fine structure constant.

  • Anadish Kumar Pal

    However, the particles responsible for gravitation are not hidden inside hadrons to be smashed out, they are everywhere, aren’t they? It’s a total misconception on the part of ‘modern’ scientists to use atom smashers to look for a gravity producing particle. It makes them the ‘modern’ equivalent of the Greek atomists who would grind materials to dust to find the atoms. I have tried to give a cursory chronology of the developments in my research efforts; don’t be too much disappointed by the absence of details or by an absence of neatly painted instruments (and then, the site really is quite like a blog), the details shall be published independently with the publication of my US patent application by the US Patent Office. Although, I feel, even then there would be so much more to explore after our world view change.

  • Blake Stacey

    Part of me hopes that the experimentalists find no squarks, because then we’ll have to imagine a new set of hypothetical particles, and the name we invent for them is sure to be even sillier. :-)

    I’d be remiss if I didn’t point out Urs Schreiber’s blog post on what the concept of SUSY means outside the context of phenomenology.

  • Matt Strassler

    Hi John — How are you? Great work coming out of CMS!! BUT I have a bone to pick with you. In your post you say “no hint of supersymmetry is evident anywhere… we can say at this point is that if it does exist, the masses of the partners of the quarks must be very heavy, over 1 TeV. ” This seems to be a widely held view in the community, but I personally think it is a significant overstatement. What you say is correct **only if** the decays of the superparticles go directly to the lightest supersymmetric particle with the full missing energy signal of a minimal-supergravity-type model. In a paper just completed, arXiv:1107.5055, Mariangela Lisanti, Philip Schuster, Natalia Toro and I point out that large classes of perfectly reasonable supersymmetry models can easily disappear underneath the Standard Model missing-energy background, and go undetected by current ATLAS and CMS searches. (We suggest alternative search strategies, and go through one in detail.) And there are other classes of models, with squeezed spectra, that are also not excluded, as has been emphasized by Jay Wacker and his group. So no, I don’t think we know that the particles are heavier than 1 TeV — they could be far less than that. Honestly, it makes me very nervous to hear top-quality senior experimentalists such as yourself stating to the public that a thorough search for superpartners below 1 TeV has been performed, when in my view and that of other experienced theorists this is very far from true. I personally think more cautious statements would be in the community’s best interest. Happy to discuss this further by phone if you would like. Best — Matt Strassler []

  • John

    Thanks Matt, of course you are right. I have added some weasel words and a whole weasel sentence to the post to cover such possibilities… With hundreds of parameters at one’s disposal (and more if R-parity is violated), I am sure there are plenty of ways to evade the experimental limits. Hopefully we are doing the right experiment!

    My basic point, though, stands: we have seen no evidence of supersymmetry or anything else beyond the standard model yet. I wouldn’t ‘a’ thunk it…

  • Bjoern

    @Anadish Kumar Pal:

    However, the particles responsible for gravitation are not hidden inside hadrons to be smashed out, they are everywhere, aren’t they?

    Since (1) Higgs bosons are not responsible for gravitation, and (2) no one claims that Higgs bosons are “hidden inside hadrons to be smashed out” (and the same two points are true also for supersymmetric partner particles), I don’t see the relevance of your comment to the article here…


  • Jeh-Tween Gong

    Have you heard about “Prequark”?

  • Mark

    ” If squarks were light, the fine tuning would be small, less than about 10. But by 2 TeV, the fine tuning is about a thousand, enough to worry at least some theorists.”
    Dear John, while naively this seems to be the case, I would like to point out that in top-down motivated approaches to SUSY breaking, e.g. from string compactifications with stabilized moduli, heavy scalars do not automatically lead to excessive fine-tuning. In fact, unless one has sequestering, which is highly non-generic in string theory, the scalars as well as the trilinear couplings are almost as heavy as the gravitino, while the gauginos are typically much lighter because the moduli are usually stabilized near supersymmetric points and the tree-level gauge kinetic function is independent of the field that dominates SUSY breaking. This results in a strongly split sparticle spectrum: light gauginos O(100 GeV-1 TeV) and heavy scalars O(10-100 TeV), while at the same time the naive little hierarchy problem has a rather natural solution (see the paper above).

  • Jeh-Tween Gong

    @John: … Naturally, to do the experiment we smash quarks together and see if we see any hint that there is something smaller inside, which would manifest itself as an excess of particle jets coming out sideways to the beam, …

    @John: As far as we can tell, quarks are pointlike.

    With the two statements above, John and perhaps many other physicists concluded that pointlike particle has no internal structure.

    Why must a point be without any internal structure?

    The article “The source of the ‘Spontaneous Symmetry Breaking’, part 1 to 6” at [] shows that a geometric point can have internal structure.

  • Greg

    @Mark #14,

    What values of the masses of the Standard Model particles do you calculate from your model? Just curious.

  • Peter Woit


    In the paper you link to, the claim is that these models predict gluino masses of 400 GeV to 1 TeV. So, aren’t they already ruled out by the LHC?

  • Mark

    @Greg, the masses of the SM particles are used as input to constrain the sparticle spectrum and the RG evolution, e.g. only the 3rd generation trilinear couplings are used in the running since the corresponding Yukawas are largish. Computing the Yukawas is a far more complicated task compared to deriving the GUT-scale boundary conditions for the soft breaking parameters. If you fix the Yukawas, the tree-level SM spectrum would depend on the Higgs vev, which is fixed by experiment but from the top-down perspective is ultimately related to the gravitino mass. The gravitino mass is partially constrained to be O(10-100 TeV), but not completely since there are certain discrete choices of parameters, e.g. discrete Wilson line winding numbers, as one scans over different superselection sectors.

  • Mark

    @ Peter Woit: No, as far as I understand, only the simplest models with mSUGRA-type boundary conditions are now being ruled out.

  • JiJi Fan

    Just as Matt pointed out, the statement that LHC results imply no SUSY below 1 TeV is incorrect. It is known in the community that there are at least several scenarios evading the present bounds, which come mainly from the jets+MET searches. Besides Matt’s most recent paper with his collaborators, I just want to mention some other existing works: R-parity violating SUSY, where one search strategy, reconstructing a triple-jet resonance, was studied in Rouven Essig’s Ph.D. thesis and is being used now by the Rutgers experimental group to set limits on this scenario, e.g., 1105.2815; the MSSM with a squeezed spectrum, studied by Alwall, Le, Lisanti and Wacker, e.g., 0809.3264; and my work with Matt Reece and Josh Ruderman, which we dubbed “stealth SUSY”, a class of natural supersymmetric models with very little MET that could be searched for by various strategies involving no MET, including reconstructing multi-jet and photon+jets resonances.

  • anon.

    Peter’s question:

    In the paper you link to, the claim is that these models predict gluino masses of 400 GeV to 1 TeV. So, aren’t they already ruled out by the LHC?

    deserves a more careful response, I think. First, the direct limits on gluinos if the squarks are much heavier are still not quite at 1 TeV. (The ATLAS result at EPS, e.g., looks like it excludes gluinos below about 800 GeV.) But it’s true that this naively excludes most of that range.

    The catch, though, is that the ATLAS exclusion assumes the gluino decays through an off-shell first or second generation squark. In many models where the squarks are heavier, this decay goes dominantly through third generation squarks. In particular, one can have gluino -> t tbar chi0 dominate. This has two effects: first, the top masses eat up much of the phase space, so the neutralino is softer and carries away less missing energy. Second, the jets in the event come from top decays, and have less energy than light quark jets in the scenario that’s excluded.

    I don’t know any official numbers for what the limits are on such a scenario, but I’ve simulated several CMS and ATLAS analyses (with phenomenologist-level tools, not the experiments’ detector simulations) and the result is that they’re weaker by hundreds of GeV. So the low end of that range — gluinos between 400 and 500 GeV — is almost certainly excluded, but the bulk of it is not.

  • marianne

    Guys, it is really not that big a deal to find regions in SUSY parameter space that avoid current LHC constraints. When you have more than 100 free parameters this is quite easy.

    The major theme of the blog post is that new physics doesn’t seem to jump out in the current data. It’s really pathetic to watch people on here shamelessly promoting their latest silly model as an explanation. There are an infinite number of ways to do this and it is not a big deal… so, get over yourselves.

  • Mark

    @marianne, FYI, I’m not promoting my “latest silly model”. The paper I pointed to is not mine, nor is my name mentioned in the acknowledgments. However, I wanted to correct the statement of Jesse Thaler, conveyed by John, that heavy squarks => severe fine tuning. BTW, if you only follow the bottom-up approach you indeed have O(100) free parameters in the soft breaking Lagrangian, but from the top-down this “argument” that one has so much freedom is a complete red herring, as the number of effective free parameters is drastically reduced once you stabilize all the moduli.

  • Fireworks below 1TeV

    Thanks for the wonderful post John.

    Now to Matt Strassler’s response:

    Wow! We’ve had 20 years of theoretical phenomenologists claiming that susy particles are just around the corner. And now, after the incredibly hard work of experimentalists to shine some light on the situation, they find no evidence for low energy susy and are releasing this information to the world. But at the same time, experimentalists are aware that susy may be at a higher energy scale.

    So it’s great work by the experimentalists, and shocking that theorists are now getting angry at them and attacking them with their 1000th low energy susy model that manages to avoid the current constraints. Of course it is true that not every low energy susy model is ruled out, but it is totally reasonable, and totally useful, for the experimentalists to report to the world the basic picture that seems to be emerging – namely that the standard model keeps working, and the fireworks below 1 TeV promised by 20 years of claims from theoretical phenomenologists are not emerging. This analysis by the experimentalists is totally useful, and certainly more useful than the 1000th random guess regarding complicated R-parity violating models etc. for which there is no observational evidence, and increasingly limited theoretical motivation as well.

  • marianne

    @Mark, in that case my comments are not directed at you. It is just irritating to see other people on here attack the author (John) of the post because they have a “new silly model” as an example of SUSY below a TeV that can remain hidden in the current data.

    In the post, John was simply making broad statements about the fact that there is no compelling evidence for new physics. It is “obvious” to the rest of us (including John) that one can always come up with classes of models to avoid current data constraints. John was not “incorrect” he was just a bit “imprecise” in language because he is writing a blog for a wide audience that does not care about specific details. People here seem to be using this “technicality” (the slightly imprecise language used by John) as an excuse to promote their model. It’s just irritating!

  • Mark

    @anon 21, thank you very much for such a detailed response to PW’s question! Really appreciated!

  • OhDear

    Fireworks below 1TeV,

    An inexplicably defensive response. Nobody is saying it is not useful or reasonable that the experimentalists are reporting what they have found. Nobody (except you) is getting “angry”. But interpreting these results in terms of some underlying theory is a subtle business, and broad brush claims like “susy is ruled out below 1 TeV” are simply false.

    I also think John’s post was excellent by the way. For certain points to be clarified in the comment section is perfectly reasonable, and there is no reason to get upset.

  • Peter Woit


    Seconding Mark’s thanks, my own thanks for the detailed answer.

  • John

    In order to keep people on this blog from getting confused, I should say that I am the John who posted the first response, not the one who wrote the excellent article at the beginning of this page. That’s the reason why I am changing my name to John B.

    I am not a physicist but I love physics as a whole. I understand that most, if not all, the people posting here are scientists and might not like what I am about to present. However, I do this in order to keep the record straight and to let those who ignore the past, for one reason or another, know that particle physics is nothing new but is in fact a very old science, only that it was handled differently – I know that I am going to be criticized for this – I see it coming :) but the truth should not be hidden for fear of criticism. Here goes:

    “The meeting between Jacques Bergier and Fulcanelli took place in June 1937 in a laboratory of the Gas Board in Paris. According to Neil Powell, the following is a translation of the original verbatim transcript of the rendezvous. Fulcanelli told Bergier:

    ‘You’re on the brink of success, as indeed are several other of our scientists today. Please, allow me, be very very careful. I warn you… The liberation of nuclear power is easier than you think and the radioactivity artificially produced can poison the atmosphere of our planet in a very short time, a few years. Moreover, atomic explosives can be produced from a few grains of metal powerful enough to destroy whole cities. I’m telling you this for a fact: the alchemists have known it for a very long time… “I shall not attempt to prove to you what I’m now going to say but I ask you to repeat it to M. Hellbronner: certain geometrical arrangements of highly purified materials are enough to release atomic forces without having recourse to either electricity or vacuum techniques… The secret of alchemy is this: there is a way of manipulating matter and energy so as to produce what modern scientists call ‘a field of force’. The field acts on the observer and puts him in a privileged position vis-à-vis the Universe. From this position he has access to the realities which are ordinarily hidden from us by time and space, matter and energy. This is what we call the Great Work.'[13]

    When Bergier asked Fulcanelli about the Philosopher’s Stone, the alchemist answered: ‘…the vital thing is not the transmutation of metals but that of the experimenter himself. It is an ancient secret that a few people rediscover each century. Unfortunately, only a handful are successful…'[14]” –

  • marianne

    … And right on cue… yet another “hidden SUSY” paper on the arxiv today. And so begins the onslaught of another thousand hidden-SUSY papers. sigh.

  • Fireworks below 1TeV

    @OhDear, thanks for your comments.

    But I can assure you, I am not angry at all. Actually I am feeling refreshed, rejuvenated, and excited by the wonderful results coming out of the LHC. Over the last week or so, we have gone from a 20 year state of ignorance, where we were told fireworks would happen at a few hundred GeV and the standard model would fall apart to be replaced by gluinos, squarks etc., to a state of knowing that there are actually no fireworks below 1 TeV and the standard model keeps working. This rules out the simplest and most natural versions of low energy susy, plus a thousand other models. This is great knowledge.

    On the other hand, the only anger I’ve seen on this forum has come from a few phenomenologists who wish to advocate their 1000th complicated/contrived, detector-defying susy model, rather than just studying the latest data and learning from it. (There are infinitely many complicated/contrived detector-defying NON-susy models as well; why not write a paper about each of these too?) Matt Strassler appeared angered when he said “I have a bone to pick” with John and that he is “very nervous” about the work of the experimentalists. Plus many other phenomenologists quickly jumped to advocate their latest detector-defying, R-parity violating models, with bones to pick, with worry, nervousness, discomfort, and anger at the LHC results/experimentalists.

    However, I am feeling fine, and am happy to learn more about nature by studying the data as it comes in, rather than pushing my latest complicated model (with zero experimental support) on the hardworking experimentalists. Science can be fun sometimes, if you let it :)

  • Cole

    John B was right all along and Marianne has come first to the attack with razor-sharp teeth. This is a blog to express scientific opinions where you can agree, disagree, or add more useful information, not one to vent veiled personal frustrations.

  • marianne

    @ Cole, I’m sorry for being so aggressive. I was reacting to the incredibly harsh tone used against the author (John Conway) of the post in the following quote in one of the above comments:

    “Honestly, it makes me very nervous to hear top-quality senior experimentalists such as yourself stating to the public that a thorough search for superpartners below 1 TeV has been performed, when in my view and that of other experienced theorists this is very far from true.”

    Such language is just rude and highly insulting (and all just to make a fairly obvious point and self-promote). There is a way to disagree or point out a correction without attacking the author. So maybe I overreacted to this and got a bit carried away, and I apologize. But I was certainly not the first to attack.

  • Van

    As to whether or not low-scale supersymmetry exists, given that the superpartners have yet to be observed, probably the most important thing to watch in the short term is the discovery or exclusion of the Higgs boson in the coming months. If the Higgs is found with a mass less than about 130 GeV, it is almost certainly the case that there is low-scale supersymmetry. Conversely, if the Higgs is excluded in this mass range then there is no way that the three-generation MSSM can be correct. So, it should be possible to have an extremely good idea within the next year whether or not the superpartners exist, regardless of whether or not they have been observed.

  • Fireworks below 1TeV

    Just to follow up on Van’s comments,
    now that we have ruled out the 20 years of claims of fireworks below 1 TeV, I agree with Van that the key in the short term is the Higgs sector. I also agree with Van that a Higgs mass less than 130 GeV is favored by supersymmetry. However, I disagree that if we find such a Higgs we can immediately conclude that it is “almost certainly” due to low-scale supersymmetry. We would obviously need much more evidence than that – in particular we would need to detect several, and preferably all, of the five Higgses of supersymmetry, and with the right mass pattern. And further, we would obviously have to detect some of the squarks, gluinos, etc. before we make such bold conclusions. On the other hand, if the only thing we have detected is a light Higgs, then it is still conceivable that the standard model is correct to very high energies. For mh > 126 GeV the standard model vacuum is absolutely stable, and for 115 GeV < mh < 126 GeV it is metastable, with a lifetime longer than the age of the universe. So this makes the standard model still quite viable to high energies. And if it does break down, there are many non-susy possibilities.

  • Van


    Yes, I agree that the discovery of Higgs boson with a mass below 130 GeV would not prove supersymmetry, but would provide strong support for it. However, the main point is that if the Higgs boson is discovered in some other mass range, then supersymmetry would be effectively excluded, at least in the case of the MSSM with three generations. The only escape would be if there are more than three generations, and then the Higgs can be quite heavy from additional radiative corrections. In any case, the picture should be much clearer a year from now.

  • george james ducas
  • Phil

    CRACKPOT ALERT! CRACKPOT ALERT! At least two crackpots with nonsensical material have been detected. Evacuation on standby.

  • Steve

    OMG! Poor Phil belongs in a circus…! It’s amazing how he found enough brain power to get here after cleaning after the elephants. How did he do it?

  • somebody

    Thanks for the great post and the illuminating comments by many.

    “The major theme of the blog post is that new physics doesn’t seem to jump out in the current data. It’s really pathetic to watch people on here shamelessly promoting their latest silly model as an explanation.”

    Every discussion of this kind between experts involves some self-promotion; it is inevitable. But neither does that make the discussion worthless, nor is the situation as black and white as you claim. Opinionated theorists trying to cling to susy at all costs vs. LHC experimenters who have ruled out all well-motivated susy models: that’s a false face-off. The point of this discussion thread seems precisely that the message is messy at the moment. It is crucial that an idea like TeV-scale susy, which has seemed reasonable to many smart people, be thoroughly investigated before being discarded. There is no hurry. It is good to keep in mind that we are all on the same team.

  • John Duffield

    All: take a look at “A Zeptospace Odyssey: A Journey into the Physics of the LHC” by Gian Francesco Giudice. He’s a physicist at CERN with a hundred-plus papers to his name. He talks about the Higgs sector on pages 173 through 175. If you don’t have this book you can find it on Amazon and do a search-inside on “Higgs sector”. He starts by saying:

    “The most inappropriate name ever given to the Higgs boson is ‘The God particle’. The name gives the impression that the Higgs boson is the central particle of the Standard Model, governing its structure. But this is very far from the truth.”

    On page 174 he says:

    “Unlike the rest of the theory, the Higgs sector is rather arbitrary, and its form is not dictated by any deep fundamental principle. For this reason its structure looks frighteningly ad-hoc”.

    He also says:

    “It is sometimes said that the discovery of the Higgs boson will explain the mystery of the origin of mass. This statement requires a good deal of qualification.”

    He gives a good explanation which I won’t go into. What’s important is that he finishes by saying:

    “In summary, the Higgs mechanism accounts for about 1 per cent of the mass of ordinary matter, and for only 0.2 per cent of the mass of the universe. This is not nearly enough to justify the claim of explaining the origin of mass.”

    So don’t be fooled by all the publicity about the Higgs boson. It’s intended to promote HEP and theoretical physics. Not to enlighten anybody.

  • Van

    Dear John Duffield,

    The Higgs mechanism exists in order to break the electroweak gauge symmetry and to provide mass to quarks and leptons through their Yukawa couplings. The term “God Particle” was invented by book publishers who wanted to sell books, not by physicists. Indeed it is true, that most of the matter in the universe is not generated directly by the Higgs mechanism. For example, the mass of the proton is mostly due to the sea of virtual quarks and gluons that exists within the proton. However, this does not in any way diminish the importance of the Higgs mechanism. So, yes it is hype to refer to it as the God particle, but it is not so to emphasize how fundamentally important the discovery of the Higgs boson would be to particle physics.
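    The point about the proton can be checked with back-of-the-envelope numbers. A minimal sketch, assuming approximate 2011-era current quark masses (the precise values are not essential to the argument):

```python
# How much of the proton's mass comes from the Higgs-generated
# current quark masses of its valence quarks? (all masses in MeV)
m_u = 2.2        # up quark current mass (approximate)
m_d = 4.7        # down quark current mass (approximate)
m_proton = 938.3

valence_mass = 2 * m_u + m_d          # proton = uud
fraction = valence_mass / m_proton
print(f"valence quark masses: {valence_mass:.1f} MeV "
      f"({100 * fraction:.1f}% of the proton mass)")
# The remaining ~99% is QCD: gluon field energy and the quark sea.
```

    This is the roughly 1 per cent figure quoted from Giudice’s book in the earlier comment.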

  • Thomas Larsson

    So Leon Lederman is a book publisher now.

  • Yeti

    @ somebody: The opinions expressed by “Marianne” and “Fireworks below 1TeV” are shared by many physicists. There are way too many papers that just play silly and unimaginative games with the SUSY parameter space. It’s just an easy way to write papers which just end up cluttering the arxiv. These papers tend to go something like this…

    1) We consider an interesting scenario
    2) This scenario may not be that great, but it has interesting signatures (or can evade limits, and we propose new search strategies)
    3) Run Madgraph and make plots
    4) End of paper

    Sadly, such papers are still needed to rule out or search for SUSY in the “entire” parameter space. But let’s not pretend that this is intellectually challenging in any way.

  • Van

    @Thomas Larsson: Leon Lederman wrote the book, but it was the book publisher who came up with the term “God Particle” and title of the book. Lederman has stated that he was not happy with this, but went along with it. More specifically, the actual term used by Lederman was the “God-damn Particle”.

  • John B

    It should be obvious by now that although in certain elite circles (use your imagination to find out where) physicists are employing alchemists, yes, I said alchemists (that supposedly archaic and superstitious science that supposedly didn’t discover anything useful, especially in particle physics), the vast majority of physicists are being kept in the dark about this while they, like useful little lead soldiers, keep discussing an infinitude of stipulations about theories that have not even been proven and which they are not even sure which to believe. They make theorems about these theories, taking them for granted and with plenty of pride, since that makes them look and feel very knowledgeable! – But knowledge of what, of nothing really tangible? If you justly want to call yourself a physicist you had better analyze ALL THE FACTS, no matter how ridiculous they may sound to the masses due to ignorance, act like a true scientist, and stop hiding behind dogmas that only slow science down almost to a halt.

    If I ruffled the feathers of some people, tough. Sometimes people need to be shaken up a bit in order to put their neurons back in line again, even if they have a university degree up high on a golden pedestal.

    Note: some of these same people know that I am right about all of this but are too afraid to speak out for fear of being characterized as loonies or something similar. It is a lot easier to follow the majority, even if they are wrong, than to step aside and look for the truth, no matter how difficult or painful it may be to them, because they will have to swim against the current.

    If I am to be characterized as a loony myself (either I will be characterized this way or ignored) by a real loony who is totally ignorant of the FACTS, that will be a compliment and I thank him or her in advance. Nobody had touched this blog until I started posting in it, and all I see, with a few exceptions, is a lot of rhetoric, spilling of hate, implied bursts of “I know more than you”, etc.

    In addition, if you feel that I have not added anything useful to this discussion, you are wrong, since the elusive Higgs boson, wrongly called the God particle, and the “legendary” philosophers’ stone are the exact same thing, and while physicists are scraping down all they can to find just one little particle, ancient and not so ancient scientists have been able to obtain it a few ounces at a time. By the way, I almost forgot: alchemists have always claimed that they could make transmutations, and “scientists” made jokes about them since that was supposed to be impossible – now transmutations are very common in nuclear reactors due to radioactive decay. You won’t hear any of those scientists saying “oops! I’ll bite my tongue” and you never will – they don’t want their little golden but very fragile box to shatter – which means that their self-esteem is not strong enough to take it like a man or woman.

  • Shantanu

    so are most extra-dimensional models ruled out?

  • Fireworks below 1TeV

    extra-dimensional models that solve the hierarchy problem, and so should have provided signatures at a few hundred GeV, do look to be ruled out, yes. (Although super tiny extra dimensions, say at the Planck scale, that are irrelevant to EWSB and the hierarchy problem, are still allowed.)

    So this kills Randall-Sundrum models, the Arkani-Hamed, Dimopoulos et al. models, etc. It’s interesting to think that the RS paper, which has 6000+ citations (the 9th most of all time according to SPIRES) and has led to 6000+ follow-up papers, has nothing to do with reality.

  • anon.

    Oh, give me a break. Precision electroweak and flavor constraints already told us that any new model of strong dynamics (like Randall-Sundrum) shouldn’t have resonances below a couple of TeV or more. That could still turn out to be the way the world works (perhaps with a composite Higgs….).

  • Phil


    The RS paper has everything to do with reality. For example, the follow up papers were necessary for the authors to survive the publish-or-perish aspect of the reality of academic life. That’s reality.

  • Fireworks below 1TeV

    @Van 42,
    your claim that the Higgs exists to “break the electroweak gauge symmetry” is a common misconception, but it is incorrect.

    It is meaningless to “break a gauge symmetry”. The point of a so called “gauge symmetry” is to allow us to describe massless vector bosons in a manifestly Lorentz covariant way by eliminating the extra unphysical degrees of freedom. So it is not a real physical symmetry, with a corresponding Noether charge, that can be broken. Instead the only real physical symmetries that exist and can be broken are “global symmetries”.

    Now, you can actually write down a Higgsless standard model (but including the 3 Goldstone bosons that are connected to the longitudinal polarizations of the W and Z bosons), with full electroweak gauge symmetry, whose global symmetry is spontaneously broken by any choice of vacuum. In such a theory, the gauge symmetry is non-linearly realized. The point of the Higgs then is to promote the non-linearly realized gauge symmetry to a linearly realized one. This also promotes the theory from non-renormalizable to renormalizable, and to a UV-complete theory that avoids unacceptable hard scattering, such as WW scattering unitarity violation.

    So this is the actual reason why the Higgs exists.
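    The unitarity point can be made quantitative. In the Higgsless theory the scattering of longitudinally polarized W bosons grows with energy and saturates the partial-wave unitarity bound around a TeV (a rough sketch; the numerical factors depend on convention):

```latex
\begin{aligned}
\mathcal{A}(W_L W_L \to W_L W_L) &\sim \frac{s}{v^2} \qquad (\text{no Higgs}),\\
|a_0| \le \tfrac{1}{2} \;\Rightarrow\; \sqrt{s} &\lesssim \sqrt{8\pi}\,v \approx 1.2~\text{TeV}.
\end{aligned}
```

    Exchange of a Higgs boson cancels the piece growing like $s/v^2$, and demanding that the residual amplitude respect unitarity gives the classic Lee–Quigg–Thacker bound $m_h \lesssim 1$ TeV, which is why the LHC is guaranteed to probe this sector one way or another.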

  • Yeti

    The RS model, SUSY, Technicolor, etc are all really good groundbreaking ideas. If nature doesn’t correspond to any of these scenarios that’s just bad luck for these models. Good model building addresses serious outstanding problems and provides elegant and consistent solutions. Whether or not nature chooses to realize these solutions is not up to us. In the absence of data this is the best good model builders can do and that’s what they did the last couple of decades.

    On the other hand, bad model building takes these great groundbreaking ideas and then makes small, silly variations in the details, runs them through Madgraph, makes plots, and posts to the arxiv… hoping that something will stick. Sadly this is most of the crap that is on the arxiv, and it accounts for much of the 6000+ citations received by the original RS paper.

    But that’s just the way science works. There are good physicists out there that take giant leaps forward or at least make important/valuable contributions and the rest squabble over the minor details until it is all sorted out by data.

  • Shantanu

    Thanks, Fireworks below 1 TeV.
    Yes, it’s sad that there is so much interest in extra-dimensional models even though they looked quite ugly.
    OTOH almost no one is working on things like the Koide mass formula or torsion-based extensions to the standard model.

  • Van

    @Fireworks 51: Yes, you can get into very technical details on exactly what is meant by symmetry breaking; however, my point is that this is the main phenomenological function of the Higgs mechanism, namely to spontaneously break local gauge symmetries. By this, I mean that one or more of the gauge bosons of the original gauge symmetry becomes massive. In the case of the SM, the SU(2)_L x U(1)_Y gauge symmetry is broken to the U(1) of electromagnetism by generating masses for the linear combinations corresponding to the W and Z bosons, leaving behind a single massless linear combination, the photon. Then of course there is one degree of freedom left, which is the Higgs boson, which must exist in order to preserve WW scattering unitarity as you’ve pointed out.
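    Concretely, with the Higgs vacuum expectation value $v \simeq 246$ GeV, the kinetic term of the Higgs doublet generates exactly this mass pattern (standard textbook relations):

```latex
\begin{aligned}
m_W &= \tfrac{1}{2}\,g\,v \approx 80~\text{GeV}, \qquad
m_Z = \tfrac{1}{2}\sqrt{g^2 + g'^2}\;v \approx 91~\text{GeV},\\
A_\mu &= \cos\theta_W\, B_\mu + \sin\theta_W\, W^3_\mu \quad (\text{massless photon}), \qquad \tan\theta_W = g'/g.
\end{aligned}
```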

  • Fireworks below 1TeV

    @Shantanu 53,
    I don’t think it is “sad” that people worked on extra-dimensional models, and not all of the models “looked ugly”. In fact the original RS paper is quite a wonderful and thought-provoking paper. But the lack of any experimental support makes it difficult to understand the 6000+ follow-up papers. It’s not easy to understand why the RS paper has more citations than many other deep papers that describe reality. To name a few papers with fewer citations: the Glashow and Salam papers on the SM, the Gross/Wilczek/Politzer papers on asymptotic freedom, the Guth paper on inflation, the Wilson paper on renormalization, the Adler, Bell, and Jackiw papers on the U(1) anomaly, the Goldstone and Nambu papers on symmetry breaking, Hawking’s black hole radiation paper, the Cabibbo paper on mixing, the Schwinger paper on vacuum polarization, the NFW paper on galaxy profiles, the Coleman-Weinberg paper on loop corrections, the ’t Hooft and Veltman papers on non-abelian renormalization, the Higgs paper, the SDSS galaxy surveys, various experimental papers on particle discoveries, some of the WMAP CMB acoustic discovery papers, some of the supernova/accelerating universe discovery papers, the Particle Data Book, the Super-Kamiokande neutrino oscillation paper, and the list goes on and on. Apparently RS inspired more follow-up papers than all of these? Interesting…

  • Fireworks below 1TeV

    On the other hand, the theoretical astrophysicists have played their own weird model building game, and though it maybe hasn’t gotten quite as out of control with citation count, it has been based on much worse papers. Theoretical astrophysicists have played the modified gravity building game for many years, with absolutely no success. But unlike RS models, which are at least well defined and interesting, many of the modified gravity models are complete nonsense, which inevitably violate some basic feature of the world, such as effective field theory, local Lorentz symmetry, quantum mechanics, stability, diffeomorphism invariance, causality, equivalence principle etc. Models such as 1/R gravity or DGP gravity appear to not only be inconsistent with observations, they are deeply flawed theoretically. These papers have ~ 1000 citations, despite being in violation of basic facts about the world (for instance, DGP violates causality). Compared to this, I have a lot of respect for the RS models – though they appear to be ruled out observationally too.

  • Phil

    What does DGP stand for? Also, what about f(R) theories of gravity? Are they in the same group as modified gravity? What do you think about f(R) theories? I’m just curious.

  • Yeti


    The reason the RS model has more citations than the other papers you mentioned is simple. Most of the papers you mentioned were written in the 60s and 70s, and their content is now in standard textbooks. So we no longer cite those works, because they are now considered standard knowledge. In the 60s and 70s the arxiv did not exist and papers were sent in by postal mail, etc. There were far fewer physicists back then compared to now, and the rate at which new papers are written is much larger today than it was then. So, for all these reasons, citations were generally lower back in those days. For example, Einstein’s famous papers on relativity have extremely few citations.

  • Fireworks below 1TeV

    @ Yeti
    for the most part, this is incorrect. Anyone who writes a paper and refers to asymptotic freedom cites Gross/Wilczek/Politzer, anyone who refers to inflation cites Guth, anyone who refers to Hawking radiation cites Hawking, anyone who refers to WMAP data cites WMAP, anyone who refers to SDSS galaxy data cites SDSS, anyone who refers to NFW profiles cites NFW, anyone who refers to supernova data cites the supernova teams, anyone who refers to Super-Kamiokande neutrino data cites Super-Kamiokande, etc. On the other hand, I did not mention Einstein’s papers or anything super old, so this seems pretty irrelevant.

  • Yeti


    Most papers do not cite Gross/Wilczek/Politzer anymore even though they routinely use the concept of asymptotic freedom. Otherwise every single paper even remotely related to collider physics would have to cite this paper. Any calculation in collider physics relies on the concept of asymptotic freedom. The same goes for many of the other papers. The papers that receive a lot of citations are ones that propose new ideas which generate excitement at the “forefront” of research. Ideas that have been around for a long time and are textbook knowledge are usually not cited by most papers. Otherwise the reference section would be at least 10 pages long if you insisted on citing every single concept used in the paper dating back to the 60s and 70s.

    Citations are not always the best measure of the impact of a paper; especially when you are comparing papers written at very different times. But your point is well taken… sometimes certain papers get cited more than others for arbitrary reasons… like what the current fashion trend is in the research community.

  • Jeh-Tween Gong

    I would like to talk about common sense on this Higgs issue.

    There are a few things certain about the Standard Model (SM).

    Certainty 1: SM is not complete, that is, with a big hole. There are two ways to deal with any hole, plugging it up or going into it. By plugging it up, it becomes complete. By going into it, we will find a new universe.

    So, what is this Higgs, a plug or a tunnel?

    Certainty 2: SM roams in the space-time, not a base for generating space-time. So, if SM becomes a complete sheet (without any hole) by finding the Higgs, it still does not encompass the space-time. Thus, finding a Higgs plug will not make SM a better physics theory. If this Higgs plug cannot plug the entire hole (SM still incomplete with a real Higgs), then this Higgs is not truly important for a vital theory. Can Higgs be a tunnel? If it is, then it should have an internal structure.

    There are two types of internal structure, with sub-particles or with pre-particles. The sub-particle is, in general, smaller than the particle. On the other hand, the pre-particle can be infinitely larger than the particle. For example, a visible iceberg is composed of three pre-particles, 1) a big chunk of ice with 10 times of its size, 2) a big body of water, 3) a big space encompassing the two above. Each of these pre-particles is bigger than the visible iceberg, and some can be infinitely bigger.

    If quark is only a protrusion of the space-time sheet, then there is a lower tier under SM. That is, the hole of SM is actually linked to its lower tier, the space-time. In this way, SM becomes a major part of the whole as its hole is not a weakness but is the key link to completeness.

  • Fireworks below 1TeV

    @ Yeti
    Firstly, the few papers I mentioned span the 60s, 70s, 80s, 90s, and 2000s – including papers written more recently than RS. And I could easily add many, many more important recent papers to the list. So I disagree with your point here.
    Secondly, I don’t think it makes sense not to cite a paper like the one on asymptotic freedom, which is essential to collider physics, while citing a paper like RS, which has proven to be completely irrelevant to collider physics and has no connection to reality. This seems a pretty simple point that I am making here.
    Thirdly, it is odd that you keep saying the “explanation” is that it is at the “forefront” of research. I think this is tautologically true. You might as well say “the explanation that RS is cited 6000 times is that RS is cited 6000 times”. But the non-tautological fact is that the content of the paper has proven to be disconnected from reality, and therefore many, many people are working on a topic disconnected from the real world. Now, is this a fact with no explanation? Of course it has an explanation (like everything else), and it has to do with many people writing many bad papers. I don’t think it can simply be explained away by historical accidents as you claim.

  • Phil

    @ Jeh-Tween Gong,

    Your ramblings make no sense. Perhaps that is why your book, “Super Unified Theory: The Foundations of Science” is out of print on amazon. Cheers!

  • Yeti


    at a certain point, what used to be knowledge at the forefront of research, becomes standard textbook knowledge. When this happens, papers no longer cite this textbook knowledge. Asymptotic freedom is taught to grad students in the classroom and it is not considered necessary to cite this work anymore. Just like we don’t cite Newton for his laws of motion because it is considered “standard knowledge”.

    Asymptotic freedom has had an astronomically large impact on particle physics. The relative impact of the RS model is not even close. However, this is not reflected in the citation count. So, you have to be careful when interpreting citation counts… they are not always a reliable measure of the impact of a paper.

    I don’t want to argue on this anymore. If you still disagree, then let’s just leave it at that…

  • Fireworks below 1TeV


    well, given that I mentioned various papers, some older than RS and some newer than RS, and that many of these papers are still cited routinely (such as the original Guth paper or WMAP papers), it clearly negates your point.

    So I think it is indeed appropriate to just leave it at that…

  • Fireworks below 1TeV

    @ anon 49

    Firstly, of the 6000+ RS models, there are many that evade the electroweak and flavor constraints but would have shown up at the LHC. This set of models has now been ruled out by the latest LHC data. So you’re incorrect about this.

    Secondly, it’s very strange that you are pointing to other observations that already disfavor RS, in addition to the LHC, and using this to conclude that RS may be correct! Wow, this is the exact opposite of how one should draw inferences and conclusions and perform statistical analysis in science.

  • george briggs

    when is somebody going to realize that the Higgs is just dark matter of SU(1) symmetry, and together with ordinary matter forms fermibosonic matter of E8 symmetry?

  • Shantanu

    Fireworks (@48), I am not complaining about the large number of citations of RS, but just that many other ideas are completely ignored, and people who work on them have a hard time getting jobs or even conference/seminar invitations.
    BJ Bjorken made the same point in an after-dinner talk at a recent symposium at FNAL.

    As a case in point, how many particle physicists are working on (or have even read) Grisha Volovik’s ideas about the cosmological constant, or on Einstein-Cartan gravity? Compare that to the amount of hype that work on the string theory landscape has gotten.

    Also, regarding the Super-K results on neutrino oscillations: all sorts of lofty claims were made when the announcement came in 1998. Pierre Ramond even claimed that it provided evidence for low-energy supersymmetry.
    However, when you probe it much deeper, you will find that it’s not even evidence for physics beyond the standard model.

  • Jeh-Tween Gong

    There are two good things about common sense rambling; 1) it can be understood by everyone, including physicists (no proof is needed), 2) it can easily discuss the BIG picture in addition to the nitty-gritty.

    At this moment of great discovery at LHC and Tevatron, I would like to offer some common sense ramblings to paint a big picture.

    The workings of web pages and websites are now common-sense knowledge. Every web site has a home page, and it could have some internal pages, often hyperlinked from the home page. Sometimes those internal pages come out as popup windows, and I would like to call them pops.

    Our universe can be viewed as a website. The Big Bang is the home page, the Grand Daddy Pop (the GD pop). Under this GD pop, there are a few other pops, the Life pop (life sphere), the Intelligence pop, the Linguistics pop, the Mathematics pop, etc..

    Every pop (page) has, at least, two types of context; 1) some texts, 2) some hyperlinks to the other pops. For some internal pops, they can be the end of links without any outgoing links. However, for the GD pop (being the Grand Daddy, the home page), it must have outgoing links to reach all internal pops. In fact, it is those outgoing links defining the GD pop.

    There is no need to prove that the Big Bang is the GD pop. If anyone disagrees with this, then so be it. As a GD pop, it must have links to all other pops. That is, the vitality of a GD pop “theory” can be easily checked with the following checklist.
    a. Does it link to (giving rise to) the Life pop?
    b. Does it link to (giving rise to) the Intelligence pop?
    c. etc..

    This big picture of pop-links was ignored for a long time, as our physics theories were too primitive. Now we are reaching the gate of a final theory, and these pop links must become the final checklist. Physics can no longer hide in an isolated page. Does the Higgs encompass or complete all those pop links?

  • George James Ducas

    All things are products of time and velocity. Equivalency means that all things through all dimensions are products of time and velocity held together by a math or matrix and related as scalar, (additive), proper (multiplicative) within various dimensions connected through states of resonance and induction. Space is made from loops of a velocity and time matrix, woven together with properties almost like a fluid. The geometry, compression, and configuration of this through multiple dimensions of inductive states through dipole action result in a continuous continuum that is multidimensional. So therefore the atom is the perfect Trans dimensional model, exhibiting all states and transitions of the matrix or space, so there is no distinguishing between the cores or the space. The particle is merely the overlapping of the matrix in nodes of interference. The centres of all particles are the singularities, of the boundaries that go from the interval to infinity. One of the controls is the number pi. The motion of these various dimensional or resonant states act together one relating to another. So therefore the entire system of space and particle is an inertial field or matrix. All these relationship at the smallest level are products of time and velocity. So then the idea of mass, particularity used to define force and energy, are arbitrary assignments. Since the smallest denominator has nothing to do with mass directly, and the defining of systems of ether and atomic states in terms of a larger configuration or “energy” in joules is not periodic, meaning not in system of the assignment of parts and properties as we have in a periodic table of chemistry. If we redefine all properties as arising from three units, to include v and t, then the associated properties are an adjunct to the real structure, and that v or t is the energy (new word) that is in all things, and particles associated with various states. 
    The Higgs is more a test of the system of units, the defining of mass, than the finding of a particle.
    The evaluation of all particles in terms of mass and energy is linear and not periodic. Mass is a higher periodic state than velocity and time which are at the bottom. v and t as the basics and through the periodic development of v and t we finally get the effect of mass. Therefore to evaluate those things which arise out of more simple relationships of vt, such as mass less particles, with something much more evolved, such as mass, is not providing the clarity to the structure of the real universe.

  • George James Ducas

    The term energy needs a new definition. We see energy as joules, but i see it as any configuration of v and t. so then it is viewed by the lowest denominator rather than one so high. Trans-dimensional equations have only three units, where time and velocity are the lowest form. If you want to talk about these components of v and t and the new basis for understanding “energy”, then you have “energy” at all levels, not to be mistaken with joules. so then if mass is D^3T^2 or V^3T^5, we see that it is an asymmetrical form, and when we define it this way, we also understand that Higgs is really a particle related to this form, but then we also realize it is not the smallest form or the smallest thing lending to mass, only a particular configuration by which we define mass, since all v and t gives rise to everything, just depends on arrangement.
    Our definitions should only be by v and t, and by arrangement associate properties, almost like a periodic table of chemistry. So then we say this configuration is mass, or we might even revise our use of the property mass. Various energy states can be described by resonant action D, related to v and T. various resonant states multiply together to generates other state properties. The geometry and compression of dimension of these states or matrix also give rise to other properties.
    V + T = D = 1 (scalar), this derives quadratic forms and polynomials
    V X T = D = -1 (proper)

    that comes from
    phi + phihat = 1

    phi X phihat = -1

    phi is the golden mean and phihat is 1/phi

    time and velocity are the same energy polarized and inverted, and together form resonant activity D

    below this periodic triad is a binary code of 1 and 0

    pi controls all activities of time and velocity and is also responsible for the fine structure constant and creation of all numbers
    everything is made from time and velocity, particles are merely the interference patterns of those matrices.

    The search for Higgs is more about units than particles, since mass is purely arbitrary, an assignment to understand energy and force. Mass = D^3T^2 = V^3T^5, which shows by our definition mass to be asymmetrical. So what particle perhaps is associated with this asymmetrical definition? Maybe the reason its so hard to find, and maybe the definition is not attached to any particle.

  • George James Ducas

    Different particles are associated with different dimensional states. So it is incorrect to use joules or D^5 to examine particles that are associated with lower dimensional forms where mass disappears. The word energy needs to be redefined into time and velocity. you can take any equations and convert or rewrite it in terms of v and t. you only needs three units d, t, and v. the current system of units is like the Japanese kanji of symbols and does not have the periodicity or periodic table that represents the true structure of what is being examined, pattern of nature. so then by examining all nature through a specific form rather than its true pattern, you get all kinds of contradictions, even to the point of thinking that energy is everything, but your definition is only one form, and not even a small form. When you look at an atom, mass is only a small percentage of what can be measured, but for you, it’s useful to define energy and force. Obviously if it works use it, but that isn’t the complete structure of nature, and mass is purely arbitrary since its only given meaning to other definitions that are useful to you. If i were to say that everything is time and velocity, within your system of measurement it seems to go beyond, but also include much more. so therefore then, having a system that is not complete, analysing in terms of joules and newton meters, is going to leave a lot out of the picture, since it’s not a periodic system that defines energy even at the utmost smallest level, you get these ideas if massless-ness in particles, etc. but then if you have the periodicity and pattern of nature, it applies to all states, and nothing is without, only comprised of smaller arrangement of the same smaller forms, v and t.

    C describes the maximum extension of the entire system of the inertial field of localized space

    D^5 = mc^2; calculate the radius D for the maximum volume of an inertial system. What an equation means depends on how you define the variables in it. We call them constants, but in fact they are variables; anything with units is a variable. In the Schwarzschild radius we introduce G with a number, and hence change the relationship to a different scale, but the equation is the same.
    Mass can be broken down further: mass = D^3T^2 = V^3T^5.
    You can say energy is equivalent to products of time and velocity.
    Mass as a definition is arbitrary, but everything is a product of time and velocity.

    Energy = D^5 = mc^2 = D^3T^2V^2 = V^5T^5
    As we see, energy is a vector.
    Energy as joules is a product of velocity and time.

    E = C^5T^5: energy is equivalent to time.

    When you talk about mass-energy equivalence, it brings to mind a greater question: the equivalence that provides a unified field uniting all concepts in physics. I say to you that all properties in physics are products of velocity and time. C is a special case, the maximum extension of that field of resonance. But the equation is wrong, since equivalency has to do with that energy which explains all states, and that equivalency is not limited to one condition but to many. The idea of C was only one idea in this respect. And I can show that mass is really a product of more variables of velocity and time, chosen arbitrarily only to define what you know as force and energy. Since c does not explain a unified field, it does not explain mass-energy equivalence; it is merely a constant, and combined with mass it is a product that defines energy useful to us by measurement. Energy as defined in joules is not the smallest property for defining anything, since it can be further subdivided and explained as a product of velocity and time. The idea of an energy whose products account for all properties via a periodic structure that represents the real form of nature is V and T, together forming D of a resonant system. What is equivalent describes all forms and equations. The idea of mass-energy equivalence is misleading because it offers neither the simplicity nor the explanation of a unification of physics. Energy involves the acceleration of density and volume of the inertial field: E = Volume x Acceleration^2 x Density^2. Breaking equivalency down to time and velocity allows the true equivalency of all concepts. Equivalency is infinite, and what we see in this one equation, mc^2, is only a special case by which we have incorrectly assigned energy as the smallest denominator in assessing all measurements, when in fact it is high on the list, being a product.

    All form is the result of resonances of space, or D, which are products of velocity and time. Particles are merely the interference patterns of those resonances and can be associated with various matrices of resonance. Under certain conditions it is clear that certain particles are associated with resonances, but at scales that subdivide those resonances, those measurements fail, since the subdivisions and associated particles are less than the products that resulted in said resonances. But all form is the result of matrices of velocity and time acting in various dimensions of resonance D of the ether space.

    The concept of an ether field means that there would be propagation of effects and action at a distance, but also, due to the inductive nature of fields, there has to be immediate action, since the existence of one is tied automatically to another in V and T. The differences between the natures of related fields are merely functions of time and velocity. The difference between a magnetic field and a current is time and velocity respectively, and the similarity is D, or resonance space. The force is a product of space, or D. So perhaps, in evaluating those differences, some are instantaneous and others propagating; what has value in assigning which is special, which is related to time, and which to velocity, has meaning through propagation and induction. And then perhaps a propagating resonant field does both: it acts at a distance, while those that are instantaneous are related to time and velocity. In my theory I refer to proper systems, which involve multiplication, and to V and T (velocity and time) within a single inertial field. This would be instantaneous, since V and T coexist; one does not exist without the other. It is said that energy is not a vector, but I say that since velocity and time are vectors, and energy is a product of vectors, energy is also a vector: E = D^5 = mc^2 = D^3T^2V^2. By scalar I mean addition, a condition involving more than one inertial field, or part of a field, where the measurement is outside the system of a single field. This would involve quadratic, polynomial, additive forms as well. Hence you have systems of unity of single systems and systems of propagation between multiple systems. Under such conditions you would have instantaneous and propagating systems. I would like to say that my definition of multiplicative is proper and additive is scalar or relative, and that all components, since they can be broken down to V and T, are vectors, and products are also vectors.
    The fields are real. Because you have scalar and proper systems, you have the duality: mass increasing with velocity and also decreasing with velocity, depending on whether the system is proper or scalar.

  • George James Ducas

    Well, if time slows down, I definitely don't want to approach the speed of light, unless I were Cinderella; then the carriage never becomes a pumpkin. So how am I going to leave the ball and get home without all that embarrassment, without approaching the speed of light? Of course, as the little bug creatures we are, we are addicted to speed. You cannot travel faster than the elastic collision of the medium, and for the ether superconducting fluid of outer space that is C. Long before that you will be catapulted to hyperspace D^4 at C^2; then you will have problems with time travel and position, since you will be across the galaxy in 10 seconds. The measurement of light is a scalar phenomenon that relates as follows:
    V + T = 1 = D
    (V^2 - 1) + (T^2 + 1) = V^2 + T^2 = V^2 + 1/V^2 = 1 = D = VT; @ T=1, V = C
    V^2 + 1/V^2 = C
    Phi^2 + 1/Phi^2 = 2.99
    Within a single system, that photon vibrates at 2.618, but the measurement is scalar, near 3.
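As an arithmetic aside (a check of the quoted numbers only, not an endorsement of the surrounding derivation): Phi^2 + 1/Phi^2 is exactly 3, not 2.99, since (Phi + 1/Phi)^2 = 5 and subtracting the cross term 2 leaves 3. A quick Python check:

```python
import math

phi = (1 + math.sqrt(5)) / 2    # golden ratio, ~1.6180339887

# phi^2 + 1/phi^2 = (phi + 1/phi)^2 - 2 = 5 - 2 = 3 exactly.
print(phi ** 2 + 1 / phi ** 2)  # ~3.0
```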
    Zero is a myth, for example

    T^2 + T - 1 = 0: here we see time in relationship to 0; it refers to a system of unity and transformation, with operators present. The equations of time vary, but there is no zero. At such levels, time as we measure it as a property does not exist at all, since everything is a product of time and velocity.
    So hyperspace will get Cinderella out quickly, but the device to do that is much more complex. I prefer teleportation: you assign a matrix that duplicates the pattern of what is being sent elsewhere, and it will appear there, since no two identical signatures can occupy two locations.
    C sets the scale of measurement and is a relative product.
    Well, constants set the scale; that should be no mystery. And the measurement of C is relative, since you are doing it from outside the inertial field of reference, a relationship which I define as V^2 + 1/V^2 at T=1 is C. So that question is no mystery as well. All things have equivalency; it merely seemed to start with E = mc^2, and that equivalency is to translate all things into V and T. The fact that it started with mass is simply a historical necessity, and mass as an assignment is purely arbitrary. Everything is a product of the resonant function D, which has an asymmetrical assignment of V and T. The mass you have assigned is but a portion of the products of functions within an atom, at best purely arbitrary. But for the manner in which you measure, to get a limited handle on the nature of an atom, it works for rudimentary exercises. So it is not necessary to make the question of equivalency so profound when the axioms are wrong to begin with.

  • George James Ducas

    All of physics and this universe can be reduced to two equations:

    phi + phihat = 1 = velocity + time = D, and

    phi x phihat = -1 = velocity x time = -D.

    From these I can derive all equations.

    All is resonant action D, reflecting the wave nature of all things.

    Equivalency is identified for all relationships, since the entire universe is a product of time and velocity.
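For what it's worth, the two relations above are numerically consistent if `phihat` is read as the second root of x^2 = x + 1 (that is, 1 - phi, about -0.618) rather than as the reciprocal 1/phi: Vieta's formulas for x^2 - x - 1 = 0 give a root sum of 1 and a root product of -1. A quick Python check under that reading (the interpretation is mine, not the commenter's):

```python
import math

phi = (1 + math.sqrt(5)) / 2   # golden ratio, one root of x^2 - x - 1 = 0
phihat = 1 - phi               # the other root, ~-0.618 (not the reciprocal 1/phi)

# Vieta's formulas for x^2 - x - 1 = 0: roots sum to 1 and multiply to -1.
print(phi + phihat)            # ~1.0
print(phi * phihat)            # ~-1.0
```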

  • Bob

    What is going on in this discussion?

    Fireworks below 1TeV points out (quite rightly) that it is unfortunate that the Randall-Sundrum paper has more citations than almost everything else, such as asymptotic freedom and some of the WMAP papers. Then Yeti disagrees, saying that it is as silly to cite those papers as it is to cite Newton.

    Citing WMAP is like citing Newton????? (so says Yeti)

    It's astonishing how illogical some people can be.

  • Fireworks below 1Tev

    @Phil 57
    DGP stands for Dvali-Gabadadze-Porrati. They invented a weird 5d model in which our world is confined to a 4d brane. They assumed the CC = 0, but obtained acceleration anyhow by choosing a parameter in the model accordingly. For a long time it was known to be fine-tuned, possibly ill-behaved quantum mechanically; it did not address the CC problem and had no UV completion. But the final nail in the coffin was when it was shown to violate causality. Despite these major problems it attracted (and continues to attract) much interest among astrophysicists and has over 1300 citations. On the other hand, the hep-th crowd has largely ignored it.

    f(R) is a sort of modified gravity. It can be rewritten as ordinary gravity plus a minimally coupled scalar field in the Einstein frame. In such a frame it doesn't obviously deserve to be called "modified gravity", but technically it is, I suppose, because the scalar couples to matter in a special way, namely via the trace of the stress-energy tensor, which is how we think of gravity. Such models are often built to describe dark energy, such as the 1/R model, but they are invariably at odds with solar-system tests of gravity, because the scalar introduces a long-ranged fifth force which causes light to bend in a way that is different from matter (since the trace of the stress-energy tensor is zero for light but not for matter). There are extremely contrived models that sidestep these problems, such as chameleon models, which don't make much sense from an effective field theory point of view.
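For readers unfamiliar with the equivalence sketched above, the standard rewriting (notation and normalization are mine, not the commenter's) trades the f(R) action for ordinary gravity plus a scalar:

```latex
% Jordan-frame f(R) action rewritten as a scalar-tensor theory:
S \;=\; \frac{1}{2\kappa^2}\int d^4x\,\sqrt{-g}\,f(R)
\;\;\Longleftrightarrow\;\;
S \;=\; \frac{1}{2\kappa^2}\int d^4x\,\sqrt{-g}\,\bigl[\phi\,R - V(\phi)\bigr],
\qquad \phi \equiv f'(R),\quad V(\phi) = \phi\,R(\phi) - f\bigl(R(\phi)\bigr).
```

A conformal rescaling to the Einstein frame then turns the scalar into a minimally coupled field whose interaction with matter enters through the trace of the stress-energy tensor, which is why light (traceless) and ordinary matter respond to it differently, as the comment notes.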

    I suspect that ordinary Einstein’s gravity with a CC (or possibly, but much less likely, an extremely weakly interacting slowly rolling scalar), plus small derivative corrections, is probably the only sensible theory of gravity that is consistent with all observations, that is internally consistent, and emerges as the low energy effective description of any plausible fundamental theory of quantum gravity.

  • melior

    Actually, the golden mean *minus* its reciprocal is 1; their sum is something like 2.236. And any non-zero real number times its reciprocal is +1, not -1, by definition. The rest was tl;dr, but do let us know the results of adjusting your TOE to reflect this simple sign error at the outset.
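melior's arithmetic is easy to verify (a minimal Python sketch, with `phi` standing for the golden mean):

```python
import math

phi = (1 + math.sqrt(5)) / 2   # golden mean, ~1.618

# The golden mean minus its reciprocal is 1; their sum is sqrt(5), ~2.236.
print(phi - 1 / phi)           # ~1.0
print(phi + 1 / phi)           # ~2.2360679 (sqrt(5))

# Any non-zero real times its reciprocal is +1, never -1.
print(phi * (1 / phi))         # ~1.0
```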

  • Jeh-Tween Gong

    With the recent data from the LHC, the Higgs game is de facto over, although some final verifications on a bump around 120 GeV are still pending. And down the tube also goes the supersymmetry fantasy. Now the only viable theory is Prequark Chromodynamics (PC), as M-theory is only a subset of PC.

    While the Higgs was not officially ruled out by Prequark Chromodynamics, there are at least four big reasons why the Higgs is a very bad idea.

    1. The Higgs is not needed in Prequark Chromodynamics.

    2. The Higgs (if a rock-bottom particle) violates the bottoming principle, explained in the book “Linguistics Manifesto”, available at Amazon.

    3. The Higgs is useless for linking the Standard Model to other realities, the Life sphere (pop), the Mathematics Universe (pop), etc.

    4. This physical universe sits on three pillars: 1) the Cabibbo angle (θc) ~ 13.5 degrees, 2) the Weinberg angle (θW) ~ 28.73 degrees, and 3) Alpha, the electron fine-structure constant (Beta = 1/alpha = 137.0359 …). The Higgs can do nothing to derive them, while they can easily be derived in Prequark Chromodynamics.

    As Prequark Chromodynamics did not become a mainstream theory, I should give a brief outline of its history here. The first paper on PC was written on December 4, 1979. Then a book, “Super Unified Theory — the Foundations of Science”, was copyrighted on April 18, 1984. Many reviews of Prequark Chromodynamics were listed in that book. I will list some key reviews below.

    a. Jon Machta (in March 2004) invited me to join the faculty at the University of Massachusetts, Amherst.
    b. Jainendra K. Jain (in February, 2004) invited me to join the faculty at PennState.
    c. Steven Weinberg (in 1996) — the outlook for Prequark is not good.
    d. Alan Harvey Guth (in 1994) — it [prequark] is very interesting.
    e. John Archibald Wheeler (in 1994) — it [prequark] has no test point.
    f. Sidney D. Drell (in 1983) — invited me to do research work at SLAC.
    g. The Physical Review (D. Nordstrom, in October 1983) — We have made no judgment on whether your work is correct or not, only that the subject matter is not suitable for the Physical Review.

    Up to this point, Prequark Chromodynamics has been, in fact, just a one-man show. Now I would like to invite all of you to join in.

  • Mark F.

    Hey Gong,

    I will list a key review about your Linguistics book below.

    a. Trey Jones (in 2010) — Linguistic hogwash

    I’ll take Trey Jones’ word over yours. I don’t know how you come up with all this crap, but you seriously need to get a life and stop throwing up all your crap on this blog.

  • Shantanu

    Btw, here is a must-watch talk by Avi Loeb. Although the talk is mainly astrophysics/cosmology-centric, its gist also applies to particle physics.

  • David Brown

    U(1) gauge coupling: .357
    SU(2) gauge coupling: .652
    SU(3) gauge coupling: 1.221
    ((1.221 + .652 + .357)/3)**(1/64) = .99537618…
    I conjecture that M-theory has a fuzzy energy tensor formulation. However, the preceding estimate is further from 1 than I anticipated. Does anyone have a theory that can explain the arithmetic mean of the 3 gauge coupling constants?
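The quoted arithmetic does reproduce as stated (the three coupling values are the commenter's, quoted at an unspecified scale; this just checks the mean and the 64th root):

```python
# Arithmetic mean of the three quoted gauge couplings, then the 64th root.
g1, g2, g3 = 0.357, 0.652, 1.221
mean = (g1 + g2 + g3) / 3     # ~0.743333
print(mean ** (1 / 64))       # ~0.9953762, matching the quoted 0.99537618...
```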

  • A Sean

    Fireworks below 1 TeV in didactic mode (e.g.@76, @51) is easy to read and follow. He or she should have a blog somewhere; perhaps with an entry on interesting papers in theoretical physics (and why they are important, even if not necessarily right), including at least some of those that are listed @55 & @56. [ On the other hand, his or her choice of a topical pseudonym and the direct tone in some other of the comments here were entertaining. ]

  • Lucy Haye

    The Higgs is an old fantastic invention, like the Neutrino, which doesn't exist because SR's equations are not applicable to DECAY, where Energy Conservation fails.
    The fantastic Higgs is like heat as a “fluid” from the time of medieval alchemy.
    See the New Paradigm that sends Science back to the Common Sense of Galileo-Newton and the Scientists of the 19th century.

    Lucy Haye, Ph.D.
    SAA’s representative

  • george briggs

    The Higgs is just dark matter: bosonic, spin 0. What is not so well known about it is that it is also gravitationally repulsive and of negative energy. It forms a stable fermibosonic particle in combination with an equal absolute mass of ordinary matter (fermionic). This is the entity that passes between universes to supply matter to the new universe. Note that it is of zero net mass, as required by Higgs theory.

  • george briggs

    Re #83: we certainly need a modern Galileo to tell us that the idea that all mass came into our universe at the time of the big bang is hogwash. It came in slowly with the growth of galaxies and was originally of fermibosonic form, with zero or very small net energy (mass).

  • Lucy Haye

    Re #85: The modern Galileo exists, and from the same country: Dr. R. L. Carezani, with Autodynamics (AD), a New Paradigm in Physics-Cosmology without FANTASIA; really with the COMMON SENSE of Galileo-Newton and the scientists of the 19th Century. The Big Bang is a TALE for children and absolutely anti-scientific, since it is a pure invention by Creationists.
    See please: The Autodynamics Concept of Mass
    And the Big Bang FANTASIA of the 20th Century

    Lucy Haye Ph. D.
    SAA’s representative.

  • Pingback: Current LHC Data and Supersymmetry; Is Supersymmetry in Trouble? | Of Particular Significance

  • Lucy Haye

    Do you want to see a New Paradigm without FANTASIA? Please go to:

    Lucy Haye Ph. D.
    SAA’s representative

  • Pingback: LHC lässt dem Higgs immer weniger Verstecke « Skyweek Zwei Punkt Null

  • Pingback: Now for the science bit… « Slugger O'Toole

  • Pingback: Why All the Focus on Supersymmetry? | Of Particular Significance

  • Pingback: Higgs Boson, God Particle: LHC, Tevatron Physicists Eye Breakthrough in Months – International Business Times | My Blog

