Synchronized time

By Sean Carroll | July 18, 2005 1:13 pm

Last week in Paris, I walked along the north-south line connecting the Observatoire de Paris to the Palais du Luxembourg. A line of longitude: in fact, the line of longitude, if the French had had their way a little over a century ago. A politico-scientific battle was being fought in the late nineteenth century over the location of the Prime Meridian. Parisians, thinking only of considerations of nature and philosophy, argued that the line of zero longitude should go through l’Observatoire; the rest of the world, crass materialists that they were, noted that over seventy percent of the world’s shipping was already using Greenwich (nine minutes and twenty-one seconds to the west of Paris) as its standard of longitude. The French lost out to the British, prefiguring a similarly heated tussle over who would host the Olympic Games over a hundred years later.
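That nine-minute gap is itself a unit conversion: the Earth turns fifteen degrees of longitude per hour, so a time offset translates directly into an angle. A quick back-of-the-envelope check:

```python
# Converting the Paris-Greenwich time offset to longitude.
# The Earth turns 360 degrees in 24 hours, i.e. 15 degrees per hour,
# or 1 degree per 4 minutes of time.
offset_seconds = 9 * 60 + 21           # 9 min 21 s = 561 s
degrees = offset_seconds * 15 / 3600   # hours x (15 deg/hour)
print(degrees)  # 2.3375
```

The result, about 2.34 degrees, is consistent with the actual longitude of the Observatoire de Paris east of Greenwich (2°20′14″).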

These issues figure prominently in the book I was reading during my trip, Peter Galison’s Einstein’s Clocks, Poincaré’s Maps: Empires of Time. It is a paradigmatic example of an engaging work of intellectual history, as it has a definite theme that is at once simple, interesting, and true. Einstein and Poincaré, the obscure German theoretical physicist and the celebrated French mathematician and philosopher, were pivotal figures in the development of the special theory of relativity, whose centenary we are celebrating this year. Relativity has a reputation as an esoteric theory, and Einstein and Poincaré are often thought of as abstract thinkers divorced from mundane matters of technology and experimentation. Galison argues convincingly that these thinkers’ practical concerns with the measurement of time — Einstein judging clock designs at his patent office in Bern, Poincaré as President of the Bureau of Longitude — were in fact crucial to their recognition of the need for a new understanding of the fundamental nature of time itself.

In a Newtonian universe, time is universal — the amount of time elapsed between two events is precisely and uniquely defined, even if the events are widely separated in space. It may be difficult to actually measure the time between events, and this task was a constant preoccupation of nineteenth-century astronomers, surveyors, politicians, and businessmen. It’s easy enough to use the sun to determine your local time, but the advent of railroads made it necessary (as several unfortunate accidents proved) to sensibly coordinate time among far-flung locales, a program that eventually led to our current system of time zones. In the course of standardizing time across broad expanses of geography, it became clear that synchronization was an operational concept — you had to bounce some signals back and forth between locations, and taking into account the travel time of the signals themselves was of primary importance. Poincaré’s work on longitude was intimately connected to precisely this problem, as was Einstein’s experience with novel clock designs. (At one point subterranean Paris featured tubes that would carry pulses of compressed air from a central station to clocks throughout the city, which would use the pulses as reference standards to guarantee as precise a degree of synchronization as possible. Einstein would have seen numerous proposals for electrical versions of such schemes.)
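The operational procedure described here (bounce a signal between stations, then account for its travel time) is essentially the same round-trip scheme used by modern network time protocols. A minimal sketch, with hypothetical timestamps and the simplifying assumption that the signal takes equal time in each direction:

```python
# Round-trip clock synchronization sketch (hypothetical numbers).
# Station A sends a signal at its local time t_send; station B stamps
# the arrival as t_b on its own clock and reflects the signal; A
# receives the echo at its local time t_recv.

def clock_offset(t_send, t_b, t_recv):
    """Offset of B's clock relative to A's, assuming symmetric signal paths."""
    midpoint = (t_send + t_recv) / 2.0  # A's time when the signal reached B
    return t_b - midpoint

# Example: A sends at 0.000 s, B stamps 5.010 s, the echo returns at 0.040 s.
# One-way travel time is 0.020 s, so B's clock runs 4.99 s ahead of A's.
print(round(clock_offset(0.000, 5.010, 0.040), 3))  # 4.99
```

Ignoring the signal's travel time would misstate the offset by the full one-way delay, which is exactly the error that mattered for telegraph-based longitude surveys.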

By itself, the need to synchronize time via exchanged signals does not lead you to relativity; it is equally characteristic of Newtonian absolute time. But when combined with the principle of relativity and the invariance of the speed of light, this insight led Einstein to understand that the notion of simultaneity of distant events is not universal, but depends on one’s frame of reference. (In general relativity, in which spacetime is curved, we need to go even further — the notion of simultaneity is not simply frame-dependent, it is completely ill-defined.) Time goes from being an absolute characteristic of the universe to something individual and personal, a measure of the distance traversed by a particular object through spacetime. Poincaré (following Hendrik Lorentz) had worked his way to similar conclusions, but it was Einstein who showed how to completely abandon the absolute Newtonian time that other physicists felt still lurked unobserved in the background.
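How simultaneity becomes frame-dependent can be made concrete with the Lorentz transformation of the time coordinate, t' = gamma * (t - v*x/c^2). A minimal sketch with illustrative numbers (a boost at 0.6c and two events one light-second apart):

```python
import math

def lorentz_t(t, x, v, c=299_792_458.0):
    """Time coordinate of the event (t, x) as seen from a frame moving at v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * (t - v * x / c**2)

c = 299_792_458.0
v = 0.6 * c  # boost speed; gamma = 1.25

# Two events simultaneous in the original frame (both at t = 0),
# separated by one light-second in space:
t1 = lorentz_t(0.0, 0.0, v)  # event at the origin
t2 = lorentz_t(0.0, c, v)    # event one light-second away; comes out to -0.75 s

print(t1)        # 0.0
print(t2 < 0.0)  # True: in the moving frame the events are not simultaneous
```

The larger the spatial separation, the larger the disagreement, which is why the effect never troubled anyone synchronizing clocks across a railway network at everyday speeds.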

Did someone say that scientists are individual idiosyncratic human beings? Gleaming mathematical edifices like the special theory of relativity can give the impression of having dropped from the sky; it’s nice to be reminded of the messy contingent ways that real people happen to stumble upon them.

CATEGORIZED UNDER: Science
  • http://afni.nimh.nih.gov/afni Robert the Red

I’ve thought that Einstein must have taken seriously the following two advances of the late 19th century:

- Maxwell’s equations, which lead to Lorentz invariance rather than Galilean invariance; and

- the experimental revelations that gross matter is in fact electromagnetic inside (it breaks into electrically charged pieces).

So Maxwell’s equations must be used to describe matter, so Lorentz invariance must be the way the world works, so the speed of light is invariant, et cetera. That is, by taking these principles seriously, special relativity follows.

    Formulating the principles to be taken seriously is the key difficulty, of course. Then deciding how far to push them is the next decision.

  • http://pithingcontest.blogspot.com greensmile

It’s about time. Can you go back? Not classically. Not according to Boltzmann, as I understand him. Relativity keeps giving me this Mona Lisa smile when I ask it… but I don’t speak the language. Here is an article Slashdot hasn’t got the physics chops to tackle. I am looking for a discussion of Amos Ori’s little foray into time travel. If someone has a sound opinion to the effect that the instability cannot be massaged out of the equations, I would be happy.

  • http://www.livejournal.com/~quantoken Quantoken

Einstein’s SR is surely based on the invariance of light speed. But everyone just accepts the invariance of c as an experimental fact and takes it for granted. No one gives much thought to why that is the case. If the whole universe were reconstructed from a set of basic physics laws, could one be constructed where everything else was equal, but the light speed varied?

My conclusion is that you need a set of reliable rulers to make spacetime measurements. You need three rulers: hbar, c, and a fundamental length R. Therefore c is fixed by design, because it is used as a ruler. Actually it plays into the actual physical rulers: the length of the material that constitutes a meter stick depends on the lattice constants, which are a multiple of the Bohr radius, which in turn is given by a formula relating hbar, c, the mass of the electron, and alpha.

Simultaneity is more problematic. Our concept of time flow is closely related to causal relationships. And the causal relationship is probably just a logical thing created in our minds, not a physical reality. For example, if you watch a movie, which is actually just a series of independent pictures played in a certain sequence, you observe a bunch of causal events. But if you take a look at the movie film, one picture really has no causal relationship to another. They may all well be made by computer programs, running at different spacetime points to ensure they can have no causal relationship between them. But once you play them in the proper sequence, you have the illusion of causality.

Very likely, the ultimate theory must abandon the concept of spacetime altogether, and construct everything from something more fundamental, i.e., quantum information. Then you construct the concept of spacetime from statistics. Therefore all observables are just statistics, like temperature is, and beyond certain scales they no longer exist; the same can be said about the theory that describes spacetime, i.e., GR: it does not exist beyond a certain small scale. This is what I have been saying: GR can extend only to where QM takes over, and go no further. That’s how you resolve the conflict between GR and QM.

    Quantoken

  • John B. Merryman

I’m not a scientist. I just spend a lot of time thinking about the little I know. This is an effort to make sense of this reality:

    The notion of time as a dimension is based on the irreversibility of macroscopic events, which creates our sense of past, present and future.

A very simple and basic point has been overlooked here: while the arrow of time for the observer proceeds from past events to future ones, the arrow of time for these circumstances proceeds from being in the future to being in the past.

    Einstein didn’t fully apply the basic principle of relativity to time. Relatively speaking, content and context go in opposite directions. To the hands of the clock, it is the face going counterclockwise.

    Reality consists of energy recording information. As the amount of energy remains the same, old information is erased as new is recorded. This information is a product of relationships of the manifest energy. As there is no absolute frame, any action is balanced by an “equal and opposite” reaction. Objective reality is the energy. Time is only a function of the subjective information.

    The unit of time goes from beginning to end, but the process of time is going toward the beginning of the next, leaving the old. A day is measured by the sun rising in the east and setting in the west, but the reality is the earth is rotating west to east. As our day fades, others are dawning.

Think of a factory: the product moves from initiation to completion, but the production line faces the other way, with its mouth consuming raw materials and finished product being expelled. Life is the same. Our lives are units of time going from beginning to end, while the process of living goes on to the next generation, shedding the old like dead skin. Do we go through time, or does time go through us?

    The mind is a form of factory and the products are individual thoughts. Our senses continually take in information and out of this, we construct coherent conceptual units. Meanwhile the mind continues to absorb fresh information and as the particular thought matures, it factors in less of this additional information. At some point a new thought has taken shape and displaced the previous thought.

    Time then, isn’t a dimension because the frame of reference does not constitute an absolute against which the point of reference transcribes another dimension. It is a process in which the point and frame move relative to their respective influence on one another.

    A clock which would represent this process would have no face, but innumerable hands going both directions, at various speeds. The sum of this motion would be zero. Subtracting out any particular hand as a point of reference would leave the remaining hands with a net motion in the opposite direction. As we are part of that frame of reference, we only see the hand move.

    Time is not so much a projection out from the present event, as it is a coming together of factors to define what is present. The past is the influences which defined current order and the future is the sources of energy which will motivate that order. When the order of the past is an open set, it absorbs fresh energy, defining it, so the future is a continuation of the past. When the order is a closed set, the energy accumulates in open spaces and the future is a reaction to the past.

    Time is the fundamental manifestation of complexity theory, with the present as the phase transition between order(past) and chaos(future). Like a collapsing wave function, the future is pure circumstance, as a potentially infinite number of factors are involved, while the past is pure judgment, because every possible influence has been factored. Of course, in reality, the wave never actually collapses. It just keeps rolling along, with the medium effectively going the opposite direction.

    Time is a tensor method of measuring motion. Temperature is a scalar method of measuring motion.

    Consider statistical measures of the economy as a form of temperature reading, a general level of activity against a prevailing scale. Now if we were to follow an individual through the larger economy, or the activity of a particular atom in a fluid, it would be a tensor measure. The reason this form of measurement matters so much to us is that we are that individual.

(The next part conflicts with the basic paradigm of modern cosmology, the Big Bang Theory. While simplistic, it does tie into my overall premise, so I include it at the risk of disagreement.)

This understanding of time originally grew out of a basic impression that the Big Bang Theory was wrong. It first occurred to me in reading Stephen Hawking’s A Brief History of Time, when he raised the point that Omega = 1. In other words, for the universe to be as stable as it is, the force of expansion must be in inverse proportion to gravitational collapse. If universal expansion is completely balanced by gravitational collapse, where do they come up with the additional expansion for the universe as a whole to be growing? It makes more sense as a convective process, with (hot) radiant energy expanding and (cold) mass collapsing. In this context, the expansion of light waves, which creates redshift, is as fundamental to light itself as gravity is to mass. These two impulses balance because they are different sides of the same process.

    Out of this, I’ve managed to construct a very simple model of universe, with few loose ends.

    If light is effectively expanding, but the universe isn’t, this would cause additional pressure on existing gravitational systems, resulting in the excess spin to galaxies that is currently ascribed to dark matter. It would also explain the Pioneer effect. In a sense then, if we think of gravity as curving space one way, then light curves it ever so slightly the other way. The effect over inter-galactic distances adds up.

The concept of dark energy is proposed to explain why the redshift of distant sources is proportionally less than that of closer sources. As the BB model assumes the earliest expansion is due to the initial singularity, the question is what causes the additional effect. If we accept that this redshift is a property of light itself, the question then becomes the opposite: what reduces the redshift of the farther sources? I think it is because this light goes through more intermediate residual gravity fields, and the lensing effect amounts to a blueshift, which neutralizes some of the redshift of this light.

The reason the cosmic microwave background radiation is smooth is that space can only hold a certain level of radiation in solution before it starts to condense out as basic forms of matter. So 2.7 K is the phase transition.

    Galactic black holes are basically the eye of the storm and most of the activity is what we see, the collapsing mass and radiating energy.

    How this ties into my observations about time is that the arrow of time for content is collapsing mass, going from birth to death, while the energy radiating back out is context, continually expanding, both in the creation and growth of new matter and the radiating away of old matter. Remember that Einstein was compelled to add the cosmological constant because his theory of gravity had the universe collapsing to a point and he only had that one dimension/direction of time.

I have a few other snide asides to Big Bang Theory, such as: if space is created at the singularity, then it would take light as long to cross the initial universe as it does the present one, so why does light not slow down as we peer back to the beginning? If light is constant, then space is constant, and all these old galaxies which must have formed in the first few hundred million years would only have had a few hundred million years to fly away from each other, and should all be bunched at one point in the sky. Yes, I know, Inflation Theory. Universal Fudge Factor.

As for quantum theory, this focus on subatomic reality as a function of discrete particles results in confusion. If we were to look at it from the other direction, where process is fundamental and the particles are secondary, then we would be thinking of these particles as nodes in a network, rather than specific objects. Rather than photons as little BB-like objects, think of them as a cross section of the trunk of a lightning bolt, or a tree; the reason they transmit specific quanta of energy has more to do with transition factors of the system than with the weight of a particular object. This would go a long way toward explaining the connectivity issues.

Waves are not a concept of matter opposed to particles, but simply a recognition of the underlying field from which they rise, so both waves and particles constitute units, and it is the measuring process which is the opposing context. Strings are another attempt to define the unit as fundamental.

The fact of the matter is that the scientific establishment is about as likely to reconsider its standard models as the Catholic Church is to reconsider its dogma. Like any entity, it will eventually fall, and the springtime of the succeeding era will flourish with fresh ideas. The fact is, though, this pattern applies across many aspects of established paradigms, from religion to politics and economics. As this convective cycle is also applicable to issues in all these fields, I think there is the potential to combine it into one larger worldview that could not be bottled up by any particular ruling establishment.

    Monotheistic religion contains logical fallacies which are at the base of many of our other problems. It is based on the assumption of the spiritual absolute as being at the intellectual apex. For one thing, the absolute is the equilibrium around which opposing elements are balanced, so if one is to propose a spiritual absolute, it would be the most elemental form of consciousness out of which we rise and to which we fall, not an omnipotent being from which we fell and seek to return. Good and bad are not some overarching metaphysical duel, but the binary code of the biological computer and the intellect is a bottom up process of distinction and judgment. While this mistake is understandable, given the age of the concepts involved and the political circumstances they evolved under, by assigning both source and direction to the apex, it creates a logical paradigm in which the political models that grow out of it invest all authority to those at the top, without the necessary testing and questioning a ground up model demands of its efforts. Science understands that reality is fundamentally bottom up and incidentally top down, but our current primary religious theory doesn’t take this reality into account. While we have managed to shed the monarchism that was most directly founded on this assumption of the ruler as closest to the rule of God, there are very strong elements in the current administration which are operating under the influence of this concept.

    I must add that I am not an atheist. Theism assumes consciousness creates order out of chaos, while atheism assumes order creates consciousness out of chaos. Order and chaos are descriptive states of logic and the relationship does not adequately explain consciousness, so I am left to assume it is some base property of which we are a magnification and consolidation. It is not a unit, per se, because as I explained in the description of time, all units are components of larger processes. One is a set. Oneness is a state.

    As every point in space is the center of its universe and a subjective point of reference is required to create any frame of reference, it should be noted that reality is objectively chaotic and only subjectively ordered. This rules out a fundamentally deterministic reality, as the puppet pulls back on the strings, giving meaning to whatever is on the other end. We are a factor.

The current political balance between Republican and Democrat is a good example of the convective process: the republic is a unit, and as such it is governed from the top down. Democracy is a process; it gains its legitimacy from the bottom up. Communism and capitalism are another. As Communism proposed the state and economy as one unit, it became necessary to govern it from the top down, contrary to Marx’s original assumption that the state would “wither away.” Capitalism proposes the economy as an ecosystem, a process in which individual corporate units rise and fall according to their vitality. In fact, as Russian communism fell due to its destruction of basic initiative, Chinese communism has managed to incorporate itself as a very effective corporate unit within the jungle of world capitalism.

On both the political and religious fronts, there is a common mistaking of extremes for absolutes, as our linear thinking leads us to believe that if we push hard enough in any one direction we will achieve some final solution, rather than just building up momentum for the pendulum to swing ever further in the opposite direction, always around the point of balance, which is the true absolute. If we can understand this cyclical reality, we might be able to use it to the advantage of life on this planet, not just be an intellectual victim of it.

The economy is also a convective cycle: energy in the form of labor, materials, and ideas rises up, while wealth, civil order, and social security precipitate down. Supply side theory has created a situation where far more has been rising than is effectively used or precipitating down, and the result is huge storm clouds of surplus wealth boiling over a parched economy. For reference, consider where the money the government borrows would go if it were not being recycled through the public sector. We already have a situation of serious asset inflation, and this money would just increase the effect. Government borrowing is effectively a nationalization of surplus wealth, but rather than actually taking it, the revenue stream of the government is being transferred to those with surplus wealth in the first place, which only adds to the problem.

    The fact that Social Security is a direct transfer is one of the primary reasons it is so efficient. Only as much money can be saved as can be invested and there is a dearth of investment vehicles in the current situation. It is a situation similar to the electric industry. As it would be prohibitively expensive to build the battery storage for the amounts in question, it has to be used as it is generated. Creating the investment vehicles necessary to store private accounts would be like storage batteries for the electric industry.

I first started questioning economic pronouncements when trying to figure out how Paul Volcker cured inflation by raising interest rates. Yes, inflation is started by loose money, but reverse engineering is not always so simple. By raising interest rates, his solution for the oversupply of money was to raise its cost! Government borrowing is what brought inflation under control: after supply-side economics squeezed it out of the general economy, the government skimmed it off the top and then spent it. As public spending supports private investment, rather than competing with it, the effect was compounded. The boom of the ’80s and ’90s had more to do with the baby boom going through its most productive years than anything else.

In 1996, Bob Dole had a campaign slogan, “We want you to keep more of your money in your pocket.” My first thought was, “Well, thank God it isn’t my money, or it would be worthless.” The logic behind this insight is that as a medium of exchange, money is actually a form of public commons, much like the highway system. Under our current ideology of individualism, we assume it is private property. To use the roads as an analogy, it would be as if every time a new road was built, everyone tried to claim as much of it as possible. The eventual result would be that everything would be paved over and no one would be able to get anywhere. We are close to reaching that situation with our monetary system, as every aspect of life is judged according to the bottom line and the economy is still about to seize up.

Money and government are two sides of the same coin. One is rights, the other is responsibilities. Money is like processed sugar, so if we were to learn to maintain a more organic, holistic society, and maintain wealth and value within every aspect of our lives rather than continually draining reductionistic units out to put in some bank, then government would be forced to organize itself along similar lines.

Be it math, money, or god, the abstract is an approximation of reality, rather than reality being a manifestation of the abstract.

This covers a lot of disciplines, but it does so within the framework of a basic pattern.

    Regards,

    John B. Merryman Jr.
    Sparks, Md.

  • Pingback: Relative importance | Cosmic Variance

About Sean Carroll

Sean Carroll is a Senior Research Associate in the Department of Physics at the California Institute of Technology. His research interests include theoretical aspects of cosmology, field theory, and gravitation. His most recent book is The Particle at the End of the Universe, about the Large Hadron Collider and the search for the Higgs boson. Email: carroll [at] cosmicvariance.com.
