From Eternity to Book Club: Chapter Two

By Sean Carroll | January 26, 2010 7:41 am

Welcome to this week’s installment of the From Eternity to Here book club. Today we look at Chapter Two, “The Heavy Hand of Entropy.”

[By the way: are we going too slowly? If there is overwhelming sentiment to move to two chapters per week, that would be no problem. But if sentiment is non-overwhelming, we’ll stick to the original plan.]

Excerpt:

While it’s true that the presence of the Earth beneath our feet picks out an “arrow of space” by distinguishing up from down, it’s pretty clear that this is a local, parochial phenomenon, rather than a reflection of the underlying laws of nature. We can easily imagine ourselves out in space where there is no preferred direction. But the underlying laws of nature do not pick out a preferred direction of time, any more than they pick out a preferred direction in space. If we confine our attention to very simple systems with just a few moving parts, whose motion reflects the basic laws of physics rather than our messy local conditions, there is no arrow of time—we can’t tell when a movie is being run backward…

The arrow of time, therefore, is not a feature of the underlying laws of physics, at least as far as we know. Rather, like the up/down orientation in space picked out by the Earth, the preferred direction of time is also a consequence of features of our environment. In the case of time, it’s not that we live in the spatial vicinity of an influential object, it’s that we live in the temporal vicinity of an influential event: the birth of the universe. The beginning of our observable universe, the hot dense state known as the Big Bang, had a very low entropy. The influence of that event orients us in time, just as the presence of the Earth orients us in space.

This chapter serves an obvious purpose — it explains in basic terms the ideas of irreversibility, entropy, and the arrow of time. It’s a whirlwind overview of concepts that will be developed in greater detail in the rest of the book, especially in Part Three. As a consequence, there are a few statements that may seem like bald assertions that really deserve more careful justification — hopefully that justification will come later.

Here’s where I got to use those “incompatible arrows” stories I blogged about some time back (I, II, III, IV). The fact that the arrow of time is so strongly ingrained in the way we think about the world makes it an interesting target for fiction — what would happen if the arrow of time ran backwards? The straightforward answer, of course, is “absolutely nothing” — there is no prior notion of “backwards” or “forwards.” As long as there is an arrow of time that is consistent for everyone, things would appear normal to us; there is one direction of time we all remember, which we call “the past,” when the entropy was lower. It’s when different interacting subsystems of the universe have different arrows of time that things get interesting. So we look briefly at stories by Lewis Carroll, F. Scott Fitzgerald, and Martin Amis, all of which use that trick. (Does anyone know of a reversed-arrow story that predates Through the Looking Glass?) Of course these are all fantasies, because it can’t happen in the real world, but that’s part of the speculative fun.

Then we go into entropy and the Second Law, from Sadi Carnot and Rudolf Clausius to Ludwig Boltzmann, followed by some discussion of different manifestations of time’s arrow. All at lightning speed, I’m afraid — there’s a tremendous amount of fascinating history here that I don’t cover in anywhere near the detail it deserves. But the real point of the chapter isn’t to tell the historical stories, it’s to emphasize the ubiquity of the arrow of time. It’s not just about stirring eggs to make omelets — it has to do with metabolism and the structure of life, why we remember the past and not the future, and why we think we have free will. Man, someone should write a book about this stuff!
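
The chapter has to leave Boltzmann’s statistical definition of entropy at the level of words, so here is a rough back-of-the-envelope illustration of the counting involved (my own toy model, not an excerpt from the book). Take N distinguishable particles in a box divided into two halves; a macrostate is just “how many are on the left,” the number of microstates W is the binomial coefficient, and the entropy is S = k log W. The nearly even splits utterly dominate the count, which is the sense in which high entropy is generic and low entropy is special.

    import math

    # Toy Boltzmann counting: N distinguishable particles in a box split
    # into two halves. The macrostate is n, the number on the left side;
    # its multiplicity is W = C(N, n), and S = k log W (here k = 1).
    def log_multiplicity(N, n):
        return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

    N = 1000
    print(log_multiplicity(N, N // 2))  # ~ 689: the even split has the most microstates
    print(log_multiplicity(N, 100))     # ~ 322: a lopsided split is far lower entropy
    print(log_multiplicity(N, 0))       # 0: everything on one side, a single microstate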

CATEGORIZED UNDER: Time, Words
  • Rob

    Sean:

    One of the three properties of time that you mentioned is that it orders events. At the quantum level, superposition of states implies an absence of ordering until the wave function collapses. Does this imply that time is an emergent phenomenon? And, if we wind the universe back to the big bang singularity, could this imply that the low entropy at the beginning is a quantum property?

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Rob, I don’t think it’s right to say that quantum mechanics implies that events aren’t ordered. It just becomes the case that the wave function is ordered: at one time it has a certain value, at the next moment it has some other value, and so on. And it certainly doesn’t imply a low-entropy beginning; there would be nothing incompatible about a high-entropy beginning and quantum mechanics.

  • Fenn

    I vote for 2 chapters a week.

  • Rex

    The Past Hypothesis that you mention at the end of chapter two is used to explain the relative order that we observe. But it seems like this raises as many questions as it answers…

    So we have our orderly observations and we want to explain them. To do this, we need some context to place our observations in. So we postulate the existence of an orderly external universe that “causes” our observations. But then we want to explain what caused this external universe…and the only option is to postulate the existence of a much larger multiverse. But then what explains the multiverse? A multi-multiverse?

    This leads to the need for an infinite series of ever larger contexts against which to explain the previous context that we used to explain the previous context that we used to explain the fact of our initial observations.

    It would seem that nothing can be explained in terms of only itself. To explain it, you have to place it in the context of something larger. Otherwise, no explanation is possible, and you just have to say, “this is the way it is because that’s the way it is.”

    Basically it seems to me that there are only two ways the process can end. Two possible answers to the question of “Why do I observe the things that I observe?”:

    1) Because things just are the way they are, and no further explanation is possible.

    2) Because EVERYTHING happens, and so your observations were inevitable in this larger context of “everything”.

    Do you see some other option? Some flaw in the reasoning?

    In either case it seems like science doesn’t so much explain things as just describe (in a compressed and somewhat “lossy” format) what we observe. A subtle but significant difference I think. Do you see any significance in the difference?

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Yes, I think there is a flaw. We need to explain the low entropy of the observed universe, because it seems very unnatural to us. But if we can find an explanation, e.g. in terms of a multiverse, it may very well be that there’s nothing at all unnatural about the evolution of that multiverse. Otherwise it wouldn’t be much of an explanation at all.

    The thing that needs to be explained is not “the universe exists,” but rather “our observable part of the universe begins in a very finely-tuned state.”

  • NicoleS

    I vote for keeping to 1 chapter a week. Some of you crazy smart people might just be breezing through this book, but I think there are a few time newbies in the group, and as one of them, I need the week to read, re-read, and process what I just read.

    But I won’t drop behind if the votes are overwhelmingly for 2.

  • paul valletta

    Would you say that Entropy had a specific moment it started, and what sort of particle density was present at this moment?

    Does Entropy have a “constant” speed, or is there a varying speed of Entropy? If so, will its speed of conversion be greater in the future?

    Is a Particle’s size relative to its contribution to Entropy?
    Sorry I have not been able to purchase your book just yet, but I am dropping the birthday present hints on a regular basis!

    best wishes pv.

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Paul– entropy can be defined at any time, so it didn’t really “start.” The speed at which it changes can be highly variable, depending on what’s going on. And no, a particle’s size has no direct connection to its entropy.

  • Philoponus

    The reversibility of our physical laws, you say, means that “if we knew the present state of every particle…we could deduce the future as well as the past.” (top p.43) I’m trying to figure out how we would DEDUCE the future. Suppose we slightly re-define possibilism as the view that the future MAY not exist. A possibilist could then ask, is there anything in our physical laws (+ the current state of the universe) that guarantees the universe will still exist at all in 3 minutes, or still exist with the same physical laws? To deduce the future, don’t we need this kind of guarantee?

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Sure. If the laws of nature change in an unanticipated way (which “the universe ceases to exist” would certainly count as), you wouldn’t be able to predict the future. We are assuming that the laws are both known and reliable.

  • Nick

    Firstly. Thanks for the book Sean. I am greatly enjoying it.

    I have a question related to the first paragraph of page 41 (it actually starts on page 40).

    It seems to me that this paragraph is making a probabilistic argument that the low entropy boundary condition at the start of the universe is required to trust our memories and thus the arrow of time. In other words, if “the past” had a high entropy, then it is less probable that a memory of a particular event in the past is accurate, than it is that that memory (and perhaps the person/brain that experiences that memory) exists on its own as a statistical fluctuation away from that high entropy state.

    Firstly, could you please correct me if I have misunderstood this paragraph.

    Secondly, if I have understood this correctly, does this mean that if there were not a low entropy boundary condition in the past, then it would not really be possible for us to be confident that there is an arrow of time? Also, would this leave us trapped in a Boltzmann brain style paradox?

  • http://jacobrussellsbarkingdog.blogspot.com Jacob Russell

    I suppose for a physicist this is a quibble, as you know what you mean when you speak of the ‘laws’ of physics. That it’s an idiom, a quirk of usage, is not important as long as it doesn’t in itself bear the burden of explanation. As a writer and poet who happens to be fascinated by science, but very much an outsider… the usage bothers me. A lot.

    As a poet, I find that linguistic and cultural associations do matter, and ‘law’ suggests agency. A most unfortunate overlap there with the Intelligent Design folks (who or what passed these so-called ‘laws’?). More than that, it suggests a fixity which, in the popular mind, leads to misunderstanding of how scientific theory develops. What is accepted as constant and unchanging, under different conditions and assuming additional evidence, necessitates modification… (so what is that about? one might ask. Do ‘laws’ change? Are these laws really not laws at all? Is that what ‘relativity’ is all about? That everything is relative and there’s no ‘truth’ and science is just another religion?)

    You see what I mean?

    The use of this metaphor, drawn from legal usage, carries with it a whole set of interconnected associations, which, if you are not scientifically trained–or aware of scientific usage–is fraught with misunderstandings.

    Out of concern for the tremendous importance of basic understanding of science for the general public, I think this deserves some attention–a clear, philosophically tuned explanation of how this word is used and what it does and does not mean. Maybe even a new word.

    Thank you for taking the time, and taking seriously the task of making science, at the deepest level, accessible to the general public. There is no overstating the importance of what you are doing here on Cosmic Variance–all of you who contribute to this forum are my personal heroes!

  • inDistinctMicrostate

    Sean? Really?

    “…we can’t tell when a movie is being run backward…”

    To your credit, at least you said “…The arrow of time, therefore, is not a feature of the underlying laws of physics, at least as far as we know…”

    Of course the arrow of time is due to underlying physics, as overwhelming observations suggest; you just haven’t identified how yet.

    When will physics realize that the whole Time Symmetry argument is fallacious? As I understand it, the time symmetry notion can be argued by noting that the Feynman diagram of, say, a pair creation:

    photon → e- + e+

    … is indistinguishably correct when read from left to right (forward in time) or right to left ( ALLEGEDLY, the backward in time direction).

    Time symmetry (bi-directionality) is easily shown to be fallacious because if we were to observe either of these half reactions in isolation, we would say that time is always proceeding in its forward direction. A pair annihilation (right to left in the above), when seen in isolation, IS occurring in the forward direction. The naive error here is the failure to realize that reading the above Feynman diagram right to left is absolutely equivalent to reading it upside down, left to right, in a forward time direction.

    The unmeasured wavefunction is always in our past, a measurement is always in our ‘now’ and the wavefunction with its increased entropy is in our future. The entropy inexorably increases because the wavefunction ‘remembers’ or ‘incorporates’ the interaction in its interactions history and the new wavefunction is now the set of all allowed states that a new measurement might reveal, consistent with all previous interactions the system has been involved in.

    So, it is obvious that the arrows of time and entropy are driven and sustained by Conservation of information( interactions history). By the way, since NO system can avoid ALL interactions, immortality is impossible in principle and that is the quantum mechanical basis of ageing. Ageing is simply every system’s tendency to become progressively indistinguishable from those systems it interacts with ( its environment ).

    You collect the Nobel, I’ll take the money, lol.

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Nick– Everything you said is correct. We’ll cover this ground much more thoroughly in Part Three.

    Jacob– What can I say except that it’s too late? “Law” is very deeply ingrained in scientific vocabulary by now. It is certainly not meant to imply any notion of agency; almost the opposite, in fact. It’s also not used very precisely — not because the usage is sloppy, but because there is no precise definition.

    inDistinctMicrostate– The microscopic laws of physics, as we currently understand them, are perfectly reversible. We talk about this in great detail in chapter 7. The collapse of the wave function is the one possible exception, as we’ll discuss in chapter 11. Irreversibility comes from the state, not from the laws.

  • Lawrence Kuklinski

    Hello,

    Thank you for the great book. Your thoughts keep a mind fresh!
    How does entropy affect vacuum energy?

    Larry Kuklinski

    PS: One Chapter per week is my vote. I have the time!
    For your coffee quest consider:
    http://www.starbucks.com/flash/sirena/default.htm

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Lawrence– As far as I know, entropy doesn’t affect vacuum energy, it’s the other way around. As we’ll discuss later, the amount of vacuum energy determines the amount of entropy you can fit within an observable patch of the universe.
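
For readers who want an equation to hang that on, one standard way to see the connection (a sketch in natural units with $c = k_B = 1$, not necessarily the exact argument the later chapters will use) is through the de Sitter horizon. A vacuum energy density $\rho_\Lambda$ corresponds to a cosmological constant $\Lambda = 8\pi G \rho_\Lambda$, the associated horizon radius is $r_H = \sqrt{3/\Lambda}$, and the maximum entropy of the observable patch is the horizon area in Planck units,

$$ S_{\rm max} = \frac{A}{4 G \hbar} = \frac{4\pi r_H^2}{4 G \hbar} = \frac{3\pi}{G \hbar \Lambda}, $$

so a smaller vacuum energy means a larger horizon and more entropy that can fit inside it.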

  • Tom Allen

    Hi Sean,
    First, thanks for the great read. Couldn’t put it down and have plowed all the way through (including all the end notes). I’ve enjoyed your style and am impressed by all your historical research.

    I am content to re-read a chapter a week to stay with the discussion (and to prolong this enjoyable exercise as long as possible).

    I think your proximity model (earth/up-down; big bang/future-past) elucidates your point quite well. The up/down distinction is mediated by GR where the earth curves space-time and we behave as the “law” requires in curved space-time. So the mechanism by which the future/past distinction is enforced is purely statistical in nature? Sounds like the objections Boltzmann encountered.

  • Lawrence Kuklinski

    It seems the Dark side is the flip side of the observable universe.
    A rollercoaster up then down to the dark tunnel.

    lmk

  • http://www.math.brown.edu/~lubinj Jonathan Lubin

    Great achievement, this book, but I doubt that I’ll be able to keep up with the club, since I have several others open now. But I found my first error in it (gloat): P. 39, line 2, should be “describing a disk about half a degree across.” But I repeat: the book’s terrific.

  • marc

    Sean–this may be a silly question, and I haven’t really thought about it. But you say that the microscopic laws of physics are reversible. But in kaon and b meson decays, CP (and thus T) is violated. What does “reversible” mean in the context of C,P and T? I would guess that it might mean CPT invariance, not T, but what if CPT is violated at some level?

    Upon further thought, it seems to me that CPT invariance is sufficient to establish reversibility. But there are lots of models that violate CPT (albeit often Planck-scale suppressed). Would an observation of CPT violation change your basic thesis?

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Tom– Yes, the past/future distinction is statistical. But the statistics are really good! Boltzmann did run into this problem, but also much bigger problems. Very impressive that you read all the notes!

    Jonathan– Good catch. If that’s the biggest error in the book, it will be miraculous.

    Marc– Not silly. “Reversible” really means “unitary” or “information-preserving” — given the state now, we can reliably construct a unique state at any fixed time in the past. Related to time-reversal invariance, as we’ll discuss to death in Chapter 7, but not quite the same. If laws are reversible, you can always construct some sort of souped-up notion of time-reversal (like CPT) that will be conserved. I suspect the converse is true, but don’t know a theorem off the top of my head.

    All my basic points rely on reversibility, but not directly on time-reversal invariance, so CPT violation wouldn’t matter too much as long as reversibility were maintained. (And if it isn’t, see Chapter 15 — I still don’t think it helps.)
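
To make Sean’s gloss of “reversible” as “information-preserving” concrete, here is a minimal toy sketch (my own illustration, not from the book). A dynamical law on a finite set of states is reversible exactly when it is a bijection: every state has a unique predecessor, so the present determines the past as surely as the future, while a many-to-one law destroys information.

    # Reversible vs. irreversible dynamics on the states {0, 1, ..., 5}.

    # A reversible law is a permutation: each state has exactly one predecessor.
    forward = {0: 3, 1: 5, 2: 0, 3: 4, 4: 1, 5: 2}
    backward = {v: k for k, v in forward.items()}  # its inverse

    def evolve(state, steps, law):
        for _ in range(steps):
            state = law[state]
        return state

    start = 2
    now = evolve(start, 10, forward)
    assert evolve(now, 10, backward) == start  # the past is recoverable from the present

    # An irreversible law: states 0 and 1 both map to 3, so a present state
    # of 3 no longer determines a unique past.
    lossy = {0: 3, 1: 3, 2: 0, 3: 4, 4: 1, 5: 2}
    print(evolve(0, 1, lossy) == evolve(1, 1, lossy))  # True: two pasts, one present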

  • http://teenageelephant.blogspot.com Sam Wolk

    If the boundary for the observable universe is the Big Bang and the theory of this book is that there was something else “before” it/ there is a multiverse etc., how can we know about what happened “before” if it is not observable? It seems like the answer to that would be that what happened “before” has an effect (dark energy, if my understanding is correct, isn’t observable but we can observe its effect – the accelerating expansion of the universe). If something isn’t observable and has no effect, then its existence doesn’t really matter, right? So my question is (and perhaps you reach it later in the book) what are the effects of the multiverse/the other theories presented in the book that lead us to theorizing their existence? Is it maybe that what happened “before”, or the multiverse etc., makes everything that we DO observe the way it is? I guess the rest of the book might be taking what effects we do observe and trying to prescribe causes to them (i.e. the multiverse etc.) so sorry if this question is what you try to explain in the rest of the book.

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Sam– The idea would be that the observable impact is the low-entropy initial configuration of our universe. To gather confidence that this is the right explanation, of course, requires a great deal better understanding of what the underlying theory is and what it predicts. I talk about this issue a bit in the epilogue.

  • http://lablemminglounge.blogspot.com/ Lab Lemming

    Sean, why do you describe the entropy of the Universe as ‘low’? Most observable (non-dark) matter is gas of some sort, and thus has much higher entropy than if it had all condensed out into a crystalline solid.

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    That would be true if gravity weren’t important, but it is. The smooth configuration of the early universe is very unnatural and low-entropy; a wildly inhomogeneous mess would be higher entropy.

  • J

    Sean,

    First my vote – one chapter/week.

    Now a question. I’m trying to get my head around the idea of remembering the future and not the past in a decreasing-entropy universe.

    Let’s say I’m measuring the interval between events by the β- decay of some radioactive pile. Event A happens, 200 detector clicks happen, event B happens, in that order. Even if I lived in a universe where the pile was going to absorb electrons and anti-neutrinos (or would it be positrons and neutrinos? damn it gets complicated), wouldn’t I still record the events as “A happened, 200 clicks, B happened”? That is to say, wouldn’t I still perceive B as occurring after A?

    By the way I finished the book, great read from beginning to end.

  • HG

    I vote for one chapter per week.

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    J– You can always measure the time between two events. But if there is no entropy gradient, and therefore no arrow of time, there would be no reason to claim that one event is “before” and the other “after” — all you would say is that they occurred a certain time apart.

  • Jason A.

    “The smooth configuration of the early universe is very unnatural and low-entropy; a wildly inhomogeneous mess would be higher entropy.”

    I thought it was the other way around. A homogeneous mixture is high entropy (gas particles evenly distributed around the room, the most statistically likely distribution). Inhomogeneous would be low entropy (all the gas particles in one corner).

    What do you think of the idea that the universe began in a high entropy state, but as it expanded the ‘maximum allowed entropy’ increased, so that we get something like increased ‘room for entropy’. Then we allow the universe to increase in entropy over time without having to postulate it was in some unlikely low entropy state in the past. Picture it like gas particles spread evenly around the room, but we increase the size of the room faster than the gas can expand to keep up so we end up with all the gas in one corner. I don’t know if I’m wording that well but you’ve probably heard this idea before.

  • J

    “all you would say is that they occurred a certain time apart”

    Ok, thanks. I now have the part about the complete recording: all I can say is that 200 clicks occurred between event A and event B. But what if I sneak a peek at 150 clicks? What do I see then? When I look again at 200 clicks, am I able to differentiate what the condition was at 100 and at 150 clicks from what it is now?

  • http://www.7duniverse.com Samuel A. (Sam) Cox

    “In the case of time, it’s not that we live in the spatial vicinity of an influential object, it’s that we live in the temporal vicinity of an influential event: the birth of the universe.”

    Just incredible Sean! So important, so basic- and so profound.

    I’m about 2/3 of the way through the book, reading it slowly- and enjoying it immensely. This is a very interesting conceptual work, tied to the history of science…very objective. You carefully note key loopholes and logical lapses.

    Thanks for taking the time….

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Jason– You’re neglecting gravity. With gravity, a dense fluid with high entropy becomes very inhomogeneous; the early universe was very low entropy. There can’t be “increased room for entropy,” because the universe is an isolated system with a fixed set of states. (Or so we are assuming.) Detailed discussion in Chapter 13.

    J– Time still passes, even if there is no arrow. You can measure the time, no problem; you just can’t say that one direction is “before” and the other “after.” Some particular time coordinate is no better than a reversed one.

  • J

    “Time still passes, even if there is no arrow”

    Gaaa… I’m obviously not asking the question in the right way (if I’m so far off the mark that the space required to answer is a course and not an answer, I won’t be offended if you tell me)

    But, damn it, “if time passes” it has to pass something, doesn’t it? I fully understand that I can flip a space-time diagram and come up with the same answer. That makes sense. What doesn’t make sense to me is the idea that just because I can flip a diagram, it reflects what really happens. We all throw out irrational roots when they fail to make a prediction that works and accept irrational roots when they make predictions that do work. Based on my interpretation though, if entropy is decreasing, before should be after and after should be before, not that we should all agree that we just can’t tell (if that were the case I’d have no problem understanding).

    Maybe it’s an “ideal observer” philosophical question. I really don’t care as long as the testable predictions work. Then again, based on your hypothesis, there wouldn’t be science in a decreasing-entropy universe. We would know the future. And maybe, that’s just why I can’t get my head around it.

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    I probably shouldn’t have said “time passes” — “time still exists and can be measured” would have been better. Without the arrow defined by increasing entropy, time is like space — you can measure the distance between two points, but there’s no universal notion of one being “first” and the other being “last.”

    That’s if there’s no arrow at all, corresponding to constant entropy. If entropy is decreasing everywhere in the same direction, we would always call the direction of increasing entropy “after.” Science would be perfectly okay, because before and after (and past and future) are defined by the direction of entropy increasing, not measured relative to it.

  • Clifford

    Is there an absolute measure of order? It seems to me that to measure entropy we have to always invent some arbitrary metric of orderliness which applies to the system at hand. Perhaps this is addressed in a later chapter?

    With both Lost and Caprica starting up, I vote to stick to the original plan of 1 chapter per week.

  • Jolyon Bloomfield

    Hi Sean (et al),

    Three points here, mostly tangentially related. Is Entropy a localised variable (ie, scalar field), or is it just a property of the system (such as “total charge”, or something like the ADM mass)? If we go to Special Relativity, how does Entropy transform under the various types of Poincare transformations available? Finally, you mention the entropy of gravity. I’ve come across this concept in a couple of conference talks, but from memory, both simply stated that “it’s hard to define”. Can you expand on this a bit?

    Thanks =)

    P.S. My vote’s for one chapter a week
    P.P.S. Say hi to Eanna if you see him =)

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Clifford– There is some arbitrariness, but it’s not completely arbitrary. Certain qualities are immediately macroscopically observable (temperature, pressure), while others are not. More in Chapter 8.

    Jolyon– Entropy isn’t generally localized, although there are some special circumstances when you can define an “entropy density.” Because it’s not local, it doesn’t transform as a tensor field; in flat spacetime you can define a total entropy associated with a spacelike hypersurface, and that would transform as the time component of a four-vector. (Much like the momentum 4-vector.)

    And yes, we don’t have a general understanding of the space of states when gravity is involved, so we don’t have a reliable formula for entropy. But there are special cases we do understand, like black holes and empty space. See Chapter 11.
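
For concreteness, these are the standard formulas lurking behind those statements, written as a sketch (the book’s conventions may differ). When a local entropy current $s^\mu$ does exist, the total entropy on a spacelike hypersurface $\Sigma$ with unit normal $n_\mu$ and induced metric determinant $h$ is

$$ S(\Sigma) = \int_\Sigma s^\mu n_\mu \, \sqrt{h}\, d^3x , $$

which is why it behaves like the time component of a four-vector rather than a scalar field. The best-understood gravitational case is the Bekenstein–Hawking entropy of a black hole with horizon area $A$,

$$ S_{\rm BH} = \frac{k_B c^3 A}{4 G \hbar} . $$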

  • Joe Shimpfky

    Hi Sean,

    I had some catching up to do as I just found your book over the weekend. I’m really enjoying it, thanks!

    I found the idea that the early universe was extremely low in entropy especially interesting. Now I was a humanities guy, so my knowledge of this stuff comes only from books on popular science. I’ve not seen other books that highlight low entropy as one of the early universe’s qualities. I’m not sure why; the idea seems on its face to be a deep and rich one!

    Now I tend to associate low entropy with order and structure, and high entropy with disorder and randomness. So seeing that connection highlighted, the vision that comes to mind is of an early universe that displayed elaborate structure. Am I mistaken on that?

    Best wishes!

  • Susan

    Your examples are great, but I am doing my own to learn better. So I have a sun-heated rock in a bowl of cold water on my kitchen counter. If I leave it alone, later everything will be at more or less the same temperature. If I measure at intervals, and do the math, the numbers go up, and can’t go down, and entropy goes up. So now I decide I don’t like how this is going at all. So I put the rock in my gas oven, microwave the bowl of water, turn up the room heat, get out the hair dryer, and start again. Still the entropy will go up overall. The energy from the burning gas, microwaves and electricity all have to be considered. Plus I am burning a lot of calories waving the hair dryer and moving things around. ( I wish! ). Entropy is going up all the time, up until I throw the rock back out in the yard, and let it affect the entropy out there. Now the temperature, measuring and math are probably the 2nd law. The system has had an object added to it and is definitely becoming more disorderly. Both ideas are increasing entropy, so it comes to the same thing, and I can move on to chapter 3. ( If this isn’t mostly right, I will be happy to read 2 over ). NOW it seems to me that the first rock and bowl group is the “closed system”, and the larger messy group should be the “open system”, but that doesn’t seem to say that, as I am reading it. So for clarity, which system is open and which is closed?

  • Tim van Beek

    Hi there,
    just discovered this thread and have not caught up yet, so: sorry if I repeat something that has already been said.
    Right now I’m reading the “spacetime” chapter (p.74), but I have already found enough sentences that I would like to see in every textbook on GR or thermodynamics; maybe we could collect them on a web page dedicated to this purpose?
    Example: p.50, “The correct deduction is not that general relativity predicts a singularity, but that general relativity predicts that the universe evolves into a configuration where general relativity itself breaks down”.
    Of course we can accept that GR hints at something like black holes, and that there is convincing – indirect – evidence for the existence of entities like black holes from cosmological observation, and that it is a valid topic to discuss – but that is often confused with the statement that GR “predicts” black holes.

    Now to a question: Sean, you mention that the concept of free will is connected to the arrow of time. Right now I do not know how to make sense of that. Is it somehow connected to this line of thought: “I have a free will; this is not a contradiction to the existence of an omniscient entity. If you offer me different kinds of ice-cream, I have the free will to choose chocolate or vanilla. After I chose, you know my choice, because now it is in your past. That is not a contradiction to my ability to choose freely. If a being exists to whom everything in my future lies in its past, it would already know of all my choices, despite the fact that I am, was and will be free to choose.”
    Is that somehow connected to what you have in mind? Is it explained in more detail in later chapters of your book?

  • kim

    I vote for 1 chapter a week.

  • rww

    I’m deeply confused about why a smooth distribution of particles in the early universe is low entropy but a smooth distribution in the late universe is high. It doesn’t seem right that the difference is the distance between the particles; that just boils down to how long it will take gravity or chance to affect the distribution, no? And if it is a matter of “how far” and “how long” doesn’t that assume some preferred scale?

    As a poster said above, I won’t be offended to learn that I have missed the whole point. I’m once through the book (obviously too quickly) and look forward to reading it a second time in step with the blog.

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Joe– It’s misleading to associate “low entropy” with “structure.” It’s true that we often associate low entropy with order and high entropy with disorder, but that’s a casual gloss that doesn’t stand up to closer scrutiny. Low entropy just means unlikely, even if the configuration is extremely simple — like all the air in a room squeezed into a single cubic centimeter. The early universe is in a very simple and structureless configuration, even if it is low entropy.

    Susan– I think you’re on the right track. There is no universal choice of “open system” and “closed system”; a closed system is just one that is isolated from the outside world, so we can always turn a closed system into an open system by bringing it into contact with something else.

    Tim– Let’s keep the Chapter 3 stuff for next week. And yes, the free will stuff is explained a bit more later, especially in Chapter 9. But the basic point is simple: according to the underlying laws of physics, the past and future are determined by the present. But we don’t know enough about the present to actually do a very precise prediction or retrodiction. For the past, however, we also have access to a low-entropy initial condition, which greatly restricts the space of possible things that could have happened. In the future there is no such boundary condition, so things are much more wide open; that’s what gives us the feeling that the past is settled while the future is still to be decided.

    rww– It’s not that a smooth distribution is high-entropy in the late universe, it’s that a smooth distribution is high-entropy when gravity can be neglected. That’s certainly not the case in the early universe. Think of it this way: if a universe like ours were to contract rather than expand, we would not expect it to smooth out along the way. It would get lumpier as it contracted, entropy increasing all along the way. It’s only once we get to the very late universe, when everything has fallen into black holes which then begin to evaporate away, that the universe smooths out again.
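
To put a number on “low entropy just means unlikely” from Sean’s reply to Joe above, here is a back-of-the-envelope sketch with deliberately round, made-up numbers: the chance that N independently moving gas molecules all happen to be found in a sub-volume that is a fraction f of the container is roughly f^N.

    import math

    # Probability that N independent molecules all sit in a fraction f of the
    # container: p = f**N. Work with log10(p) to avoid numerical underflow.
    def log10_probability(n_molecules, volume_fraction):
        return n_molecules * math.log10(volume_fraction)

    print(log10_probability(100, 0.5))    # ~ -30: even 100 molecules won't crowd into half the box
    print(log10_probability(1e25, 1e-6))  # ~ -6e25: a cubic metre of air squeezed into one cubic centimetre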

  • Susan

    I found my question above was from chapter one, oops. I did more investigating since I posted it, and found several places that said the universe is the only truly closed system. Maybe by the end of the book I will be thinking that this idea might not be such a sure statement, we shall see. Thanks.

  • J

    “because before and after (and past and future) are defined by the direction of entropy increasing, not measured relative to it.”

    Defined. Now it makes sense to me. Thanks! I really found that to be the toughest concept in the whole book.

  • rww

    re #44: Is it then the durability of the smoothness despite the influence of gravity that marks it as low entropy in the early universe?

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    rww– Not really; it’s the smoothness itself, not its durability. The Second Law says that entropy increases, but doesn’t tell us how fast; the rate of increase is a complicated thing that depends on circumstances. What marks the smooth early universe as low-entropy is its instability. A smooth configuration will become non-smooth under the influence of gravity, while a non-smooth configuration isn’t going to smooth itself out.

  • rww

    Thanks Sean, that did it.

  • CW

    A chapter a week is fine with me, but if there are any shorter chapters coming up or consecutive chapters that have a lot of dependency on each other – it might be beneficial to discuss both at once? Maybe you can look ahead and give us a few days’ notice if this occurs? Right now, I’m reading two books – and I am pacing myself with the book club because I want to see if I can adapt to it. But, the book club’s Q&A is sort of beckoning to me, making me want to start plowing into the book some more, right now!

    Entropy is sort of challenging to comprehend. I get those “ah-ha” moments and “wait, what?” moments on occasion. As we get further into the book, there may be times that you have to refer us back to the first few chapters or get all Entropy 101 up in here, in the club’s Q&A.

  • http://www.pipelin.com/~lenornst/index.html Leonard Ornstein

    Sean:

    In 48., how does gravity cause a ‘small’, relatively-homogeneous, expanding, PRE-INFLATION, low-entropy universe to become “non-smooth”?

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    Leonard– Gravity pulls things together, causing slightly overdense regions to become more overdense. Of course inflation can dissipate such overdensities, but the conditions required to start inflation are extremely low entropy to begin with — we’ll discuss in detail in Chapter 14.

  • http://andreas.com Andreas

    Try this: You can see the scale of things, from quarks to galaxies. Use the right-left arrow keys to navigate. http://www.newgrounds.com/portal/view/525347

  • Susan

    I have been peeking ahead a bit, and chapter three is very interesting, and four looks like a WOW. I have been trying to figure out what it is about your teaching ( book and lectures ) that I like so much. I think it is the way you treat the very simple ideas and the complex ideas all with the same weight. All are facts that relate to the subject. It makes whatever amount of learning you do count. That is nice.

    I am understanding the book quite well. Still, when I finish it, read some other related materials, wait for a while, then read it again, I will understand more.

    My friends are amazed that I can ask a question about the book, and get an answer back from someone with such mind-boggling education and knowledge.

    They are impressed when I give them a sort of “word of the day”, which I try to put into some sentence, in a casual way. Words like dynamical, retrodiction etc. These are smart people, nurses mostly. Most have very little interest in science, and see learning about the universe etc. as too hard. But I have helped a few to be more interested.

    Are there plans for a paperback ?

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    There is going to be a paperback, yes.

  • rww

    Pp 31-32 you have this nice thought about directions in space being attributable to our proximity to a gravitating body and our direction in time to “living in the temporal vicinity of ” the big bang.

    Doesn’t that suggest that time itself, like space itself, actually has no direction?

    Or maybe, more generally, does it serve any purpose to treat entropy and time as distinct concepts?

  • rww

    I’m guessing that’s the whole point — that time has no intrinsic direction. Better late than never.

  • Corey

    Given that entropy is a measure of disorder, it seems counter-intuitive in my everyday world that entropy always increases. I’m not disputing the point at all; I just found this interesting. Certainly the key is to think of the entire system – not just the low entropy intact egg, but also all of the energy and matter used to produce the egg. The egg itself, it seems, would be lower entropy than the previous state of the matter from which it was formed. Is this line of thought correct?

    I also found myself thinking hard about the statement that there are more ways for a system to become higher entropy than there are for it to become lower entropy. This may also be true, but a system cannot evolve in all the ways that are possible; the laws of physics must be followed. A few billiard balls bouncing off each other in space will not become higher entropy as a function of time. What is the boundary condition for complexity before a system must increase in entropy? What is inherently different in a complex system? The only thing I can think of proposing is the fact that the position and/or momentum of objects can be expressed as probability waves and only when that uncertainty can influence the evolution of the system does it matter that there are more ways a system can become higher entropy than lower entropy.

    Chiming in at the end of the Chapter 2 week, I clearly prefer just one chapter per week.
