Guest Post: Tom Banks on Probability and Quantum Mechanics

By Sean Carroll | November 16, 2011 3:03 pm

The lure of blogging is strong. Having guest-posted about problems with eternal inflation, Tom Banks couldn’t resist coming back for more punishment. Here he tackles a venerable problem: the interpretation of quantum mechanics. Tom argues that the measurement problem in QM becomes a lot easier to understand once we appreciate that even classical mechanics allows for non-commuting observables. In that sense, quantum mechanics is “inevitable”; it’s actually classical physics that is somewhat unusual. If we just take QM seriously as a theory that predicts the probability of different measurement outcomes, all is well.

Tom’s last post was “technical” in the sense that it dug deeply into speculative ideas at the cutting edge of research. This one is technical in a different sense: the concepts are presented at a level that second-year undergraduate physics majors should have no trouble following, but there are explicit equations that might make it rough going for anyone without at least that much background. The translation from LaTeX to WordPress is a bit kludgy; here is a more elegant-looking pdf version if you’d prefer to read that.

—————————————-

Rabbi Eliezer ben Yaakov of Nahariya said in the 6th century, “He who has not said three things to his students, has not conveyed the true essence of quantum mechanics. And these are Probability, Intrinsic Probability, and Peculiar Probability”.

Probability first entered the teachings of men through the work of that dissolute gambler Pascal, who was willing to make a bet on his salvation. It was a way of quantifying risk in the face of uncertainty. Implicit in Pascal's thinking, and in that of all who came after him, was the idea that there was a certainty, even a predictability, but that we fallible humans may not always have enough data to make the correct predictions. This implicit assumption is completely unnecessary, and the mathematical theory of probability makes use of it only through one crucial assumption, which turns out to be wrong in principle but right in practice for many actual events in the real world.

For simplicity, assume that there are only a finite number of things that one can measure, in order to avoid too much math. List the possible measurements as a sequence

$latex A = \left( \begin{array}{ccc} a_1 & \ldots & a_N \end{array} \right). $
The a_i are the quantities being measured and each could have a finite number of values. Then a probability distribution assigns a number P(A) between zero and one to each possible outcome. The sum of the numbers has to add up to one. The so-called frequentist interpretation of these numbers is that if we did the same measurement a large number of times, then the fraction of times or frequency with which we'd find a particular result would approach the probability of that result in the limit of an infinite number of trials. It is mathematically rigorous, but only a fantasy in the real world, where we have no idea whether we have an infinite amount of time to do the experiments. The other interpretation, often called Bayesian, is that probability gives a best guess at what the answer will be in any given trial. It tells you how to bet. This is how the concept is used by most working scientists. You do a few experiments and see how the finite distribution of results compares to the probabilities, and then assign a confidence level to the conclusion that a particular theory of the data is correct. Even in flipping a completely fair coin, it's possible to get a million heads in a row. If that happens, you're pretty sure the coin is weighted but you can't know for sure.
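If you like, you can see this tension between frequencies and probabilities in a minimal simulation sketch (Python with numpy assumed; the bias of 0.5 and the trial counts are arbitrary choices): the observed frequency of heads drifts toward the underlying probability as the number of trials grows, but no finite run ever settles the matter with certainty.

    import numpy as np

    rng = np.random.default_rng(0)
    p_heads = 0.5                          # the "true" bias we pretend not to know

    for n_trials in (10, 1000, 100000):
        flips = rng.random(n_trials) < p_heads
        print(n_trials, flips.mean())      # observed frequency of heads

    # Even for a fair coin, a million heads in a row has probability 2**(-1000000):
    # absurdly unlikely, never impossible, so any conclusion carries a confidence
    # level rather than a certainty.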

Physical theories are often couched in the form of equations for the time evolution of the probability distribution, even in classical physics. One introduces "random forces" into Newton's equations to "approximate the effect of the deterministic motion of parts of the system we don't observe". The classic example is the Brownian motion of particles we see under the microscope, where we think of the random forces in the equations as coming from collisions with the atoms in the fluid in which the particles are suspended. However, there's no a priori reason why these equations couldn't be the fundamental laws of nature. Determinism is a philosophical stance, an hypothesis about the way the world works, which has to be subjected to experiment just like anything else. Anyone who's listened to a Geiger counter will recognize that the microscopic process of decay of radioactive nuclei doesn't seem very deterministic.

The place where the deterministic hypothesis and the laws of classical logic are put into the theory of probability is through the rule for combining probabilities of independent alternatives. A classic example is shooting particles through a pair of slits. One says, “the particle had to go through slit A or slit B and the probabilities are independent of each other, so,

$latex P(A\ {\rm or}\ B ) = P(A) + P(B)".$
It seems so obvious, but it's wrong, as we'll see below. The probability sum rule, as the previous equation is called, allows us to define conditional probabilities. This is best understood through the example of Hurricane Katrina. The equations used by weather forecasters are probabilistic in nature. Long before Katrina made landfall, they predicted a probability that it would hit either New Orleans or Galveston. These are, more or less, mutually exclusive alternatives. Because these weather probabilities, at least approximately, obey the sum rule, we can conclude that the prediction for what happens after we make the observation of people suffering in the Superdome doesn't depend on the fact that Katrina could have hit Galveston. That is, that observation allows us to set the probability that it could have hit Galveston to zero, and re-scale all other probabilities by a common factor so that the probability of hitting New Orleans was one.
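The conditioning step itself is nothing exotic. Here is a toy sketch in Python of the rule just described (the forecast numbers are invented for illustration): zero out the alternatives excluded by the observation and rescale what remains so the total is again one.

    # Made-up forecast probabilities for where the storm makes landfall.
    landfall = {"New Orleans": 0.6, "Galveston": 0.3, "open water": 0.1}

    def condition_on(dist, observed):
        """Zero out alternatives ruled out by the observation, then renormalize."""
        kept = {k: (p if k == observed else 0.0) for k, p in dist.items()}
        total = sum(kept.values())
        return {k: p / total for k, p in kept.items()}

    print(condition_on(landfall, "New Orleans"))
    # {'New Orleans': 1.0, 'Galveston': 0.0, 'open water': 0.0}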

Note that if we think of the probability function P(x,t) for the hurricane to hit a point x and time t to be a physical field, then this procedure seems non-local or a-causal. The field changes instantaneously to zero at Galveston as soon as we make a measurement in New Orleans. Furthermore, our procedure “violates the weather equations”. Weather evolution seems to have two kinds of dynamics. The deterministic, local, evolution of P(x,t) given by the equation, and the causality violating projection of the probability of Galveston to zero and rescaling of the probability of New Orleans to one, which is mysteriously caused by the measurement process. Recognizing P to be a probability, rather than a physical field, shows that these objections are silly.

Nothing in this discussion depends on whether we assume the weather equations are the fundamental laws of physics of an intrinsically uncertain world, or come from neglecting certain unmeasured degrees of freedom in a completely deterministic system.

The essence of QM is that it forces us to take an intrinsically probabilistic view of the world, and that it does so by discovering an unavoidable probability theory underlying the mathematics of classical logic. In order to describe this in the simplest possible way, I want to follow Feynman and ask you to think about a single ammonia molecule, NH3. A classical picture of this molecule is a pyramid with the nitrogen at the apex and the three hydrogens forming an equilateral triangle at the base. Let’s imagine a situation in which the only relevant measurement we could make was whether the pyramid was pointing up or down along the z axis. We can ask one question Q, “Is the pyramid pointing up?” and the molecule has two states in which the answer is either yes or no. Following Boole, we can assign these two states the numerical values 1 and 0 for Q, and then the “contrary question” 1 − Q has the opposite truth values. Boole showed that all of the rules of classical logic could be encoded in an algebra of independent questions, satisfying

$latex Q_i Q_j = \delta_{ij} Q_j ,$
where the Kronecker symbol δ_{ij} = 1 if i = j and 0 otherwise. Here i, j run from 1 to N, the number of independent questions. We also have ∑ Q_i = 1, meaning that one and only one of the questions has the answer yes in any state of the system. Our ammonia molecule has only two independent questions, Q and 1 − Q. Let me also define s_z = 2Q − 1 = ±1, in the two different states. Computer aficionados will recognize our two question system as a bit.
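Nothing stops you from checking this algebra numerically. Here is a minimal sketch (Python, numpy assumed) that represents the two questions of the ammonia bit as two by two matrices and verifies the relations above.

    import numpy as np

    Q    = np.array([[1, 0], [0, 0]])       # "is the pyramid pointing up?"
    notQ = np.eye(2, dtype=int) - Q         # the contrary question 1 - Q
    s_z  = 2 * Q - np.eye(2, dtype=int)     # s_z = 2Q - 1 = diag(+1, -1)

    assert (Q @ Q == Q).all()               # Q_i Q_j = delta_ij Q_j: repeating a question changes nothing
    assert (Q @ notQ == 0).all()            # distinct independent questions multiply to zero
    assert (Q + notQ == np.eye(2)).all()    # exactly one question is answered yes
    print(s_z)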

We can relate this discussion of logic to our discussion of probability of measurements by introducing observables A = ∑ a_i Q_i, where the a_i are real numbers, specifying the value of some measurable quantity in the state where only Q_i has the answer yes. A probability distribution is then just a special case ρ = ∑ p_i Q_i, where p_i is non-negative for each i and ∑ p_i = 1.

Restricting attention to our ammonia molecule, we denote the two states as | ±z 〉 and summarize the algebra of questions by the equation

$latex s_z | \pm_z \rangle = \pm | \pm_z \rangle .$
We say that "the operator s_z acting on the states | ±z 〉 just multiplies them by (the appropriate) number". Similarly, if A = a_+ Q + a_- (1 − Q) then

$latex A | \pm_z \rangle = a_{\pm} | \pm_z \rangle .$
The expected value of the observable A^n in the probability distribution ρ is

$latex \rho_+ a_+^n + \rho_- a_-^n = {\rm Tr}\, \rho A^n .$
In the last equation we have used the fact that all of our “operators” can be thought of as two by two matrices acting on a two dimensional space of vectors whose basis elements are |±z 〉. The matrices can be multiplied by the usual rules and the trace of a matrix is just the sum of its diagonal elements. Our matrices are

$latex s_z = \left( \begin{array}{cc} 1 & 0 \cr 0 & -1 \end{array} \right),$
$latex A = \left( \begin{array}{cc} a_+ & 0 \cr 0 & a_- \end{array} \right),$
$latex \rho = \left( \begin{array}{cc} \rho_+ & 0 \cr 0 & \rho_- \end{array} \right),$
$latex Q = \left( \begin{array}{cc} 1 & 0 \cr 0 & 0 \end{array} \right).$
They’re all diagonal, so it’s easy to multiply them.
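Here is a quick numerical check of the trace formula (Python, numpy assumed; the values of a_+, a_- and rho_+ are arbitrary):

    import numpy as np

    a_plus, a_minus = 3.0, -1.0
    rho_plus = 0.7
    A   = np.diag([a_plus, a_minus])
    rho = np.diag([rho_plus, 1.0 - rho_plus])

    for n in range(1, 5):
        lhs = np.trace(rho @ np.linalg.matrix_power(A, n))
        rhs = rho_plus * a_plus**n + (1.0 - rho_plus) * a_minus**n
        assert np.isclose(lhs, rhs)         # Tr(rho A^n) = rho_+ a_+^n + rho_- a_-^n
    print("trace formula reproduces the classical expectation values")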

So far all we’ve done is rewrite the simple logic of a single bit as a complicated set of matrix equations, but consider the operation of flipping the orientation of the molecule, which for nefarious purposes we’ll call sx,

$latex s_x | \pm \rangle = | \mp \rangle .$
This has matrix

$latex s_x = \left( \begin{array}{cc} 0 & 1 \cr 1 & 0 \end{array} \right).$
Note that s_z^2 = s_x^2 = 1, and s_x s_z = − s_z s_x = − i s_y, where the last equality is just a definition. This definition implies that s_y s_a = − s_a s_y, for a = x or a = z, and it follows that s_y^2 = 1. You can verify these equations by matrix multiplication, or by thinking about how the various operations act on the states (which I think is easier). Now consider for example the quantity B ≡ b_x s_x + b_z s_z. Then B^2 = b_x^2 + b_z^2, which suggests that B is a quantity which takes on the possible values ±√(b_x^2 + b_z^2). We can calculate

$latex {\rm Tr}\, \rho B^n ,$
for any choice of probability distribution. If n = 2k it’s just

$latex (b_x^2 + b_z^2)^k ,$
whereas if n = 2k + 1 it’s

$latex (b_x^2 + b_z^2)^k (p_+ b_z - p_- b_z) .$
This is exactly the same result we would get if we said that there was a probability P_+(B) for B to take on the value √(b_z^2 + b_x^2) and a probability P_−(B) = 1 − P_+(B) to take on the opposite value, if we choose

$latex P_+(B)\equiv \displaystyle{\frac{1}{2} \left(1 + \frac{(p_+ - p_-)b_z}{\sqrt{b_z^2 + b_x^2}}\right)}.$
The most remarkable thing about this formula is that even when we know the answer to Q with certainty (p+ = 1 or 0), B is still uncertain.
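Here is a sketch that checks this chain of claims numerically (Python, numpy assumed; b_x, b_z and p_+ are arbitrary choices): B^2 is a multiple of the identity, the moments Tr rho B^n are exactly those of a two-valued random variable, and even with p_+ = 1 the probability P_+(B) lands strictly between zero and one.

    import numpy as np

    s_z = np.array([[1., 0.], [0., -1.]])
    s_x = np.array([[0., 1.], [1., 0.]])
    assert np.allclose(s_x @ s_z, -(s_z @ s_x))        # the two operators anticommute

    b_x, b_z = 0.8, 0.6
    p_plus   = 1.0                                     # Q known with certainty
    rho = np.diag([p_plus, 1.0 - p_plus])

    B = b_x * s_x + b_z * s_z
    b = np.sqrt(b_x**2 + b_z**2)
    assert np.allclose(B @ B, b**2 * np.eye(2))        # B^2 = b_x^2 + b_z^2

    P_plus = 0.5 * (1 + (2 * p_plus - 1) * b_z / b)    # the formula for P_+(B)
    for n in range(1, 6):
        quantum   = np.trace(rho @ np.linalg.matrix_power(B, n))
        classical = P_plus * b**n + (1 - P_plus) * (-b)**n
        assert np.isclose(quantum, classical)          # same moments, hence same distribution

    print("with Q certain, B is still uncertain: P_+(B) =", P_plus)   # 0.8 for these numbers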

We can repeat this exercise with any linear combination b_x s_x + b_y s_y + b_z s_z. We find that in general, if we force one linear combination to be known with certainty, then all linear combinations c_x s_x + c_y s_y + c_z s_z whose vector (c_x, c_y, c_z) is not parallel to (b_x, b_y, b_z) are uncertain. Parallelism of the two coefficient vectors is precisely the condition guaranteeing that the two linear combinations commute as matrices.

Pursuing the mathematics of this further would lead us into the realm of eigenvalues of Hermitian matrices, complete ortho-normal bases and other esoterica. But the main point to remember is that any system we can think about in terms of classical logic inevitably contains an infinite set of variables in addition to the ones we initially took to be the maximal set of things that could be measured. When our original variables are known with certainty, these other variables are uncertain, but the mathematics gives us completely determined formulas for their probability distributions.

Another disturbing fact about the mathematical probability theory for incompatible observables that we've discovered is that it does NOT satisfy the probability sum rule. This is because, once we start thinking about incompatible observables, the notion of either this or that is not well defined. In fact we've seen that when we know "definitely for sure" that s_z is 1, the probability for B to take on its positive value could be any number between zero and one, depending on the ratio of b_z and b_x.

Thus QM contains questions that are neither independent nor dependent and the probability sum rule P(sz or B ) = P(sz) + P(B) does not make sense because the word or is undefined for non-commuting operators. As a consequence we cannot apply the conditional probability rule to general QM probability predictions. This appears to cause a problem when we make a measurement that seems to give a definite answer. We’ll explain below that the issue here is the meaning of the word measurement. It means the interaction of the system with macroscopic objects containing many atoms. One can show that conditional probability is a sensible notion, with incredible accuracy, for such objects, and this means that we can interpret QM for such objects as if it were a classical probability theory. The famous “collapse of the wave function” is nothing more than an application of the rules of conditional probability, to macroscopic objects, for which they apply.

The double slit experiment, famously discussed in the first chapter of Feynman's lectures on quantum mechanics, is another example of the failure of the probability sum rule. The question of which slit the particle goes through is a question about two alternative histories. In Newton's equations, a history is determined by an initial position and velocity, but Heisenberg's famous uncertainty relation is simply the statement that position and velocity are incompatible observables, which don't commute as matrices, just like s_z and s_x. So the statement that either one history or another happened does not make sense, because the two histories interfere.

Before leaving our little ammonia molecule, I want to tell you about one more remarkable fact, which has no bearing on the rest of the discussion, but shows the remarkable power of quantum mechanics. Way back at the top of this post, you could have asked me, "what if I wanted to orient the ammonia along the x axis or some other direction?" The answer is that the operator n_x s_x + n_y s_y + n_z s_z, where (n_x, n_y, n_z) is a unit vector, has definite values in precisely those states where the molecule is oriented along this unit vector. The whole quantum formalism of a single bit is invariant under 3 dimensional rotations. And who would have ever thought of that? (Pauli, that's who).
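You can check this last remark numerically as well. A small sketch (Python, numpy assumed; the random directions are arbitrary) verifies that for any unit vector (n_x, n_y, n_z) the operator n_x s_x + n_y s_y + n_z s_z has eigenvalues exactly ±1, so it is again a perfectly good bit, pointed along n.

    import numpy as np

    s_x = np.array([[0, 1], [1, 0]], dtype=complex)
    s_y = np.array([[0, -1j], [1j, 0]])
    s_z = np.array([[1, 0], [0, -1]], dtype=complex)

    rng = np.random.default_rng(1)
    for _ in range(5):
        n = rng.normal(size=3)
        n /= np.linalg.norm(n)                    # a random unit vector
        op = n[0] * s_x + n[1] * s_y + n[2] * s_z
        assert np.allclose(np.linalg.eigvalsh(op), [-1.0, 1.0])
    print("n.s has eigenvalues -1 and +1 in every direction")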

The fact that QM was implicit in classical physics was realized a few years after the invention of QM, in the 1930s, by Koopman. Koopman formulated ordinary classical mechanics as a special case of quantum mechanics, and in doing so introduced a whole set of new observables, which do not commute with the (commuting) position and momentum of a particle and are uncertain when the particle’s position and momentum are definitely known. The laws of classical mechanics give rise to equations for the probability distributions for all these other observables. So quantum mechanics is inescapable. The only question is whether nature is described by an evolution equation which leaves a certain complete set of observables certain for all time, and what those observables are in terms of things we actually measure. The answer is that ordinary positions and momenta are NOT simultaneously determined with certainty.

Which raises the question of why it took us so long to notice this, and why it's so hard for us to think about and accept. The answers to these questions also resolve "the problem of quantum measurement theory". The answer lies essentially in the definition of a macroscopic object. First of all it means something containing a large number N of microscopic constituents. Let me call them atoms, because that's what's relevant for most everyday objects. For even a very tiny piece of matter weighing about a thousandth of a gram, the number N ∼ 10^20. There are a few quantum states of the system per atom, let's say 10 to keep the numbers round. So the system has 10^(10^20) states. Now consider the motion of the center of mass of the system. The mass of the system is proportional to N, so Heisenberg's uncertainty relation tells us that the mutual uncertainty of the position and velocity of the system is of order 1/N. Most textbooks stop at this point and say this is small and so the center of mass behaves in a classical manner to a good approximation.

In fact, this misses the central point, which is that under most conditions, the system has of order 10^N different states, all of which have the same center of mass position and velocity (within the prescribed uncertainty). Furthermore the internal state of the system is changing rapidly on the time scale of the center of mass motion. When we compute the quantum interference terms between two approximately classical states of the center of mass coordinate, we have to take into account that the internal time evolution for those two states is likely to be completely different. The chance that it's the same is roughly 10^(-N), the chance that two states picked at random from the huge collection will be the same. It's fairly simple to show that the quantum interference terms, which violate the classical probability sum rule for the probabilities of different classical trajectories, are of order 10^(-N). This means that even if we could see the 1/N effects of uncertainty in the classical trajectory, we could model them by ordinary classical statistical mechanics, up to corrections of order 10^(-N).
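The scaling behind this suppression can be illustrated with a toy calculation (Python, numpy assumed; the dimension d here stands in for the astronomically large number of internal states, which no computer can reach). The interference term between two macroscopically distinct configurations involves the overlap of their internal states, and the squared overlap of two randomly chosen unit vectors in a d-dimensional space is of order 1/d.

    import numpy as np

    rng = np.random.default_rng(2)

    def random_state(d):
        v = rng.normal(size=d) + 1j * rng.normal(size=d)
        return v / np.linalg.norm(v)

    for d in (10, 1000, 100000):
        sq = [abs(np.vdot(random_state(d), random_state(d)))**2 for _ in range(20)]
        print(f"d = {d:>6}:  typical |<psi|phi>|^2 ~ {np.mean(sq):.1e}")   # roughly 1/d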

It's pretty hard to comprehend how small a number this is. As a decimal, it's a decimal point followed by 100 billion billion zeros and then a one. The current age of the universe is less than a billion billion seconds. So if you wrote one zero every hundredth of a second you couldn't write this number in the entire age of the universe. More relevant is the fact that in order to observe the quantum interference effects on the center of mass motion, we would have to do an experiment over a time period of order 10^N. I haven't written the units of time. The smallest unit of time is defined by Newton's constant, Planck's constant and the speed of light. It's 10^(-44) seconds. The age of the universe is about 10^61 of these Planck units. The difference between measuring the time in Planck times or ages of the universe is a shift from N = 10^20 to N = 10^20 − 60, and is completely in the noise of these estimates. Moreover, the quantum interference experiment we're proposing would have to keep the system completely isolated from the rest of the universe for these incredible lengths of time. Any coupling to the outside effectively increases the size of N by huge amounts.

Thus, for all purposes, even those of principle, we can treat quantum probabilities for even mildly macroscopic variables as if they were classical, and apply the rules of conditional probability. This is all we are doing when we "collapse the wave function" in a way that seems (to the untutored) to violate causality and the Schrodinger equation. The general line of reasoning outlined above is called the theory of decoherence. All physicists find it acceptable as an explanation of the reason for the practical success of classical mechanics for macroscopic objects. Some physicists find it inadequate as an explanation of the philosophical "paradoxes" of QM. I believe this is mostly due to their desire to avoid the notion of intrinsic probability, and attribute physical reality to the Schrodinger wave function. Curiously, many of these people think that they are following in the footsteps of Einstein's objections to QM. I am not a historian of science but my cursory reading of the evidence suggests that Einstein understood completely that there were no paradoxes in QM if the wave function was thought of merely as a device for computing probability. He objected to the contention of some in the Copenhagen crowd that the wave function was real and satisfied a deterministic equation and tried to show that that interpretation violated the principles of causality. It does, but the statistical treatment is the right one. Einstein was wrong only in insisting that God doesn't play dice.

Once we have understood these general arguments, both quantum measurement theory and our intuitive unease with QM are clarified. A measurement in QM is, as first proposed by von Neumann, simply the correlation of some microscopic observable, like the orientation of an ammonia molecule, with a macro-observable like a pointer on a dial. This can easily be achieved by normal unitary evolution. Once this correlation is made, quantum interference effects in further observation of the dial are exponentially suppressed, we can use the conditional probability rule, and all the mystery is removed.
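A miniature version of this von Neumann correlation step can be written down explicitly (Python, numpy assumed; here the "pointer" is a single extra qubit rather than a macroscopic dial, so this is a cartoon of the real thing): a unitary interaction correlates the pointer with the system, and once it has, the system's reduced density matrix has lost its off-diagonal interference terms.

    import numpy as np

    # System qubit in a superposition; pointer starts in its "ready" state |0>.
    alpha, beta = 0.6, 0.8                       # arbitrary amplitudes, alpha^2 + beta^2 = 1
    system  = np.array([alpha, beta])
    pointer = np.array([1.0, 0.0])
    psi = np.kron(system, pointer)               # joint state, ordering |system, pointer>

    # A unitary that correlates the pointer with the system (a CNOT): |s, 0> -> |s, s>.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)
    psi = CNOT @ psi

    # Reduced density matrix of the system, tracing out the pointer.
    rho_joint = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    rho_sys = np.einsum('ipjp->ij', rho_joint)
    print(rho_sys)    # diag(alpha^2, beta^2): the interference terms are gone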

It’s even easier to understand why humans don’t “get” QM. Our brains evolved according to selection pressures that involved only macroscopic objects like fruit, tigers and trees. We didn’t have to develop neural circuitry that had an intuitive feel for quantum interference phenomena, because there was no evolutionary advantage to doing so. Freeman Dyson once said that the book of the world might be written in Jabberwocky, a language that human beings were incapable of understanding. QM is not as bad as that. We CAN understand the language if we’re willing to do the math, and if we’re willing to put aside our intuitions about how the world must be, in the same way that we understand that our intuitions about how velocities add are only an approximation to the correct rules given by the Lorentz group. QM is worse, I think, because it says that logic, which our minds grasp as the basic, correct formulation of rules of thought, is wrong. This is why I’ve emphasized that once you formulate logic mathematically, QM is an obvious and inevitable consequence. Systems that obey the rules of ordinary logic are special QM systems where a particular choice among the infinite number of complementary QM observables remains sharp for all times, and we insist that those are the only variables we can measure. Viewed in this way, classical physics looks like a sleazy way of dodging the general rules. It achieves a more profound status only because it also emerges as an exponentially good approximation to the behavior of systems with a large number of constituents.

To summarize: All of the so-called non-locality and philosophical mystery of QM is really shared with any probabilistic system of equations and collapse of the wave function is nothing more than application of the conventional rule of conditional probabilities. It is a mistake to think of the wave function as a physical field, like the electromagnetic field. The peculiarity of QM lies in the fact that QM probabilities are intrinsic and not attributable to insufficiently precise measurement, and the fact that they do not obey the law of conditional probabilities. That law is based on the classical logical postulate of the law of the excluded middle. If something is definitely true, then all other independent questions are definitely false. We’ve seen that the mathematical framework for classical logic shows this principle to be erroneous. Even when we’ve specified the state of a system completely, by answering yes or no to every possible question in a compatible set, there are an infinite number of other questions one can ask of the same system, whose answer is only known probabilistically. The formalism predicts a very definite probability distribution for all of these other questions.

Many colleagues who understand everything I’ve said at least as well as I do, are still uncomfortable with the use of probability in fundamental equations. As far as I can tell, this unease comes from two different sources. The first is that the notion of “expectation” seems to imply an expecter, and most physicists are reluctant to put intelligent life forms into the definition of the basic laws of physics. We think of life as an emergent phenomenon, which can’t exist at the level of the microscopic equations. Certainly, our current picture of the very early universe precludes the existence of any form of organized life at that time, simply from considerations of thermodynamic equilibrium.

The frequentist approach to probability is an attempt to get around this. However, its insistence on infinite limits makes it vulnerable to the question about what one concludes about a coin that’s come up heads a million times. We know that’s a possible outcome even if the coin and the flipper are completely honest. Modern experimental physics deals with this problem every day both for intrinsically QM probabilities and those that arise from ordinary random and systematic fluctuations in the detector. The solution is not to claim that any result of measurement is definitely conclusive, but merely to assign a confidence level to each result. Human beings decide when the confidence level is high enough that we “believe” the result, and we keep an open mind about the possibility of coming to a different conclusion with more work. It may not be completely satisfactory from a philosophical point of view, but it seems to work pretty well.

The other kind of professional dissatisfaction with probability is, I think, rooted in Einstein’s prejudice that God doesn’t play dice. With all due respect, I think this is just a prejudice. In the 18th century, certain theoretical physicists conceived the idea that one could, in principle, measure everything there was to know about the universe at some fixed time, and then predict the future. This was wild hubris. Why should it be true? It’s remarkable that this idea worked as well as it did. When certain phenomena appeared to be random, we attributed that to the failure to make measurements that were complete and precise enough at the initial time. This led to the development of statistical mechanics, which was also wildly successful. Nonetheless, there was no real verification of the Laplacian principle of complete predictability. Indeed, when one enquires into the basic physics behind much of classical statistical mechanics one finds that some of the randomness invoked in that theory has a quantum mechanical origin. It arises after all from the motion of individual atoms. It’s no surprise that the first hints that classical mechanics was wrong came from failures of classical statistical mechanics like the Gibbs paradox of the entropy of mixing, and the black body radiation laws.

It seems to me that the introduction of basic randomness into the equations of physics is philosophically unobjectionable, especially once one has understood the inevitability of QM. And to those who find it objectionable all I can say is “It is what it is”. There isn’t anymore. All one must do is account for the successes of the apparently deterministic formalism of classical mechanics when applied to macroscopic bodies, and the theory of decoherence supplies that account.

Perhaps the most important lesson for physicists in all of this is not to mistake our equations for the world. Our equations are an algorithm for making predictions about the world and it turns out that those predictions can only be statistical. That this is so is demonstrated by the simple observation of a Geiger counter and by the demonstration by Bell and others that the statistical predictions of QM cannot be reproduced by a more classical statistical theory with hidden variables, unless we allow for grossly non-local interactions. Some investigators into the foundations of QM have concluded that we should expect to find evidence for this non-locality, or that QM has to be modified in some fundamental way. I think the evidence all goes in the other direction: QM is exactly correct and inevitable and "there are more things in heaven and earth than are conceived of in our naive classical philosophy". Of course, Hamlet was talking about ghosts…

  • http://mattleifer.info Matt Leifer

    I don’t think most people who are concerned about the measurement problem are concerned about determinism. Rather, they are concerned about realism, i.e. the idea that there is something out there that exists independently of observers. If you want to be a realist then it is not so easy to pass off the quantum state as something akin to a probability distribution. See this recent preprint:

    http://arxiv.org/abs/1111.3328

    for a simple demonstration of this.

  • phayes

    Good post. Apparently the “Gibbs Paradox” isn’t really a paradox and failure of classical stat. mech. though: http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf

  • ursus_maritimus

    Very enlightening post! Can Tom or someone else who grasps the math elucidate if and how the arguments espoused here pertain to quantum entanglement? Was Einstein wrong to consider such apparent non-locality 'spooky action at a distance,' as he was in stating that 'God does not play dice'?

  • http://www.mccaughan.org.uk/g/ g

    Rabbi Eliezer ben Yaakov = Eliezer Yudkowsky? (In which case: why Nahariya?) Or is there some actual 6th-century rabbi who said something about three things students ought to be taught? (In which case: who and what? Google doesn’t seem to know.)

  • http://arunsmusings.blogspot.com Arun

    Note that if we think of the probability function P(x,t) for the hurricane to hit a point x and time t to be a physical field, then this procedure seems non-local or a-causal. The field changes instantaneously to zero at Galveston as soon as we make a measurement in New Orleans. Furthermore, our procedure “violates the weather equations”. Weather evolution seems to have two kinds of dynamics. The deterministic, local, evolution of P(x,t) given by the equation, and the causality violating projection of the probability of Galveston to zero and rescaling of the probability of New Orleans to one, which is mysteriously caused by the measurement process.

    If you make the prediction on Sunday for where the hurricane might be on Wednesday, then your Wednesday measurement does this seemingly non-local or a-causal rescaling of probabilities. But if you make the prediction on Sunday 5 PM about where the hurricane might be at 5:01 PM, there is no non-locality or a-causality apparent when you make the measurement at 5:01 PM.

    “The most remarkable thing about this formula is that even when we know the answer to Q with certainty (p+ = 1 or 0), B is still uncertain.”

    Though B is uncertain, you can measure Q and B together, can you not? If Q is known with certainty, then measuring B first does not change Q, I suppose. In quantum mechanics, measuring B first will throw Q into a state of uncertainty, even if the initial state was prepared with a definite Q?

  • Tim Maudlin

    Unfortunately, the understanding of the interpretive difficulties of quantum mechanics in this article is incorrect, so the discussion does not touch the important issues. The measurement problem of standard quantum mechanics has nothing at all to do with the observability or unobservability of interference in macroscopic systems, so consideration of how decoherence can suppress interference does nothing to solve the problem. Anyone reading Schrödinger’s original “cat” paper can see that possible interference is not among the issues he discusses. He is rather concerned about how the cat ends up either alive or dead. If one wants to have the outcome be due to an irreducibly stochastic dynamics, fine: then explain clearly what the possible physical states of the system are and provide a stochastic evolution law for those states. Banks apparently agrees with the contention of the EPR paper that the wavefunction does not provide a complete physical description. Then what does? This is not asking for determinism: the dynamics can be as indeterministic as you like. It is asking for a clear physical theory.

    The discussion of non-locality and Bell is completely off-target. Einstein was not worried about indeterminism or "God playing dice": he was worried about the evident non-locality of the standard theory with collapse if one takes the wavefunction to be complete. Thus the title of the EPR paper. If one wants to avoid the non-locality, as EPR correctly argues, then the results of (e.g.) spin measurements must be predetermined by the local state of the particles measured. But any such predetermination, together with a prohibition on non-local effects, implies that the Bell inequality cannot be violated. What Bell proved is that no local theory can reproduce the predictions of QM. To suggest that the non-locality is merely an apparent effect of conditionalization shows that one has not understood Bell at all: see his discussion of Bertlmann's socks.

  • Chris

    “All physicists find it [decoherence] acceptable as an explanation of the reason for the practical success of classical mechanics for macroscopic objects”.

    Really? I’m sure I’ve seen objections to it.

    The unsatisfying philosophical aspect is that you leave macroscopic states still in a superposition, with no way of ever resolving it, whether or not they can interfere with each other.

    And, can’t you get at that superposition by making it have effects back in the microworld?
    Suppose I observe Schrodinger’s cat alive or dead (or count clicks from a Geiger counter, whatever). So according to decoherence theory that now puts my brain (and any part of the world my actions influence) into a superposition, but I can’t observe interference effects because the state vector of my brain has too many dimensions to ever get the two states to line up.
    So, now I walk across the room, and depending on how I saw the cat, close one or other of the slits in a two-slit experiment. Shouldn’t I now see an interference pattern on the screen, in violation of presumably everyone’s expectations? We can put the necessary laser in as fancy an isolating box as you need so that it isn’t affected by my thought processes or manipulation of the slits.

  • Chris

    I don’t get what the ammonia example is supposed to demonstrate at all:
    So, we take the system either definitely up or down (Q). Then we do some junk with matrices, which are all still completely defined, and then we define B in terms of coefficients b_x and b_z that we just made up. Then we act surprised that the value we get for B depends on the coefficients we put in?

    I mean this is how classical mechanics works in that if you take a known number and multiply it by random factors you get a random number. But it’s nothing to do with how the quantum mechanics of spin works (there s_x and s_y really do represent the two other directions and the issue is that they aren’t allowed to take on the intermediate values you’d want when you observe them).

  • Marty

    Tom,

    Am hopeful you can clarify/explain a couple of things.

    As you point out, a small macroscopic system of N atoms has order 10^N relevant states, assuming ~10 relevant states per atom. Let’s say this system is a measuring device. Now assume you have prepared a pure superposition state involving at most a few particles in an ultra-cold environment, carefully designing the experiment to isolate the prepared superposition from interference with the apparatus and the rest of the environment.

    Finally, you make a measurement to resolve the superposition, i.e., induce a “wave function collapse” in the usual interpretation. It seems you argue there is nothing like a wave function collapse, just an interaction with the macroscopic system of order 10^N states that behaves classically due to exponential suppression of quantum interference effects. (This last statement is probably a bit of an oversimplification of your view, but hopefully it captures the spirit.)

    Now for the questions. Chris (#7) mentions some already, for which I look forward to seeing answers. A couple of others are below, following their context.

    1. Surely the measuring device (and other parts of the apparatus, especially the part that prepared the superposition in the first place) must in some sense be considered part of the same quantum system as the superposition undergoing measurement; otherwise, how could the measuring device interact with it in a way consistent with quantum mechanics? But the measuring device, superposition-preparation subsystem, and superposition undergoing measurement must largely factor, or else there is no separation between “measurer” and “measured” that would make a “measurement” mean something.

    The question: In the decoherence picture, how does the essentially classical apparatus (so deemed because it has “gobs” of atoms) produce a quantum mechanical superposition in a very predictable manner, i.e., through progression from the classical to quantum? Especially, is the exponential suppression of quantum interference effects at all relevant here? (I’m struggling to phrase this question clearly, so I’ll understand if you don’t know what I’m getting at.)

    2. For at least some kinds of measurements, the interaction with the apparatus will need to be highly localized, maybe even to a single atom initially (but amplified by further interactions with neighboring atoms, e.g., by cascading effects, until the “measurement event” becomes macroscopically detectable). I’m not clear how such a localized interaction (e.g., with one to tens of atoms) can take advantage of the 10^N states of the measuring device to avoid a wave function collapse picture. For example, I think cascading effects can be viewed classically, e.g., one electron knocks an electron free from multiple atoms after being accelerated in a classical electric field, so a classical picture would seem to arise long before anywhere close to N atoms are involved.

    The questions: At what point does decoherence become prominent when initial interaction with the measuring device is highly localized? How does the wave function of the superposition evolve during this time if it doesn’t “collapse” after the interaction, and is this evolution reversible in some sense (as opposed to collapse, which would not be reversible)?

  • http://www.scottaaronson.com Scott Aaronson

    Tim Maudlin #6: The view that I take Banks to be defending here is actually one I've found extremely common among physicists, so maybe it would be worth philosophers trying to understand it sympathetically and seeing how much sense they can make of it. I like to think of this view as "Many Worlds minus the Many Worlds" — i.e., many worlds without calling it that, or even acknowledging a need to discuss that apparent implication of what you're saying. On the one hand, you view a measurement as just an ordinary, unitary physical interaction, albeit one that "looks and smells measurement-like"—i.e., that exponentially suppresses the off-diagonal terms in a suitable density matrix, because of decoherence theory. On the other hand, you view the reduction of the state vector as completely analogous to ordinary Bayesian conditioning. What are you conditioning on, in this case? Well, presumably, which block of the now-block-diagonal density matrix you're now "in"! So basically, you get to play a double game: treating the state vector "realistically" for the purpose of understanding unitary evolution (including the entanglement of the system and measuring device that causes decoherence), but then "ontologically" for the final step of the Born rule and state-vector reduction. The gap—i.e., the obvious disanalogies between what we're doing now and ordinary Bayesian conditioning—is bridged over by
    (1) stressing just how drastically the macroscopic interference terms are suppressed, and therefore how unlikely it is that we’ll ever run into problems in practice, and
    (2) saying “well, this is quantum mechanics, a perfectly-natural non-commutative generalization of ordinary Bayesian probability theory. If you find it unintuitive, then the problem is with your intuition.”

    Chris #7: Alas, your proposed experiment doesn’t work (which is a pity, because otherwise we could presumably use it to distinguish different interpretations of quantum mechanics!). If your brain is in a different state depending on which measurement outcome you saw, that will already be enough to prevent an interference pattern, according to the standard rules of QM. (Nothing special about your brain here: everything else in the entire universe needs to be the same in states S1 and S2 in order to observe interference between them. That’s why building a quantum computer and other quantum-mechanical experiments are so hard!)

    OXO #9: Rabbi Mordecai ben Aharon of Mitzpe Ramon told his disciples, "he who cannot skim past a fizzled, groan-inducing introductory joke to get to the meat, will not get far in his studies of quantum physics."

  • Michael Bacon

    I’m sympathetic to Tim Maudlin’s complaints about the post, but have a somewhat different perspective.

    Banks seems to say that physicists start to go astray when they try to “attribute physical reality to the Schrodinger wave function.” Tim might agree with this aspect of Tom’s post. I believe the opposite is true.

    A new paper entitled "The quantum state cannot be interpreted statistically" (See: http://arxiv.org/PS_cache/arxiv/pdf/1111/1111.3328v1.pdf), clearly highlights this issue.

    The authors conclude that given only very mild assumptions, the statistical interpretation of the quantum state is inconsistent with the predictions of quantum theory. This result holds even in the presence of small amounts of experimental noise, and is therefore amenable to experimental test using present or near-future technology. If the predictions of quantum theory are confirmed, such a test would show that distinct quantum states must correspond to physically distinct states of reality.

  • Jeff

    Nice read but what about entanglement though? In my textbook’s interpretation, when I measure a particle, of an entangled pair, the wavefunction of the other collapses too, as if I had measured it. I can’t control what it collapses to, but surely it must collapse? (To ensure the correlations seen in Bell like experiments I mean). I don’t see how decoherence theory and non-local Bayesian conditioning fit into this picture (where a wavefunction really collapsing makes the experiment much easier to understand).

    In spite of my question/objection I really enjoyed the article, thanks!

  • David Brown

    “… the introduction of basic randomness into the equations of physics is philosophically unobjectionable …” Can quantum theory explain the vacuum catastrophe and the space roar? It is possible for an electron to tunnel through a potential barrier, but can such a paradoxical electron carry a kinetic energy that is arbitrarily large? Consider Wolfram’s Cosmological Principle: The maximum physical wavelength is the Planck length times the Fredkin-Wolfram constant.
    http://en.wikipedia.org/wiki/A_New_Kind_of_Science
    Is there a final verdict on quantum theory versus Wolfram’s “A New Kind of Science”?

  • tom banks

    I’m interested in the claim that the statistical interpretation of quantum theory is inconsistent with the quantum theory, but haven’t read that paper. If you’ll excuse my making a skeptical comment without reading it:

    When I was taught QM and when I teach it, I learned/teach that the mathematical content of the theory is a set of “expectation values”. That is, all the theory defines is a set of probability amplitudes and Born’s rule tells us how to convert those complex numbers into a set of probabilities for measuring something in a certain experiment. My experimental colleagues compare these probabilistic predictions to repeated measurements. Sometimes the quantum predictions end up being “certain” with a great degree of accuracy, as for example when we predict the specific heat of some material. That’s a macroscopic property and we don’t really need to do repeated measurements to get it right, beyond eliminating experimental errors.

    But the basic predictions of QM are probabilities, so I don’t understand how anything but a statistical interpretation of it is possible.

    I have a certain amount of sympathy for the person who said I want to have the many worlds interpretation without the many worlds, but I would put it a little differently: the many worlds interpretation puts an unnecessary and in my view wrong philosophical framework over these questions by insisting that the wave function is a real thing. I claim it’s no more real than ANY probability distribution.

    I think, at base, the problem is that the notion of reality that some writers like to cling to is a false one. It’s a concept that sits inside human brains, based on their experience with/construction by macroscopic objects, where everything definitely happens or doesn’t happen.

    As Dyson said, there is no earthly reason for the underlying rules of the world to “make sense” to us, because we’re a very special kind of system in the world, which was conditioned to perform certain tasks related to survival. For me, reality is nothing but the set of results of experiments that some macroscopic object can perform at some time in the history of the universe.
    Those are the only things that obey, or can be expected to obey, the rules of logic that our brains understand “intuitively” (sometimes with a lot of work). A physical theory is a mathematical algorithm, invented by humans, to make predictions about future events given some measurements of present events (there are quantum gravity issues about what past and future and time itself mean, but IMHO, these issues about the interpretation of QM can be discussed without getting into that). It has to satisfy rules that are not just opinions but things that anyone with enough mathematical sophistication can work through by themselves and come to the same conclusions as anyone else doing it. And it has to agree with experiment, with the maximum precision you can do the experiment.
    That’s all it has to do.

    I can see that I’m not going to be able to keep up with all the comments here and answer even those that are phrased politely. Let me just comment on this last one, about the entangled pair. QM makes a prediction like:
    the state of the pair is (a |+ -> + b |- +>) |N>. The meaning of this state is that there's a probability amplitude for either particle to have spin up, but the spins are anti-correlated with probability one. The second factor in this wave function is the "ready to measure" state of the apparatus. It's actually a huge collection of states, with N labeling just one average value like the position of a macroscopic needle. When I make a measurement, the wave function becomes
    (a |+ -> | N_+ > + b |- +> | N_- > ) where |N_+ > is one of the many states with the needle UP and | N_- > one of the many states with the needle down. The microstates with the needle up or down don't even have the same exact energies, and we're certainly not in an energy eigenstate. The microstate is changing rapidly during the course of the experiment. I do the measurement locally at the position of particle 2 on the planet earth, and I've sent particle 1 to the (formerly known as) planet Pluto. The new wave function says "there's probability |a|^2 for the needle to be up and |b|^2 for the needle to be down, and the interference between these two possibilities is so small that you couldn't measure it over a time so long that it doesn't matter whether you quote that time in Planck times or ages of the universe". That's all it says. So, as with any probabilistic theory, you do the same thing many times and compare the frequencies to the predicted probabilities, and after doing that enough times you become convinced that the predictions are correct with some confidence level. Look at any paper on experimental physics. That's how you'll see the results quoted.

    There’s nothing non-local here. You set up the correlation a long time ago, and waited for particle 1 to get to Pluto. That journey was carried out at a velocity less than or equal to the velocity of light. You assure me that you arranged the particles to be absolutely anti-correlated in spin when you sent them on their way and that nothing has intervened in the meantime to flip the spin of the particle I don’t have in my lab, relative to the one that I do. All of these assurances are implicit in the statement that “the state of the system before it interacts with the apparatus is thus and such”. So my conclusion that the spin on Pluto is up whenever I measure the spin on earth to be down, is not based on spooky action at a distance, but on things you’ve told me that you’ve done in (causally) setting up the experiment.

  • Mark

    The old Copenhagen interpretation of quantum mechanics wasn’t just philosophically incomplete, but it was also physically incomplete: There was supposedly an ill-defined “Heisenberg cut” that divided classical systems from quantum systems, so the world was somehow described by two different frameworks without a well-defined line dividing them.

    But nowadays, with density matrices and decoherence, there are no systems (maybe apart from questions in cosmology) that we cannot in principle treat quantum-mechanically. For macroscopic systems it becomes intractable, of course, but there are no more Heisenberg cuts. We can see why systems start behaving more classically as they get bigger.

    So quantum mechanics, as written in terms of density matrices and decoherence, is at least physically complete, if not philosophically complete. That makes the problem of interpretation less exciting for many physicists.

    But let’s probe the philosophical incompleteness question for a moment. In classical mechanics, one can say that a system described by a given probability distribution is actually occupying one of its possible states — we just don’t know which. But we can say the same thing for a quantum mechanical system if we insist: We can always say that it is secretly occupying the state labeled by one of its density-matrix eigenvectors. (And, apart from physically unachievable measure-zero scenarios, there are no true degeneracies in any density matrix, so its eigenbasis and thus its list of eigenvectors is unique.)

    Sure, sometimes making this insistence permits superluminal changes in a particle’s state, as in certain experiments involving EPR pairs. But superluminality that doesn’t transmit observable information — and quantum mechanics elegantly satisfies this condition — does not violate causality or special relativity. There are lots of such phenomena already, from phase velocities to even the Higgs when it first goes tachyonic in the early universe before condensing to its present-day vacuum expectation value. Indeed, a beautiful way to understand the necessity of antiparticles (see Weinberg’s book on Gravitation) is that the Heisenberg uncertainty principle permits particles to jump occasionally outside their light cones — in an unpredictable and therefore information-empty way — which must then be seen by other Lorentz observers as antiparticles.

    Also, sometimes it's possible to imagine that some sufficiently big system — the whole universe? — remains always in a pure state, even though every human being inside has a nice, mixed density matrix. (That presumes that there is a well-defined, closed system that we can call the universe — still an open question.) But a person is not the universe, so we're not speaking about the same system anymore. Taking seriously the ontological status of one of a system's density-matrix eigenvectors requires letting go of the classical idea that systems and their sub- and supersystems must all be in definite states that are classically consistent with each other. But so what? I can still say that any given system is definitely in one state or another, our logical prejudices aside.

    So realism is possible, as long as your definition of a state is something labeled by an eigenvector of a density matrix, and as long as you don’t care about superluminal effects that never transmit any actual information, information being defined according to the precise quantum Shannon entropy formula.

    There’s a thorough discussion of these points in an earlier Cosmic Variance comments section — see http://blogs.discovermagazine.com/cosmicvariance/2010/12/12/interview-on-static-limit/ .

  • Carl

    #6 “What Bell proved is that no local theory can reproduce the predictions of QM.”

    Not quite, but this is a common misconception. What Bell (and successors) proved is that no local, realist theory is consistent with QM. That is, you either have to give up on locality or realism (or both). It’s a subtle, important, and often neglected distinction.

    Bell himself said that DeBroglie-Bohm theory, which forsakes locality for the sake of realism, was consistent with his theorem and he didn't understand why it didn't get taken more seriously.

  • http://www.scottaaronson.com Scott Aaronson

    David Brown #13:

    Is there a final verdict on quantum theory versus Wolfram’s “A New Kind of Science”?

    I would say the verdict was already in long before ANKOS hit the printing presses! ;-)

    For more details, see a review of ANKOS that I wrote back in 2002. There I examine Wolfram’s “long-range thread” idea to account for the Bell inequality violations in classical cellular automata. I show that, even after we’ve given up on locality, the thread idea can’t be made compatible with Wolfram’s own requirements of determinism and relativistic invariance. (A similar argument would be made famous a few years later by John Conway and Simon Kochen, under the catchy name “The Free-Will Theorem.”)

  • Michael Bacon

    Carl #16,

    “What Bell (and successors) proved is that no local, realist theory is consistent with QM. That is, you either have to give up on locality or realism (or both).”

    Not quite, but this is a common misconception. ;)

    What Bell proved is that assuming that every possible measurement – even if not performed – would have yielded a single definite result, then no local, realist theory is consistent with QM.

    This assumption is called contra-factual definiteness or CFD. What Bell really proved was that every quantum theory must either violate locality or CFD. The many-worlds theory, with its multiplicity of results in different worlds, violates CFD and thus is both local and “realistic”.

  • Carl

    Tom, this is as good or better an explanation as I’ve read (as in, “damn, I wish I’d thought of that”).

    Having said that, I do have a couple of nitpicks, and a question:

    1) The Katrina analogy is a little misleading. If we think of the probability as a field, which I think is a very enlightening picture, it doesn’t actually collapse to zero in Galveston when we measure Katrina in New Orleans. In fact, the field has continuously evolved until the probability of it hitting Galveston became vanishingly small. When we measure Katrina and find it in New Orleans, the only thing that “collapsed” was our lack of knowledge; the probability field itself only ever evolved. I’m almost certain that this is what you meant, but it wasn’t entirely clear.

    2) I think that the whole confusion about “collapse” and the “measurement problem” is a very unfortunate historical accident. In the earliest days of QM it’s clear that people were doing the best they could with adapting profoundly counterintuitive concepts to otherwise inexplicable experimental results. Two things were clear: very small, isolated systems obeyed QM; and macroscopic systems appeared classical. Given how hard it was to get quantitative predictions about even the simplest of systems correct, the question of what happened between microscopic and macroscopic could be set aside for another day. Unfortunately, somewhere along the way the concepts of “collapse” and “measurement” went from being placeholders for “we don’t know, we’ll get back to it when we have a better grasp of the fundamentals” to being dogma.

    3) You talk about how overwhelmingly large the number of states is for a modestly sized macroscopic object. I think it would be useful (especially if you ever decided to expand this for a more general audience than the physics undergrad) to illustrate just how rapidly the number of states grows for even a handful of atoms. Even a system of just 100 atoms has 10^100 states. To put it in non-technical terms, if we drew a graph of the number of states, the transition from “purely quantum” to “apparently classical” would be so steep and rapid that it’s easy to see how it could be mistaken for instantaneous.
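
    As a rough illustration of how fast this grows (a toy calculation of my own; the choice of ten levels per atom is purely illustrative):

    ```python
    import math

    # Dimension of the joint (tensor-product) state space for n idealized d-level atoms.
    # The choice d = 10 is illustrative only; any d > 1 gives the same exponential blow-up.
    def log10_state_count(n_atoms: int, d_levels: int = 10) -> float:
        """Return log10 of d_levels ** n_atoms without overflowing."""
        return n_atoms * math.log10(d_levels)

    for n in (1, 10, 100, 1000):
        print(f"{n:5d} ten-level atoms -> ~10^{log10_state_count(n):.0f} basis states")
    ```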

    Finally, the question: is it possible to conceive of an experiment that would clearly distinguish between “decoherence” and “collapse”? Presumably this would start with a system that is purely quantum (say, a simple double slit) and then “measure” the outcome of that experiment not with an enormously macroscopic instrument such as a photographic film but instead with a microscopic system of, say, a few tens of atoms — small enough that the “instrument” still has measurably quantum behavior. This would definitively demonstrate that the ideas of “instantaneous collapse” and of a special role for observers or measurements are simply ill-conceived.

  • phayes

    “I have a certain amount of sympathy for the person who said ‘I want to have the many worlds interpretation without the many worlds’” – Tom Banks.

    That was Scott Aaronson in comment #10 but to me it looks much more like the relational interpretation:

    http://en.wikipedia.org/wiki/Relational_quantum_mechanics

  • Carl

    Michael #18: Yes, you are correct. I was being rather too casual with the use of the word “realistic”, in trying to emphasize that Bell does not require you to give up locality if you are willing to give up other things :-(

    On a different but related note, I’ve never understood the appeal of Many Worlds to a professional physicist. Surely it has all the problems of the Copenhagen interpretation (e.g. it replaces “instantaneous collapse” with “instantaneous splitting”, and you still have to define what a “measurement” is in order to say when the splitting of universes takes place) and then piles on top of it an infinity of undetectable universes. (I suspect its appeal to amateurs is because it sounds so science-fictiony and the philosophical implications are perfect fodder for late-night undergraduate conversations…)

  • Michael Bacon

    Carl #21,

    Not an expert by any means, certainly an “amateur”, but many years removed from being an undergraduate. :)

    Nevertheless, it seems to me that many-worlds predicts/retrodicts that wavefunctions appear to collapse when measurement-like interactions and processes occur via decoherence, but claims that the wavefunction does not actually collapse but continues to evolve according to the usual wave equation. This, I believe, emerges naturally from the linearity of the wave equation.

    People have attempted to construct non-linear theories so that microscopic systems are approximately linear and obey the wave equation, while macroscopic systems are grossly non-linear and generate collapse. I think Scott Aaronson’s comment above addresses this issue from a more logical/philosophical perspective.

    My understanding is that from a technical perspective all these efforts have made additional predictions which, when tested, have failed. Another reason for doubting that any collapse actually takes place is that the collapse would have to propagate instantaneously. Not fatal, but unpleasant and difficult to reconcile with special relativity and some conservation laws.

    Reason enough, it seems to me, for professional physicists to at least take it seriously.

  • http://www.scottaaronson.com Scott Aaronson

    Carl #21:

    Finally, the question: is it possible to conceive of an experiment that would clearly distinguish between “decoherence” and “collapse”?

    Unfortunately, I think the only possible such experiments would involve

    (1) taking some object that was previously considered to be a “conscious observer”—whether a human brain or (say) an artificially-intelligent computer, and then
    (2) showing that the object can be placed in a coherent superposition, where the different branches correspond to perceiving different measurement outcomes.

    If such an experiment were carried out, then any interpretation that insisted on collapse as an objective, physical process triggered by conscious observers would become untenable.

    Needless to say, we seem impossibly far from any such experiment at the moment. (The Zeilinger group has done the double-slit experiment with buckyballs, but brains are another matter! :-) )

    For any “smaller” experiment—such as the one you describe, involving a “measurement” performed by a few tens of atoms—a believer in objective collapse could simply say that the tens of atoms were nowhere close to “macroscopic” or “conscious,” and therefore no collapse was triggered. Such a believer would therefore make exactly the same predictions as the Many-Worlders, the Bohmians, and everybody else.

  • Tim Maudlin

    Scott Aaronson #10
    Your description of the situation is perfectly accurate, but both philosophers and physicists who have tried to make sense of it come to the same conclusion: there is, as you say, a double game going on with respect to the treatment of the wave function, and in the end the account is incoherent.

    Here is Bell, in “Against ‘Measurement’”:
    “Now, while quite uncomfortable with the concept of ‘all known observables’, I am fully convinced of the practical elusiveness, even the absence FAPP, of interference between macroscopically different states. So let us go along with KG [Kurt Gottfried] and see where this leads. ‘…If we take advantage of the indistinguishability of rho and rho* [i.e. the exact density matrix after interaction and the density matrix without any interference terms] to say that rho* is the state of the system subsequent to measurement, the intuitive interpretation of Cm as a probability amplitude emerges without further ado. This is because Cm enters rho* only via |Cm|², and the latter quantity appears in rho* in precisely the same manner as probabilities do in classical statistical physics…’” [This is a quote from Gottfried; the similarities to Banks are manifest. Bell continues...]

    “I am quite puzzled by this. If one were not actually on the lookout for probabilities, I think the obvious interpretation of even rho* would be that the system is in a state in which the various Ψs somehow coexist: Ψ1Ψ1* and Ψ2Ψ2* and… [i.e. Bell notes the Many-Worlds character of this obvious interpretation. He continues....]

    “This is not at all a probability interpretation, in which the different terms are seen not as coexisting but as alternatives: Ψ1Ψ1* or Ψ2Ψ2* or…

    “The idea that the elimination of coherence, one way or another, implies the replacement of ‘and’ by ‘or’, is a very common one among solvers of the ‘measurement problem’. It has always puzzled me.”

    Bell goes on to discuss how inadequate this approach is…everyone should read this piece.

    As you (Scott) say, quite correctly, if one wants to regard the collapse as merely Bayesian conditionalization, then there has to be some fact that is being conditionalized on (e.g. the cat being either alive or dead), and that physical fact ought to be represented in the physics. If one regards the wavefunction as complete, there is no such fact. So one way forward is to reject the completeness of the wavefunction, as EPR suggested. But then one has to say what the physics is postulating.

    Banks’s discussion of non-locality in #16 above shows exactly this confusion. He seems to think that it is trivial: the spins of the two particles were created to be anti-correlated on earth, and all you do when you find the result on Pluto is conditionalize on a pre-existing fact: just like Bertlmann’s socks. But we all know (and most clearly from GHZ) that this cannot be what is going on in QM. This double-thinking it-is-but-it-isn’t-Many-Worlds creates a fog of confusion that obscures Bell’s sharp result.

  • Tim Maudlin

    Carl #16

    No, Bell nowhere assumes anything that can be called “realism” in his proof. The result is a mathematical theorem. If you think that the theorem employs a postulate that should be called “realism”, please identify it and justify the name. Bell shows that no local theory can reproduce the predictions of standard QM. He takes those predictions to include the claim that experiments have unique results, so there is a fact about what the correlations among results of distant experiments are. One can say that Many Worlds rejects this claim, but that of course lands us in a host of interpretive problems.

  • tom banks

    Carl,

    I don’t think you’re right about the weather equations re: Katrina. If I take the initial conditions way back at the time when it wasn’t clear whether it would hit N.O. or Galveston, then the probability for hitting Galveston doesn’t go to zero by the time Katrina makes landfall. What happens in actual weather forecasts is that we continually make new measurements and apply the conditional probability rule to throw away those parts of the old predictions that said things that observation has proven wrong. In this way we get a distribution which gradually zeroes in on N.O. But if we take only the very early data, and don’t make any other observation until we ask which city got hit, then we have to make a sudden “non-local” contraction of the distribution from one that predicted a reasonable chance for both cities to be hit to one that predicts only N.O.
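
    To make the conditional probability rule concrete, here is a toy version of the kind of update being described (the cities and numbers are made up purely for illustration):

    ```python
    # Toy illustration of the conditional probability rule described above: start from an
    # early forecast over possible landfall sites, zero out the outcomes a new observation
    # has ruled out, and rescale what remains so the probabilities again sum to one.
    forecast = {"New Orleans": 0.45, "Galveston": 0.40, "Mobile": 0.15}   # made-up numbers

    def condition(dist, is_still_possible):
        kept = {k: p for k, p in dist.items() if is_still_possible(k)}
        total = sum(kept.values())
        return {k: p / total for k, p in kept.items()}

    # New observation: the storm track now rules out a Texas landfall.
    updated = condition(forecast, lambda city: city != "Galveston")
    print(updated)   # approximately {'New Orleans': 0.75, 'Mobile': 0.25}
    ```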

    To those of you who believe in a mystical “Reality” that does not consist in observation of macroscopic objects, let me illustrate briefly the concept of “unhappening”.
    The name is perhaps due to Susskind and Bousso, but I first heard the idea from Charles Bennett. Let’s think about writing the letter S on a piece of paper. We can say that “happened” and the letter S is “real”. Now, take a rock made out of anti-matter and (very quickly if you don’t want to burn your hand :) ) wrap the paper around it, as in the well-known game. Observers far away see a big explosion, but more importantly, according to the laws of QM, all the information that we previously considered to be “the reality of the letter S on the piece of paper” is encoded in the quantum phase correlations between the states of the photons (for simplicity I’m assuming that the annihilation products are all photons) that are rushing out into space at the speed of light. The QM wave function of the paper has “undecohered” (I’m deliberately using an ugly phrase here) and the “reality of the letter S” is now a property of spooky correlations between photons in very different places.

    So I would claim that our intuitive notion of reality and of things definitely happening is based on our brains’ lack of experience with microphysical problems.

    Here’s my suggestion for how SOMEONE might come to a more intuitive understanding of QM than our own. Imagine that the dreams of the quantum computing aficionados and those of the AI community are both realized, and we manage to build a quantum computer which becomes self-aware. Since its brain, unlike our own, uses quantum processes for computation, it MIGHT have some kind of “intuition” about QM that we can’t achieve. Imagine that we ask it to explain things to us, and get the answer, “It would be like trying to teach calculus to a dog”….

  • Carl

    #23 Scott: You are correct as far as something as complex as a brain goes.

    The suggestion I was clumsily trying for was that most (?) forms of objective collapse don’t give conscious observers a preferred status in defining a measurement (and those that do have even bigger issues…). Even as a thought experiment, this exercise forces them to face the question of what constitutes a “measurement” that causes “collapse”: 1000 atoms? 100? 10? As a thought experiment it is the natural extension of Schrodinger’s cat. As a practical experiment, it would be nice to show that collapse is simply a bad assumption.

    I’ve seen some very clever attempts to rescue collapse: a recent one is the suggestion that collapse can happen spontaneously and randomly to any particle in a system, and then ripples through the rest of the system; the more particles you have, the sooner one of them will collapse. This is very clever… except that it begins from the assumption that there is such a thing as collapse that needs explaining.

  • Carl

    #26 Yes, your description is correct insofar as it applies to our knowledge of Katrina. My point is that as an analogy for how QM works, it’s confusing: we know that the probability of hitting Galveston doesn’t suddenly collapse to zero, because in the macroscopic world there are inevitably other observers that we are ignoring for the sake of the analogy. Does that make more sense?

    As for the rest of your post, yes, I agree entirely. There is no reason to expect that we should be equipped to intuitively understand QM. The best we might be able to hope for is to understand a mathematical description of how it works. To extend your AI example, I recall reading (and I wish I could attribute this properly) a suggestion that it might be possible to put such a computer into a superposition of states for a period of time, and potentially be able to preserve the AI’s knowledge that it had been in such a state. You could then ask it how it “feels” to be in a superposition… but again, there is no guarantee you’d get back an answer that we could understand.

  • Don Page

    Tom Banks has given a nice discussion of probability and quantum mechanics (QM) but seems to have left open the question as to whether the quantum state (say in conjunction with some set of rules for how to interpret it) fully specifies our reality or not. One view is that QM assigns probabilities to mutually exclusive outcomes but that only one of the outcomes actually occurs, so the full specification of reality, including which outcome actually occurs, is not a consequence purely of the quantum state and of some set of rules. This view would seem to agree at least partially with what Einstein wrote in a 1945 letter to P. S. Epstein (quoted in M. F. Pusey, J. Barrett, and T. Rudolph, http://arxiv.org/abs/1111.3328, which Matt Leifer mentioned in the first response to Tom’s latest guest blog): “I incline to the opinion that the wave function does not (completely) describe what is real, but only a (to us) empirically accessible maximal knowledge regarding that which really exists … This is what I mean when I advance the view that quantum mechanics gives an incomplete description of the real state of affairs.”

    An alternative view, which I personally find more attractive, is that the quantum state (in conjunction with some rules) does fully specify the reality of our universe or multiverse (leaving aside God and other entities which He may have also created). In this case the probabilities the rules give from the quantum state must not be merely something like propensities for potentialities to be converted to actualities, with the question of which actuality in reality occurs not being fully specified by the quantum state. Rather, the quantum probabilities may be measures for multiple existing actualities, something like frequencies (but not necessarily restricted to be ratios of integers, i.e., to rational numbers).

    A broad framework in which this second viewpoint occurs is the many-worlds version of quantum theory, in which the quantum state never collapses but rather describes many actually existing Everett `worlds’ (each just part of the total reality, so that these are not the `worlds’ of philosophical discourse that denote possible states of affairs for the totality of reality). If one views the Everett worlds as making up a continuum with a normalized measure, so that there are an uncountably infinite number of worlds for each different outcome, then one can say that the quantum probability for a specific outcome is the fraction of the measure of the continuum with that outcome. This then allows the quantum probabilities to be interpreted rather as frequencies (though not necessarily as rational numbers) in an infinite ensemble of Everett worlds. This viewpoint would thus be a generalization of the frequentist view that Tom mentioned to the case in which the quantum theory provides its own infinite ensemble (the continuum of Everett many worlds), rather than requiring that “if we did the same measurement a large number of times, then the fraction of times or frequency with which we’d find a particular result would approach the probability of that result in the limit of an infinite number of trials,” as Tom nicely explained the more restricted interpretation of the frequentist view.

    Now I am NOT saying that the quantum state by itself logically implies the probabilities, or the measures for the continua of Everett many worlds in that framework. One could imagine a bare quantum framework, such as what the late Sidney Coleman advocated, in which there are just amplitudes and expectation values but no probabilities. However, to test theories against observations, it seems very useful to be able to hypothesize that there are probabilities for observations. Then we need rules for extracting these probabilities from the quantum state. In quantum cosmology, this is what I regard as the essence of the measure problem (which thus seems to me to be related to the measurement problem of quantum mechanics). I have shown in Insufficiency of the Quantum State for Deducing Observational Probabilities, The Born Rule Dies (published in Journal of Cosmology and Astroparticle Physics, JCAP07(2009)008, at http://iopscience.iop.org/1475-7516/2009/07/008/, as “The Born Rule Fails in Cosmology”), Born Again, and Born’s Rule Is Insufficient in a Large Universe that in a universe large enough for there to be copies of observations, the probabilities for observations cannot be expectation values of projection operators (a simple mathematization of Born’s rule that was stated more vaguely in Max Born’s Nobel-Prize winning paper). It is not yet clear what the replacement for Born’s rule should be. To me, the most conservative approach would be to say that the relative probability for each possible observation is the expectation value of some quantum operator associated with the corresponding observation, but what these operators are remains unknown.
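
    For concreteness, the “simple mathematization of Born’s rule” referred to here is the standard textbook statement that outcome probabilities are expectation values of orthogonal projection operators,

    $latex P(i) = \langle\psi|\Pi_i|\psi\rangle, \qquad \Pi_i = \Pi_i^\dagger = \Pi_i^2, \qquad \sum_i \Pi_i = \mathbb{1}, $

    and Don Page’s claim is that in a universe large enough to contain multiple copies of an observation, the probabilities of observations cannot all take this form, whatever the choice of the projection operators.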

    It used to be assumed that the complete set of dynamical laws for the universe (e.g., the full set of quantum operators and their algebra in a quantum framework) is the Theory of Everything, but, particularly with the brilliant work of James Hartle and Stephen Hawking in their proposal for the Wave Function of the Universe, published in Physical Review D 28, 2960-2975 (1983), available at http://prd.aps.org/abstract/PRD/v28/i12/p2960_1 (even if perhaps not quite right in detail; see Susskind’s challenge to the Hartle-Hawking no-boundary proposal and possible resolutions), it has been recognized that one also needs the quantum state. Now with the demonstrated failure of Born’s rule and the measure problem of cosmology, we realize that one also needs as-yet-unknown rules for extracting the probabilities of observations from the quantum state. However, using the words of Richard Feynman to me in 1986 about his former Ph.D. advisor John Wheeler on another issue, the “radical conservative” viewpoint on this issue is that the quantum state and the rules for extracting the probabilities of observations (interpreted as normalized measures for many actually existing observations) is a complete specification of the reality of our universe or multiverse.

  • tom banks

    I just happened to be looking through our department’s QM course website and found a slogan from my colleague Scott Thomas that crystallizes what I was trying to say about Reality:

    OBJECTIVE REALITY IS AN EMERGENT PHENOMENON

  • Tim Maudlin

    Tom, re. #26

    It would be very helpful if you could clarify what you intended with the example of the paper. You write ‘To those of you who believe in a mystical “Reality” that does not consist in observation of macroscopic objects, let me illustrate briefly the concept of “unhappening”.’ Why the word “mystical” here? I believe that the overwhelming majority of the physical world is not macroscopic and/or not observed. For example: the precise details of the convection in the Sun right now, the distribution of dark matter in the Milky Way, the present weather patterns on the far side of Neptune, everything that happened in the first few nanoseconds after the Big Bang, the geometry of space-time at the Planck scale. I believe that Socrates had a particular blood type, even though no one ever did or ever will observe what it was. This is not “mystical” reality, it is physical reality: the very thing that physics is about.

    Now somehow, the fact that one can destroy a piece of paper is supposed to cast doubt on such reality. How can that be? If you wrote an S on the piece of paper, then of course one can say it “happened” and was “real”: that says no more nor less than that you wrote an S on the paper. If we destroy the paper with anti-matter (or just burn it), it may become practically impossible for distant observers, or historians, to tell what was written on it. So what? If the laws of physics are deterministic then the “information” about what was on the paper exists in principle at all times (more accurately: on every Cauchy surface) because the physical state on each such surface together with the laws imply the complete history of the universe. If the dynamical laws are not deterministic (as you seem to accept), then the “information” gets lost. But nonetheless, you wrote an S on the paper, whether or not distant observers or future historians can tell. What can this possibly have to do with the idea that “things definitely happen”? Lots of things—most things—that happen are never observed or known. If we did not believe that there were such things, what would we be building expensive telescopes for? There is a world out there that exists independently of being observed, and it is that world that physics is trying to understand.

  • Henry

    #26: Your unhappening example is bad as it stands. Memory traces in the brain of the S writer provide the needed environmental decoherence even after the paper has been annihilated. But that is really easy to correct; take a larger lump of antimatter and annihilate the writer as well. :)

    #31: According to some interpretations of quantum mechanics, maybe Socrates had a superposition of blood types and that superposition never collapsed… (Then again, maybe Socrates was a fictitious character invented by Plato.)

  • Will

    A quantum formulation of a classical system will have classical superselection sectors, with one corresponding to each classical configuration. Any operator invented which mixes superselection sectors is unphysical and should not be counted.

  • Tim Maudlin

    Henry #34
    Our job here is to figure out which interpretations of quantum mechanics are coherent or should be taken seriously. Do you think that such an interpretation, if it exists, should be a topic of serious discussion?

  • Tim Maudlin

    Will #33
    Superselection at best does the same job as decoherence…which is to say, it does not solve the measurement problem.

  • ursus_maritimus

    So do people agree with Tom that entanglement is in no way non-local and is actually just a trivial ‘readout’ of a result that was fixed as soon as the entangled particles were projected in different directions?

  • http://jbg.f2s.com/quantum2.txt James Gallagher

    ursus_maritimus #36 He doesn’t say the result was fixed; he says the correlation was fixed at the time the entangled particles were created. There is no counterfactual definiteness here: the particles are in an up/down or down/up state, but we cannot know which, even in principle, until a measurement is made on one of the particles. This is not the same as the naive Bertlmann’s socks situation, where a counterfactually definite state of the socks exists before anyone looks at either of them.

  • http://www.scottaaronson.com Scott Aaronson

    Tim Maudlin #34: I’m curious, which interpretations (if any) have you found coherent or worth taking seriously?

    (Personally, I’ve found pretty much every interpretation’s criticisms of the other interpretations to be extremely persuasive and cogent. :-) )

  • Doubter

    The other interpretation, often called Bayesian, is that probability gives a best guess at what the answer will be in any given trial.

    How do we test the claim of “best”? Can we do it without getting frequentist?

  • http://blogs.discovermagazine.com/cosmicvariance/sean/ Sean

    I don’t have much useful to contribute, but I can at least go on record with my current beliefs. I’m one of those barbarians who thinks the wave function is “real,” in any useful sense of the word “real.” And that all it does is to evolve unitarily via the Schrodinger equation. (The point of view that has been saddled with the label “many-worlds.”) Our job is to establish a convincing interpretation, i.e. to map the formalism of unitary QM onto the observed world. There are certainly difficulties here, and I’ll confess that I don’t really see how treating the wave function simply as a rule for predicting probabilities helps to solve them. (In particular, as Tim emphasizes, it seems to leave hanging the old Copenhagen problem of when wave functions collapse, which in this interpretation becomes “when we have made an observation over which we can conditionalize future predictions.”)

    One problem is why we observe certain states (live cat, dead cat) and not others (superposition of live cat + dead cat). Here I strongly think that a combination of decoherence plus the actual Hamiltonian of the world (in particular, locality in space) will provide an acceptable answer, even if I don’t think I could personally put all the pieces together. I’m more impressed by the problem in MWI about how to get out the Born rule for probabilities, which Tom seems to simply take as “what the wave function does.” I can’t swear that this attitude is somehow illegitimate, although I’d certainly like to do better.

    I thought the discussion of non-commuting observables in classical mechanics was actually the most interesting part of the post.

  • tom banks

    Tim

    I definitely believe in “a world out there”, but I think that our brains are incapable of coming to an intuitive understanding of it, by which I mean one which will satisfy your notion of Reality in which a thing either exists or doesn’t. The essence of QM is that even when everything definite that can be known at a given time is known, there are still things which are uncertain. That’s what the math says. And we know that by manipulating those mathematical rules and interpreting the things that look like probability distributions as probability distributions, we make predictions and give explanations about the micro-world which are in exquisite agreement with experiment. And “local realist” alternative explanations, as we all know, cannot agree with those experiments.

    I actually think the first part of my post is much more interesting than this retread of all the discussions of whether decoherence does or does not solve the measurement problem. It seems to me significant that you cannot write down the mathematical formulation of classical logic, which is the mode of thought motivated by the realist position, without automatically raising all of the issues of QM. They’re there in the math and the world uses them.

    For me, the philosophical conclusion from all of this is the one summarized in Scott Thomas’ slogan. Everything for which we can affirm the ideas of “reality” by experiment is one of these big composite objects for which QM predictions are indistinguishable from those of classical statistics obeying the conditional probability rule. My conclusion from this is that because our brains evolved only to deal with such objects, we think of Objective Reality in the wrong way. The real rules of the world just won’t fit into that straitjacket. And if we can’t come to a satisfactory intuitive understanding of them, that’s our defect, not a defect of the rules. There’s no principle that guarantees that we can understand the microscopic rules of the world. Indeed, if one believes in the evolutionary story of the origin of our consciousness (I call it only a story, not because I disagree with the overwhelming evidence for evolution, but because I think we don’t understand consciousness well enough to know if that story is not missing some important ingredient), then it seems incredible that we understand as much as we do.

    By the way, you may not know this, but one of the predictions of a universe with a positive cosmological constant, as ours seems to have, is that the Sombrero galaxy will eventually pass out of our cosmological horizon, never to be seen again (actually, just like an object falling into a black hole, we will never see it leave the horizon, but the signals from it will be redshifted into oblivion). According to the Holographic Principle and the Principle of Black Hole Complementarity, this is to be interpreted, from our point of view, as the galaxy being absorbed into a huge thermal system living on the horizon. On the other hand, from the Sombrerinos’ point of view, they go on their merry way and WE are absorbed into a thermal system on their horizon. The two points of view are Complementary, meaning that macroscopic observations by the Sombrerinos do not commute with our own. It’s perfectly consistent, because we’re using different notions of time, and any experiment we can do which would bring back data proving that they actually burned up has the effect of burning them up. These theoretical ideas have not yet been subjected to experimental test, but at least my version of them leads to experimental consequences, which are probably falsifiable by the LHC.

    This may help you understand why I think Objective Reality is overrated…

  • Boaz

    It seems that the upshot of this post is that quantum mechanics is what you get when you have some observable things in a system that don’t commute, i.e. it matters in which order you measure them, or something along those lines. From this perspective, the problem is shifted to asking which things have this property of having non-commuting observables. We already know that non-relativistic quantum mechanics isn’t an exact framework to describe everything that happens in the universe (consider relativity, for example). So this somehow says: there are things out there with non-commuting observables. And quantum mechanics is the logical framework to describe such things.

    It’d be interesting to try to formalize the example of the hurricane, or something like it, and to see what all the odd things about quantum mechanics would say in that case. And if there are such examples- (maybe from finance, or other fields as well?) then perhaps by building up a repertoire of such examples we could approach an intuitive understanding of quantum mechanics.

  • Tim Maudlin

    Scott # 37
    Bell himself promoted both the Bohmian approach and the GRW collapse approach as theories where all of the physics is in the equations (rather than in some vague surrounding talk) that can demonstrably recover all verified predictions of non-relativistic QM. They both postulate a physical world that evolves irrespective of “measurement” or “observation”. There are extensions of both to cover field theory/particle creation and annihilation. These are theories that provide clear, exact answers to any clearly stated physical question. I would be happy to consider any theory that is equally precisely formulated. I have problems with Many Worlds concerning the interpretation of probability, and connection to experimental outcomes.

    No doubt, all of these theories have unpleasant aspects. But they are clear and consistent, and remind you of what a clear physics can be. And they are ongoing projects, being expanded and refined.

  • ursus_maritimus

    #37 James

    Then how does the entangled particle come to ‘know’ what state to be in (up or down) ‘instantaneously’ once the observer makes the measurement on its partner that allows us to know in principle?

  • Ray Gedaly

    Professor Hillel is said to have been asked by a student to explain his Quantum Physics 101 textbook while standing on one foot. Hillel, standing on one foot, replied to the student “Quantum Theory basically explains that some things cannot be measured precisely and thus nothing is certain; the rest of the book is just commentary; now go home and do your homework problems.”

  • Rick

    Wow!! What a great discussion.

  • Tim Maudlin

    Tom # 41;

    Thanks for replying, but I’m afraid I still don’t see the point of the example with the note. If you destroy the note, and the physics is deterministic, then the “information” about what was on the note still exists but is, as it were, widely scattered. Why think that undermines any interesting physical view?

    Classical mechanics, of course, is not formulated in terms of what anyone knows. The complete physical state of a system is specified by a point in phase space, and under the usual Hamiltonian laws of motion that phase point evolves deterministically. Given that information, and the exact physical specification of any possible physical interaction of this system with another, the physics yields a perfectly determinate outcome. So there is no intrinsic “uncertainty” or “probability” in classical physics with respect to the outcome of any exactly specified physical situation. One could, in principle, ask whether there are physical limits that arise in classical mechanics concerning how much one could know about a system, but nothing in your post suggests such limits. Meaningful physical questions are about the outcomes of experiments, not about abstract “observables” cooked up in the math. And in classical physics, if you know all that can be known about a system (the exact phase point) and all that can be known about the experimental situation (the exact phase point), then there will be a unique, exact prediction of the outcome of the experiment. If you disagree with this, please describe a physical experiment in a classical setting where it fails.

  • ursus_maritimus

    #41

    Well, if the most talked-about preprint on the internet is to be taken as true, belief that the wave function is real and not a handy statistical abstraction will soon hardly be thought of as ‘barbarian’.

  • http://jbg.f2s.com/quantum2.txt James Gallagher

    ursus_maritimus #44

    Ay, there’s the rub. :-)

    We don’t know how; we just know that this is the way nature works, as has been demonstrated in many experiments by Clauser, Aspect, Zeilinger et al.

    Of course, we can make suggestions, but if your suggestion requires both locality and counterfactual definiteness to be true then it does not describe nature.

    Nature doesn’t really care whether we humans find her difficult to comprehend.

  • phayes

    “I’m one of those barbarians who thinks the wave function is “real,” in any useful sense of the word “real.” And that all it does is to evolve unitarily via the Schrodinger equation. (The point of view that has been saddled with the label “many-worlds.”) Our job is to establish a convincing interpretation, i.e. to map the formalism of unitary QM onto the observed world.”

    So, barbarian, where can I buy a wave function / probability meter? ;-) Although I’d agree with Tom that Jaynes was wrong to assert – as he did – that there *must* be some discernible ‘reality’ underlying QM in order to avoid the “probability / wave function is ‘real’” mind projection fallacy, a mind projection fallacy it surely is. There are at least as many ways (Aerts, Rovelli, De Muynck…) out of that rabbit hole as there are out of the “FTL neutrino would destroy SR and causality” rabbit hole.

  • Jeff

    Ursus, what preprint is this? Maybe I’m the barbarian for missing it…?

  • Jeff

    Also, Tom, regarding #14: I’m afraid I still don’t understand what happens in the decoherence picture. Alice measures Particle A, and gets the answer |+>. No matter what, if Bob measures Particle B in the same basis, he must get |->. This is only possible if Particle B *was in the |-> state* at the time he measured it. Thus Alice collapsed Bob’s particle to that state.

    It doesn’t matter who collapsed it first, but once it is collapsed, *it is collapsed*. Why does discussing the energy microstates of the needle help? I don’t know what you would call this interpretation, but to me the wavefunction is a real thing that collapses instantaneously. Just because that collapse doesn’t transfer any actual information doesn’t mean it isn’t real. The correlations seen in Bell tests, and the loophole tests, seem to show that it collapses, or effectively collapses.
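
    The arithmetic behind that perfect anticorrelation is easy to check directly; here is a minimal sketch (a toy calculation of my own, using the standard singlet state):

    ```python
    import numpy as np

    # Minimal sketch of the textbook singlet-state arithmetic behind the claim above.
    up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)   # (|+-> - |-+>)/sqrt(2)

    # Alice finds |+>: project the joint state onto |+> for particle A and renormalize.
    proj_A_up = np.kron(np.outer(up, up), np.eye(2))
    post = proj_A_up @ singlet
    post /= np.linalg.norm(post)

    # Probability that Bob then finds |-> in the same basis:
    proj_B_down = np.kron(np.eye(2), np.outer(down, down))
    print(np.vdot(post, proj_B_down @ post).real)   # ~1.0 -> perfect anticorrelation
    ```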

    Loving this discussion, by the way!

  • Doubter

    “Note that if we think of the probability function P(x,t) for the hurricane to hit a point x and time t to be a physical field, then this procedure seems non-local or a-causal. The field changes instantaneously to zero at Galveston as soon as we make a measurement in New Orleans. Furthermore, our procedure ‘violates the weather equations’. Weather evolution seems to have two kinds of dynamics. The deterministic, local, evolution of P(x,t) given by the equation, and the causality violating projection of the probability of Galveston to zero and rescaling of the probability of New Orleans to one, which is mysteriously caused by the measurement process. Recognizing P to be a probability, rather than a physical field, shows that these objections are silly.”

    My problem with the paragraph quoted here is that the weather equations are not expected to also describe the measurement and the rescaling of probabilities. However, Quantum Mechanics, as a complete description of nature, is supposed to describe the quantum system as well as the measurement process and the rescaling of probabilities. I suppose the claim is that later on, decoherence and environment-induced superselection (I believe both are necessary) explain how this happens quantum mechanically.

    But at this point in the discussion, “these objections are silly” is not justified.

  • Doubter

    In the classical B,Q ammonia molecule system, how do I measure B? How does the measurement of B affect Q (not at all?)? Isn’t this rather different from quantum mechanics?

  • Randy

    One thing that Tom neglected to mention in his discussion of the exponential size of the configuration state space is that you actually have to create a state space of this exponential size to calculate results with quantum mechanics. How does nature manage to pull off this trick, without also using the informational equivalent of such a massive state space? People wave their hands and hope for quantum computers to perform this magic for them, but to me this indicates that the theory is fundamentally broken as a physical model, right from the start. You don’t even need to talk about EPR or anything fancy to see this. This simply cannot be a description of how nature actually operates, because nature wouldn’t have room for all the information required to perform the necessary calculations (we’re talking 10^(10^80) for all the atoms in the visible universe, for example).

    One can trace this problem to the linear nature of the Schrodinger equation: the only way to capture interactions with a linear function is through brute force exponential tensor product space. Yet we know that the Schrodinger model neglects the radiation reaction of electrons onto themselves, which introduces an important source of nonlinearity. Nonlinear systems are much more difficult to analyze, but there is really no justification for thinking you can use a linear equation to model electrons or any other charged particle accurately.

    The only sensible conclusion is that, consistent with the main article, this whole edifice is indeed a probability calculus that happens to produce reasonably accurate results, as long as you don’t need the full power of QED, which does model the radiation reaction but is much more unwieldy than the standard QM model — somehow it is always neglected entirely in such discussions.

    It just seems obvious that, as Einstein believed, there is a more complete, complex, nonlinear model that explains what is actually going on in nature, which happens to reduce to the standard QM probability calculus results under some appropriate simplifying assumptions. At the very least, there is absolutely no reason to think that standard QM accurately describes known physical processes – QED tells us that it manifestly does not. Why does everyone seem to think that it should?

  • Boaz

    Realized I misread the hurricane example. It’s not an example that is the same as QM, but it gives a different perspective on collapse of the wave function. So QM is what results for probabilistic processes with non-commuting observables. I wonder if one can cook up an example in a completely different realm than micro-physics which would have these properties. Like something in the realm of relationships or communication. Does person A love person B, for example? If person B tries to determine the answer to this question, it may well affect the outcome. Is there some other less loaded subject (and more able to be made precise) where we might find “observables” that are intrinsically affected by measurement, and which, assuming a probabilistic framework, can be modeled by quantum mechanics?

  • Tom Banks

    Sean asked me to comment on this quote from Tim’s #24

    “if one wants to regard the collapse as merely Bayesian conditionalization, then there has to be some fact that is being conditionalized on (e.g. the cat being either alive or dead), and that physical fact ought to be represented in the physics”

    The last phrase here is the key difference between Tim and myself. Tim is trying to use probability in its classical sense, i.e. that there are “facts” which exist independent of measurement. I don’t like the word measurement because it implies an intelligent experimenter. For me, any change in what I’ve called a macroscopic quantity is a measurement of something else, which might be another macroscopic quantity. My point of view is that without such macroscopic changes there doesn’t seem to be any object in the real world that corresponds to what realists want to call facts.

    Again, I think it’s extremely important to distinguish between THE REAL WORLD, which in agreement with the realists I think is something that’s out there, independent of our mental processes, and our theory of prediction in the real world. Making the assumption that everything in THE REAL WORLD can be represented by a “fact” in the realist sense contradicts the formalism of quantum mechanics. I’ve argued that if you set up the most general mathematical theory of such facts, i.e. mathematical logic, then it also has in it other objects which can’t be considered facts in this sense. Accepting this requires us to rethink what we mean by probability. To me, Tim’s comment reflects only the original meaning of probability, which was used to talk about a hypothetical world where everything was a fact, but we were merely ignorant of some of the facts.

    I’d like to hear a realist comment on a fantasy world in which the fundamental equation of physics was the Langevin equation with a Gaussian random force (say a white noise spectrum, it doesn’t really matter). Langevin interpreted his random force as due to the effect of facts we couldn’t measure, namely random collisions with atoms in a classical model of atoms. I want you to instead assume that experimental investigation has shown that there are no atoms. There’s no mechanical, realist, hidden variable explanation of the random force term.

    The mathematical solution of this model is the same as the one in which we postulate that the force comes from atoms, and let us assume it has perfect agreement with experiment, but experiment has shown that there are no atoms, and a bold physicist just says, “Tough, girls and guys, this is the way the world works. The fundamental equations of physics are random.” And let me be clear. I’m not suggesting that there’s some particular fixed time-dependent force, which is chosen from the Gaussian probability distribution. I’m suggesting that the correct mathematical procedure for reproducing the results of all experiments is to solve Newton’s equations with a random force and average over the distribution.
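
    As a toy version of what the “fundamental theory” of this fantasy world would look like in practice (my own illustration; the free-particle setup and the numbers are arbitrary):

    ```python
    import numpy as np

    # Toy "fundamental theory" of the fantasy world described above: Newton's equation for
    # a free particle driven by a Gaussian white-noise force. Predictions come only from
    # averaging over the noise distribution, not from any hidden, definite force history.
    rng = np.random.default_rng(0)
    dt, n_steps, n_realizations = 0.01, 1000, 5000
    sigma = 1.0                                   # noise strength (illustrative)

    x = np.zeros(n_realizations)                  # one position per noise realization
    v = np.zeros(n_realizations)
    for _ in range(n_steps):
        force = sigma * rng.normal(size=n_realizations) / np.sqrt(dt)   # white noise
        v += force * dt
        x += v * dt

    # The theory predicts only ensemble statistics, e.g. mean and spread of the position:
    print(f"<x> = {x.mean():.3f}, spread = {x.std():.3f}")
    ```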

    I believe that such a theory would be just as objectionable to a realist as my statements about QM. In the fantasy world there would be endless searches for the “source of the randomness” and a complaint that such a theory had no physical facts in it. But I want you to imagine that all such searches had a null result.

    QM is much nicer than this, because, with hindsight, we can see that its inevitable randomness is forced on us by mathematical structure that we can’t deny. In Koopman’s formulation of classical mechanics we can define probability distributions for an infinite set of quantities which don’t commute with position and momentum. Every assumed state of the classical system defines these distributions, and their time evolution can be extracted from Schrodinger’s equation.

    When Newton et al. did CM, they were just exploiting the very particular form of the Koopman–Liouville Hamiltonian, which guarantees that the ordinary position and momentum satisfy equations that are independent of the values of all these other observables. But if you pay attention to them, the theory has all of the conceptual subtlety of QM, because it’s just a special form of QM. The analog in our model ammonia molecule is dynamics in discrete time, which simply flips or doesn’t flip s_z from + to -, according to some rule at each time.
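
    For readers who haven’t met it, the Koopman–von Neumann construction being invoked here can be sketched (schematically, and up to sign conventions) as a Schrödinger-like equation on phase space:

    $latex i\,\partial_t \psi(x,p,t) = \hat{L}\,\psi, \qquad \hat{L} = -i\left(\frac{\partial H}{\partial p}\frac{\partial}{\partial x} - \frac{\partial H}{\partial x}\frac{\partial}{\partial p}\right), \qquad \rho(x,p,t) = |\psi(x,p,t)|^2 . $

    Because the Liouvillian contains only first derivatives, $latex |\psi|^2$ obeys the ordinary Liouville equation, and the multiplication operators x and p commute with each other and follow the classical trajectories; but operators built from $latex \partial/\partial x$ and $latex \partial/\partial p$ do not commute with x and p, and it is the probability distributions of these extra quantities that the Koopman formulation carries along.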

    So to give the final answer to Tim, I’m conditioning on the life or death of the cat, which, since the cat is a macro-object, IS a fact. Fact is an approximate concept. Objective Reality is an Emergent phenomenon. It’s hard for us to accept because our brains weren’t built to understand it.

  • David Brown

    To all those who uphold quantum theory with the infinite nature hypothesis, I ask 2 questions: What is the physical meaning of the monster group? Are there 6 quarks because there are 6 pariah groups?

  • Doubter

    arxiv:0312059v4 tells us that there is a theorem by Zurek, 1981, to the effect:
    Quote:
    “When quantum mechanics is applied to an isolated composite object consisting of a system S and an apparatus A, it cannot determine which observable of the system has been measured.”

  • ursus_maritimus
  • CSCO

    Tom,
    I think it is worth mentioning that we cannot escape the inevitability of the concept of the state. In non-relativistic QM we can readily identify the complete set of commuting observables, which can then be used to define an object’s state.

    What is more interesting is how this is reflected in the model of the hydrogen atom. A state vector is a continuous function that resides in Hilbert space. It is the quantum numbers (the CSCO) that define the apparent shape of this function, which we understand as a “real” spatial distribution for the electron clouds.

    That reality just tells us where we are likely to witness a photon interact with an electron that has certain quantum numbers. Are the quantum numbers real? We can understand those numbers as being conserved in non-relativistic QM, so they represent information that is somehow stored.

    At any one time, though, I only have a certain amount of information, and that information can only be consistent with certain states. What we have found out via Bell’s Inequalities is that all states that are consistent with the information we possess must be evolving in an entangled way. Our observation of the “real” state will provide us with a consistent result, but there is nothing, prior to one’s own observation, that will give that state a particular preference over any other.

    The possibility that some hidden observer might claim that they were the ones that caused some particular reality before our own observation can only be viewed as an a posteriori result that was part of the original set of observational outcomes. Meaning I cannot provide any a priori preference to a hidden observer’s view of the world (except for the fact that their observed world will always be perfectly consistent with one’s own, once all the information is compared). IOW, it is entirely possible that there is a consistent observation made where there is no hidden observer making wild claims of being the “real” observer.

    What is interesting is that this places some weight on the question of free will and choice. To some degree, the world one sees is a “real” manifestation of the choices an observer makes. If we accept that a person’s will is not random (although it should be noted that we can see the choices of a large number of people as at least chaotic, if not purely random), then we have to give weight to the idea that whether a person knows if the cat is dead or not depends on the choice of opening the box. If one chooses not to open the box, then one can rest assured that there is always some possibility that the cat is alive.

    What this all really comes down to is how one manages their time and choices.

  • http://jbg.f2s.com/quantum2.txt James Gallagher

    #55 Randy

    QED is consistent with the Feynman Path Integral formulation of QM, which is mathematically equivalent to the Schrödinger “picture” of evolution of the State Vector. The reason we don’t use the Schrödinger “picture” in QED (or any other effective QFT) is that it is impractical to construct/solve the Hamiltonian. This is difficult even in simpler perturbation scenarios with non-relativistic Hamiltonians, e.g. see http://en.wikipedia.org/wiki/Perturbation_theory_%28quantum_mechanics%29

    However, in principle, if we could determine the correct (Hilbert) state space and Hamiltonian then QED would be described by the Evolution of a probabilistic State Vector just like in non-relativistic QM.

  • Frank Martin DiMeglio

    Intelligibility and reasonableness are lacking in quantum mechanics. The fundamental question is how gravity and inertia fundamentally relate to distance in/of space. Both of these forces must be at half strength/force. As the philosopher Berkeley said, the purpose of vision is to advise of the consequences of touch in time. Invisible and visible space must be balanced, and instantaneity must also be part of the true unification of physics (of quantum gravity, electromagnetism, gravity, and inertia). Opposites must be combined and included.

  • Tim Maudlin

    Tom # 57

    We are definitely making some progress in clearing up misunderstandings. If I follow correctly, you have misunderstood the view you are calling “realism”. I take it that, for example, you might as well use Nelson’s stochastic mechanics as your example: there is a fundamentally indeterministic dynamics, not underpinned by any deterministic theory. To my knowledge, absolutely no one who calls themselves a realist, or has trouble with the sorts of claims being made here about QM, has any objection or problem at all in such a case. The GRW theory, of course, has an irreducibly random dynamics, and even people working on Bohmian approaches have used irreducibly random dynamics. So when you say “I believe that such a theory would be just as objectionable to a realist as my statements about QM”, the statement is just flatly false, and indicates that there has been a fundamental misunderstanding. In stochastic mechanics, even if the dynamics is not deterministic, there is a quite precise set of trajectories that the particles actually follow in any given case, and all macroscopic facts are determined straightforwardly by these microscopic facts: whether the cat in fact ends up alive or dead is a relatively simple function of what the individual particles in the cat do. So it is not true that randomness or indeterminism is inconsistent with what is called “realism”. It is also not true that any indeterministic theory will display the sort of non-locality that Bell worried about: your Langevin-equation world will not allow for violations of Bell’s inequality… or, in any case, it is trivial to specify local indeterministic theories that do not allow for violation of the inequalities. So the diagnosis that somehow realists can’t handle a fundamental failure of determinism is incorrect. But we would like the cat’s ending up alive or dead, which it does, to be explicated in terms of what the particles in the cat do, since the cat is nothing more than the sum of its particles. If you say that the individual particles do not end up anywhere in particular, for example, then neither does the cat. So it can’t be that the only reality is macroscopic reality.

  • Moshe

    To Tim, and other proponents of a realist stance: perhaps it would be useful for the rest of us to get a more precise idea of what you mean by the predicate “exist” or the noun “reality”. Tom expresses here very precisely a view that many physicists, like myself, have as a gut feeling. Namely, that any attempt to formalize these notions would necessarily have to rely on some features of classical physics (Tom has identified concretely which features those might be). If so, any attempt to have a “realist interpretation” is secretly an attempt to reduce the quantum to the classical (perhaps with a stochastic element). Do you have any definition of “realism” not relying on the features Tom identifies?

  • Frank Martin DiMeglio

    The discussion of the fullness of reality should include/address the following. Instantaneity requires that larger and smaller space (invisible and visible space, and inertia and gravity) be combined, balanced, and included. The balancing/unification of quantum gravity, inertia, electromagnetism, and gravity requires this. Also see my prior post (63) please.

  • TSny

    I’m trying to understand in what sense the operator sx corresponds to an “observable” in the classical scenario. Dr. Banks defines it as the *operator* that moves the N atom to the other side of the plane of H atoms (or flips the molecule). What is the operational procedure we would go through to “measure” this “observable”?

    If we define the “quantity” B to be just sx, then according to the probability expression stated in the post, P(B) = ½ for “measuring” B to have a value of +1 (or -1). What does this mean physically in the classical scenario?

    Also, can anyone give me a good reference for Koopman’s formulation of classical mechanics?
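
    One way to see where the ½ comes from, on my reading of the post’s notation (taking Q to be σ_z, with the classical configuration “N above the plane” its +1 eigenstate, and B = σ_x the flip operator): expanding that configuration in σ_x eigenstates gives equal weights,

    $latex |{\uparrow}_z\rangle = \tfrac{1}{\sqrt{2}}\bigl(|{+}_x\rangle + |{-}_x\rangle\bigr) \;\Longrightarrow\; P(B=+1) = P(B=-1) = \tfrac{1}{2}, $

    so a state with a definite value of the position-like quantity Q carries no information about the flip quantity B, which is presumably the content of the P(B) = ½ statement being asked about.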

  • Pingback: Guest Post: David Wallace on the Physicality of the Quantum State | Cosmic Variance | Discover Magazine

  • Pingback: Interpretation of Quantum Mechanics in the News | Quantum Mechanics Blog

  • Doubter

    This note says superpositions are not possible in classical mechanics, even with Koopman’s tricks: http://prl.aps.org/abstract/PRL/v105/i15/e150604

    Arxiv: http://arxiv.org/abs/1006.3029

  • Randy

    #62 Gallagher:

    But isn’t it the very nonlinearity of the self-field coupling (i.e., virtual particle loops) in QED that makes it impossible to use in the standard QM picture? It seems to me that this “impracticality” may be pointing to the deeper issue that the standard Hilbert space depends on linearity and nature just doesn’t work that way. Is it a polite fiction which is leading us astray?

  • http://jbg.f2s.com/quantum2.txt James Gallagher

    #71 Randy

    The nonlinearities are emergent from the (complicated) linear evolution. Otherwise you would have a simple argument to debunk MWI, for example.

  • Tim Maudlin

    Moshe # 65
    I think that this discussion of the terms “real” and “exist”, and even the term “realist”, is rather backwards. It is not that “realists” think that these are interesting terms in need of some fancy definition. “Real” and “exist” are typically redundant: thus “I own a car”, “I own a real car”, “I own a car that really exists”, “I, who exist, really own a real car that really exists in reality” all say exactly the same thing. As a philosopher says, they have the same truth conditions. And “I own a car” does not use the terms “real” or “exist”, so they play no substantial role.

    What “realists” want is simple: they want a physical theory to state clearly and unambiguously exactly what it postulates, and the laws (deterministic or probabilistic) that describe how what it postulates behaves. Anything short of that just isn’t a physical theory. It may be good advice about how to bet in certain circumstances (in which case it at least postulates the existence of the circumstance!), but it isn’t a physical theory of those things.

    There are cats. Cats are real. Real cats exist. These all say the same thing. Cats are macroscopic items: you can see them with your naked eye. They are made up out of microscopic parts. All the “realist” wants is to know what those parts are, and how they behave. If you say “cats exist, but they ‘emerge’ out of things that don’t exist”, that makes no sense: nothing can “emerge” out of what doesn’t exist, since there is nothing to emerge out of.

    Banks’s view seems to be this: there is some microscopic reality, some small parts that make up cats, but we are incapable of comprehending them. That is a possible view, but I fail to see any argument that suggests it is correct. There certainly are clear, comprehensible physical theories that predict exactly the behavior at a macroscopic scale that is predicted by quantum theory. So there is nothing in the empirical predictions that rules out a comprehensible account of the physical world.

  • Moshe

    Tim, thanks for your response. We might have reached a point of mutual incomprehension (not uncommon in this medium), but let me try one more iteration.

    To understand where I am coming from, let me make an analogy to the crisis in the foundations of mathematics in the early 20th century. In that case people asked questions (about the set of all sets and so forth) that made complete intuitive sense at the level of natural language, but did not have self-consistent answers. This was resolved only after a systematic effort to formalize the foundations of the subject. Part of the result was the clarification of which natural-language questions were “askable”; many of them aren’t.

    In keeping with this analogy, and given that quantum mechanics (as far as we know) is a precise, consistent and complete mathematical framework, the natural reaction to a set of confusing natural-language questions is that these may not be “askable” in the same sense — they certainly don’t seem to be well-formed questions within the framework of conventional quantum mechanics (for reasons that Tom points out). Maybe that is reason to doubt conventional QM, but I am wondering if anything short of a return to some form of classical mechanics can be deemed to be “realist”.

  • Tim Maudlin

    Moshe

    I don’t think that the foundational issues in set theory were resolved in the way you suggest. The problem wasn’t in natural language or in what questions “can be asked”: the problem was that the axioms of naive set theory were demonstrably inconsistent (implied a contradiction). And the fix was not to somehow restrict the language, much less to revise the logic used in the derivation: the fix was to change the axioms (to, for example, ZFC). I don’t know what questions in set theory you think are not “askable”. An example here would help.

    Quantum mechanics uses mathematics, and in many cases (though not all: for example, saying that collapse occurs when a measurement occurs) the way to use the mathematics is precise. But neither quantum mechanics, nor any other physical theory, is just a piece of mathematics. The mathematics is supposed to be used to describe or represent some physical reality, and one wants to know how that is done.

    For example, the EPR paper asks a simple question about the mathematical wavefunction used in QM: is it complete, that is, does specifying the wavefunction specify (one way or another) all of the physical characteristics of a system? Einstein et al. argued no, in which case it follows that a complete physical specification of a system requires more variables (usually incorrectly called “hidden variables”). The mainstream of physics rejected that idea, in part mistakenly believing that von Neumann had proven it not to be possible. So this is a clear question to ask of any physical understanding of the mathematical quantum formalism: is the wavefunction, according to this understanding, a complete physical description or not? If you can’t clearly answer this question, then you have not got a clear physical theory.

    It seems to me that Banks is committed to saying that the wavefunction is not complete. It is only such a view that can make sense of the “collapse” of the wavefunction as merely Bayesian conditionalization. That is, if the wavefunction of a system does not completely specify its physical state, then it is possible to find out new information about the physical state even given the wavefunction, and then to update your information about the system. But if the wavefunction is complete, and you already know what it is (e.g. for a pair of electrons in the singlet state), then there just isn’t anything more to know, and changing the representation of the wavefunction by collapse must indicate an actual physical change in the system. But if the wavefunction is not complete, and there is more to the physical state of a pair of particles in a singlet state than can be derived from the singlet state, then as a matter of physics we want to know what these additional physical degrees of freedom are and how they behave.

    Studying the mathematics used to represent the wavefunction simply does not address any of these physical questions. It is not the mathematics per se but the use of the mathematics in the service of physics that is in question. So we might begin here: take a pair of electrons in a singlet state and send one off to Pluto. We know what the wavefunction will be when it gets to Pluto, and that (e.g.) the wavefunction does not attribute a particular spin in any direction to either of the particles. So: do you think this is a complete physical description, and the electron has no definite spin, or not a complete description? Only with a clear answer to such a question can we begin to see how you are using the mathematical formalism to do physics.
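    For definiteness, the state I have in mind is the usual singlet (standard textbook quantum mechanics, nothing peculiar to Banks’s proposal):

    $latex |\psi\rangle = \tfrac{1}{\sqrt{2}}\left(|\uparrow\rangle_1 |\downarrow\rangle_2 - |\downarrow\rangle_1 |\uparrow\rangle_2\right), \qquad \rho_1 = \mathrm{Tr}_2\, |\psi\rangle\langle\psi| = \tfrac{1}{2}\mathbb{1}.$

    Tracing out the partner leaves each electron in the maximally mixed state, which assigns probability 1/2 to spin-up and spin-down along every axis. That is the precise sense in which the wavefunction, taken by itself, attributes no definite spin to the electron that arrives at Pluto.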

  • Moshe

    Tim, instead of directly answering all your questions, let me highlight two points which I think are important to clarify how our perspectives differ.

    On the question of completeness: my personal answer would be that the wavefunction is as complete a description of a system as can possibly exist (assuming that conventional QM is correct). It is not complete in the sense of determining the results of every experiment — such a level of determinism is only possible in classical mechanics. But it does tell you the probability distribution for any potential measurement, which is all you can hope for. So, again, my basic doubt is whether anything you may call a “clear physical theory” or “realism” is necessarily classical.
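    As a minimal illustration of what I mean by that (the state and the measurement axis below are arbitrary choices of mine, just to show that the Born rule fixes a distribution for any measurement you might choose to perform):

# Born-rule probabilities for measuring a spin-1/2 system along an arbitrary axis.
import numpy as np

psi = np.array([1.0, 1.0]) / np.sqrt(2)    # the state |+x>, written in the z-basis
theta, phi = np.pi / 3, np.pi / 5          # an arbitrary measurement axis (polar, azimuthal)

# Eigenvectors of n.sigma for the axis n defined by (theta, phi)
up = np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])
down = np.array([-np.exp(-1j * phi) * np.sin(theta / 2), np.cos(theta / 2)])

p_up = abs(np.vdot(up, psi)) ** 2          # Born rule: |<n,+|psi>|^2
p_down = abs(np.vdot(down, psi)) ** 2
print(p_up, p_down, p_up + p_down)         # the two probabilities sum to 1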

    I think it is also interesting to highlight the differences in attitude towards mathematics. I agree that results of experiments are just one tool to infer some representation of reality which is consistent and detailed enough to be called an “explanation”. But, in my mind, the best formulation of such explanations is in terms of a formal system, a language both general and precise enough to do justice to that term. So formal language is not just a tool, it is the essence of what I’d call an explanation, and natural language translations are usually just a faint shadow of the real thing.

  • Moshe

    As for set theory, one example could be Russell’s paradox, the question regarding the set of all sets that do not contain themselves as an element: does it contain itself as an element? The response to this paradox was to formalize “naive” set theory (naive because informal). In the formalized system the problem is not resolved by the question receiving a new insightful answer; it is resolved because the question is discarded as one that cannot be properly formulated within the system. One potential response to the questions of existence and completeness of states in QM is that they are resolved similarly: they are pseudo-questions that cannot be formulated precisely in the relevant formal system.
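    (Spelled out in symbols, just for concreteness, the offending definition and the contradiction it generates are: $latex R = \{\,x : x \notin x\,\}$, whence $latex R \in R \iff R \notin R$.)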

  • SupremeFunky

    Before an electron is observed, what is it that exists?

  • Michael Kagalenko

    When in #64 Tim Maudlin states
    > But we would like it that the cat ending up alive or dead, which it does, be explicated in
    > terms of what the particles in the cat do, since the cat is nothing more than the sum of its
    > particles.

    And especially:

    > If you say that the individual particles do not end up anywhere in particular, for
    > example, then neither does the cat.

    - That seems to me an attempt to smuggle in classical physics. In QM, an object is more than an aggregation of its parts, and it can be in a definite state when none of its constituent parts is.

    (disclaimer – I am not a practicing physicist, and if I am misunderstanding something, I hope that I will be corrected)

    Also, thanks to Tom Banks for an enlightening post and discussion.

  • Tim Maudlin

    Moshe # 77:

    No, in ZFC there just is no “set of all sets”, and so one can’t get a “set of all sets satisfying Phi” by the comprehension axiom. That is, according to the theory, there just is no object of the sort you are asking about. If one accepts ZFC as the correct understanding of set theory, then there is a clear account of why no such object exists. And seeing why there is no such thing is perfectly insightful.
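    (In symbols: unrestricted comprehension would license $latex \{x : \varphi(x)\}$ for any formula $latex \varphi$; ZFC’s separation axiom only yields $latex \{x \in A : \varphi(x)\}$ for a set $latex A$ one already has, and since there is no universal set to play the role of $latex A$, Russell’s set simply cannot be formed.)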

    You seem to be suggesting that “Is the wavefunction complete?” is similarly a question that employs some suspect terminology. So: which terminology? Certainly, the claim that collapse of the wavefunction is mere conditionalization and not a physical change, as Banks insists, seems to imply that the wavefunction is not complete. Banks certainly thinks you can talk about the mathematical wavefunction, so that’s not the issue. Is the term “complete” obscure? I gave a definition above…any problem with it?

  • Doubter

    Some naive questions -
    1. How does a quantum field theory, with its infinite degrees of freedom, avoid decohering rapidly?

    2. Maxwell’s equations with the classical E and B fields arise when we have a large number of photons. What is the role of decoherence in our measurements of these fields? How do I understand a laser beam from the purely instrumentalist point of view?

  • Pingback: Guest Post: David Wallace on the Physicality of the Quantum State | Cosmic Variance | Theoretical Physics

  • Moshe

    Tim, we now seem to agree on the analogy. The suspect terminology is “real”: what does it mean? Can you define it in ways that do not presume the world to be classical?

    As for completeness, it seems to me that what you call a complete description would assign probabilities (or even truth values) to definite statements about the system, even when it is not measured. This is (unlike naive set theory) perfectly self-consistent; it just doesn’t seem consistent with QM, for the reasons Tom so clearly discusses. So, again, I am wondering if the project of getting a good interpretation is distinct from the project of recasting QM in classical terms.

  • Moshe

    Tim, I appreciate the time you are taking to communicate. If at any point you find this process too time-consuming (the work week is about to start), perhaps you can direct me to some literature. I think by now it might be clear what I am worried about, which is probably not an unusual attitude among physicists, so perhaps there is some recent written resource somewhere. Reading probably has a higher rate of information transfer than blog comments.

  • Doubter

    Does the quantum state of dark matter decohere?

  • steven johnson

    It is no doubt presumptuous to comment, especially at so late a date. Still, I have no idea what the time evolution of a probability distribution could possibly mean. Doesn’t the QM formalism incorporate a Hilbert space, which is not space as we understand it? Nor does the time reversibility of the formalism let us understand what a probability distribution is evolving through. Maybe interpreting QM as referring to a spacetime continuum is in itself adding additional postulates that complete the system?

    I don’t see how Prof. Banks’ blog really addresses the distinction between macroscopic and microscopic. Aren’t phenomena like lasers and superconductivity quantum phenomena? They seem macroscopic to me. Nor do I see how the universe could decohere from its quantum state early in its cosmological history. Decoherence seems to assume some sort of classical objects, or an automatic disentanglement (avoiding the dread word “collapse”). And how does the QM formalism, with its time reversibility, ever find a unique past, something that would presumably be as impossible as the old-fashioned Laplace-style determined future?

    I’m sorry to say that thinking of science as an algorithm for making predictions seems obtuse. I do wonder why Prof. Banks expects taxpayers to pay for physicists to play with expensive equipment if they are merely refining their predictions for digital readouts.

  • jim kowall

    Dear Sean

    I very much enjoyed the recent blogs by Tom Banks on Eternal Inflation and Quantum Theory. Based on my understanding of what he has to say, and the reaction of the other readers, I’ve concluded that he is suggesting a radical departure in the way we understand reality. It seems the way he understands reality is inherent in his theory of Holographic Space Time. As I understand this theory, it is an observer-centric way of understanding the physical world.

    I’d like to try to simplify his ideas and ask for his reaction (maybe in another blog) where he can either further comment on these naïve ideas, or shoot them down as he wishes. I’d like to take the idea of an observer-centric world to its logical conclusion. The controversial aspect of this idea is that the observer is no longer identified with the nature of anything that the observer can observe in that world. For the purpose of scientific hypothesis, let’s just assume the observer exists as pure consciousness, whatever that means.

    The first thing is to define what we mean by an observer-centric world. It seems to me that Tom’s theory of Holographic Space Time answers this question. If we follow the line of reasoning of the equivalence principle, we can say the observer is present at a point of view, and that point of view follows a time-like trajectory in space-time. As is well known, if that trajectory through space-time defines an accelerating frame of reference, that observer always observes an event horizon, which is as far as that observer can see things in space due to the constancy of the speed of light. Everything beyond the event horizon is hidden from the observer. Along the lines of the holographic principle, that event horizon acts like a holographic viewing screen. Due to quantum uncertainty (in the sense of QFT), as virtual particle-antiparticle pairs spontaneously arise in empty space, some of the virtual pairs appear to separate at the horizon, which is the nature of Hawking radiation. In the sense of the holographic principle, separation of virtual pairs at the horizon is equivalent to encoding of information on the horizon. The horizon acts like a holographic viewing screen that projects observable images to the point of view of the observer. We can think of that point of view as a focal point of perception, along the lines of the projection and focusing theorems discussed by Bousso. Each fundamental pixel on the viewing screen encodes a quantized bit of information. I’ll leave it to Tom to explain exactly how those bits of information are encoded on the screen. The important point is the images that are projected from the screen are some form of coherently organized information. In some sense, coherent organization of information is the only reason why those forms are self-replicated in form over a sequence of events. Coherent organization has something to do with why the viewing screen is holographic in nature, and why those forms appear three dimensional. In this observer-centric view of reality, every viewing screen defines a world that is observed from the point of view of an observer. That world only arises because an event horizon arises from the point of view of an observer that is in an accelerating frame of reference. The event horizon acts like a holographic viewing screen that displays an entire world.

    In this observer-centric world, the viewing screen defines a state of information, with one fundamental quantized bit of information encoded per fundamental pixel on the screen. That state of information defines an entire world that only arises from the point of view of the observer of that world. The viewing screen displays an entire world. That world only appears from the point of view of the observer. That state of information is defined by the way bits of information are encoded on the pixels. In the sense of quantum theory, every event is a decision point where the quantum state of that world branches, due to all the different ways in which bits of information can become encoded on all the pixels of the viewing screen. Coherent organization of information allows for the development of observable forms of information, which self-replicate in form over a sequence of events, and are the nature of the observable images projected from the viewing screen to the observer at the central point of view. In the sense of an animation of images, the behaviors of those self-replicating forms are enacted over a sequence of events in the flow of energy. Thermodynamics allows us to understand the nature of that flow of energy (and the flow of time), in the sense that energy tends to flow from more ordered (lower entropy) states to less ordered (higher entropy) states.

    Every observer observes its own world from its own point of view. That world is nothing more than forms animated on a viewing screen. That world only appears three dimensional since those forms are holographic. Those forms tend to self-replicate in form over a sequence of events in the flow of energy as behaviors are enacted due to coherent organization, which is how an animation of forms is animated. The principle of equivalence helps us understand how that animation is animated. As the observer focuses its attention upon that world of form, there is an expenditure of energy. That expenditure of energy places the observer in an accelerating frame of reference, just like a rocket ship that expends energy as it accelerates through empty space. The difficult thing to wrap our minds around is there is no such thing as a rocket ship, except as an observable image that is projected from a viewing screen to a point of view. An event horizon always appears from that accelerated point of view. Every observer that expends energy is in an accelerating frame of reference, and observes an event horizon, which holographically defines the entire world of form that the observer observes. Due to quantum uncertainty, virtual pairs appear to separate at the horizon, and information is encoded on the horizon. Information is coherently organized into form, and the horizon acts as a holographic viewing screen that projects images to that point of view. That is how the observer’s world of form appears from its point of view, as it focuses its attention on those forms. There is an expenditure of energy, which places the observer in an accelerating frame of reference, and leads to the animation of that world. As the observer focuses its attention upon those forms, there is an investment of emotional energy in that animation. That investment of emotional energy animates the form of the observer’s body, which is the central form of its world. The viewing screen displays an entire world, but the central form of a particular body is always displayed on the viewing screen from the central point of view of that particular observer.

    The observer’s world, perceived from its point of view, shares information with other worlds, perceived from other points of view, since those different worlds are entangled. Each world is defined on a viewing screen that defines a state of information, but different states of information can become entangled with each other and share information, and so different forms can appear on each viewing screen in addition to the central form. The observer’s investment of emotional energy animates the form of its own body, which is the central form of its world. The investment of emotional energy from another point of view animates the central form of another world. Other forms can appear in any observer’s world since those different worlds are entangled and share information. The focus of attention of the observer of any particular world from any particular point of view leads to an investment of emotional energy in that world, which animates the central form of a body in that world. Other bodies are animated in that world due to the focus of attention of the other observers, each of which observes their own world from their own point of view. Those different worlds are animated together since those different states of information are entangled together. Collectively, those entangled states of information define consensual reality, which is not a single objective reality, but many different worlds that are each observed from their own point of view, and which only share information with each other.

    From the point of view of any particular observer, the observer’s entire world is displayed on a viewing screen that defines a state of information. That world is animated as energy flows in the sense of thermodynamics. A state of information is defined by the way information is encoded on all the pixels on the screen. In the sense of quantum theory, every event is a decision point where the quantum state branches, since information can become encoded in many different ways. The only reason we have a sense of the flow of time is due to the second law, as states of information tend to become more disordered. That increase in entropy applies to the entire world displayed on the viewing screen. Within that world, a form of information may become more ordered, and may coherently self-replicate its form, as long as the entire world becomes more disordered. A local increase in order of a particular form can only occur at the expense of some other form, which becomes more disordered. This is obviously a problem if every observer wants to survive in the form of its own body. Emotional expressions inherently are self-defensive in nature since they defend the survival (the self-replication) of the form of a particular body observed from a particular point of view. In spite of this problem, there is a natural way for the universe to evolve over time. That natural evolution takes the path of least action, which is the most likely path in the sense of quantum probability, and is the most energy efficient way for the universe to evolve. But from the point of view of any particular observer, the path of least action may not maximize the probability of survival of its own body, and so for selfish reasons, an alternative path may be taken in order to maximize the chances of body survival. Although the quantum state constantly branches, the observer of any world only observes the particular path that is taken in that world, but that path shares information with the path of other entangled worlds.

    After a careful reading of what Tom has to say about holographic cosmology and quantum theory, I just don’t see how it is possible to draw any logical conclusion except for what is described above. If Tom can shoot these ideas down and give an alternative explanation, I’d sure like to see his explanation.

  • TimG

    I don’t get it. If the wave function doesn’t exist, what does? If you say all that exists is the “results of experiments”, who is performing the experiments? And what are they experimenting on?

    Perhaps it’s a prejudice to assume the world is deterministic rather than stochastic. But you seem to be going far beyond suggesting a stochastic universe. What you describe sounds more like “a universe that consists of nothing, and that nothing is governed by laws.”

  • HDR

    Jim Kowall, are you the author of the rambling semi-coherent book Nonduality, a scientific primer at http://www.nonduality.com/hl3742.htm? You sound exactly like him.

  • Pingback: On Determinism | Cosmic Variance | Discover Magazine

  • Pingback: Conscious Resonance » New Theory Explains How Objective Reality Emerges from the Strange Underlying Quantum World

  • Pingback: New Theory Explains How Objective Reality Emerges from the Strange Underlying Quantum World | Квант Физик
