Hauling Out the Quantum Frigidaire: Can Quantum Mechanics Suck the Heat From Computing?

By Veronique Greenwood | June 2, 2011 12:46 pm


What’s the News: Anyone who has had their thighs baked by a laptop knows that computing releases heat. And it’s more than a common-sense maxim: physicists have shown that heat released by information processing is bound by a physical law, where a bit of information processed must cause a corresponding rise in temperature. But could quantum mechanics allow computations that actually cool computers down? In a recent Nature paper, researchers describe how this paradox is possible.

How the Heck:

  • In this paper, the team describes how, using the quantum mechanical property of entanglement, an observer can actually drain heat from a system while deleting information.
  • How, you say? It all comes down to a question of entropy. The second law of thermodynamics states that the entropy of a system is always increasing or remaining the same, but never decreasing. And the way we usually experience it, entropy is heat. Landauer’s Principle, which arises from the second law, links heat and information processing: any irreversible computation, the principle says, is going to add entropy to the universe.
  • But if a computation is reversible—if a 1 or 0 is deleted while its state is recorded somewhere, making it possible to recreate it—then no entropy need be released at all. Physicists have confirmed this mathematically in the past.
  • Now researchers have shown that in quantum mechanics, entropy can be seen as a lack of knowledge on the part of the observer (that is, the experimenter) about the state of the 1s and 0s. In a quantum quirk, when an observer is entangled with bits of information, he has a great deal of detailed knowledge about those bits—so much, in fact, that the conditional entropy of the system becomes negative. Thus, when the observer makes a reversible deletion, he is actually siphoning heat off from the system. Voilà: a computation that cools.
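The entropy bookkeeping above can be sketched numerically. What follows is a minimal NumPy illustration (not the paper's actual formalism): for a maximally entangled Bell pair, the conditional entropy S(A|B) = S(AB) − S(B) comes out to −1 bit, and Landauer's bound k·T·ln 2 gives the minimum heat released per bit erased at room temperature.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Entropy in bits: S(rho) = -Tr(rho log2 rho)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerically-zero eigenvalues
    return float(-np.sum(evals * np.log2(evals)))

# Bell state (|00> + |11>)/sqrt(2): observer B is fully entangled with bit A
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(phi, phi)

# Reduced state of B: reshape to indices (a, b, a', b') and trace out A
rho_b = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=0, axis2=2)

S_ab = von_neumann_entropy(rho_ab)  # 0 bits: the joint state is pure
S_b = von_neumann_entropy(rho_b)    # 1 bit: B alone looks maximally mixed
S_a_given_b = S_ab - S_b            # -1 bit: negative conditional entropy

# Landauer's bound: minimum heat per bit erased at T = 300 K
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0
landauer_bound = k_B * T * np.log(2)  # ~2.87e-21 joules per bit

print(S_a_given_b, landauer_bound)
```

Negative S(A|B) is exactly the "more than complete knowledge" the researchers invoke: conditioned on the entangled observer, erasing A can in principle *withdraw* heat rather than release it.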

What’s the Context:

  • The idea of reversible computing and entropy has been hot stuff in the last half-century—it sits at the intersection of thermodynamics, the physics of heat, and information theory, the study of information. The senior author of this paper wrote an article on it way back in 1985 in Scientific American.
  • This result is likely to cause some heated discussion among physicists, as it comes very close to breaking the second law of thermodynamics. As the lead researcher warns in a blog post at Scientific American, “For many physicists, this is tantamount to saying that perpetual motion is possible!” (And we all know how they feel about that.)
  • The team argues, though, that because of the strange effects of entanglement, the second law is preserved.

The Future Holds: The team has described a theoretical situation, and now someone needs to step up to the plate and test it experimentally. Should it work out in the real world, the researchers think it could eventually be used to cool down supercomputers.

Reference: Lídia del Rio, Johan Åberg, Renato Renner, Oscar Dahlsten, Vlatko Vedral. The thermodynamic meaning of negative entropy. Nature, 2011; 474 (7349): 61 DOI: 10.1038/nature10123

Image credit: drewgstephens/flickr

CATEGORIZED UNDER: Physics & Math, Technology
  • Torbjörn Larsson, OM

    I doubt this after reading the arxiv paper. The trick is to run the usual erasure in reverse (fig 3 vs fig 2). But the sleight of hand is to claim that the preparation of the pure state in fig 3 is different from the one in fig 2, which is precisely the one step that can’t be done in isolation.

    How can the preparation be different? The authors seem to suppose they can prepare states losslessly. (In effect, have an infinite supply of them.) But state preparation, and how to erase it, is what the whole process is about in the first place.

  • Andy Galinsky

    Cooling could be accomplished by computing – calculating the entry of energy into a substance out of phase to cool on a molecular level.

    It will take energy to cool the target molecule by molecule – still a net transfer of energy. This just brings up more questions of net cooling and where out of phase energy actually goes…just like introducing out of phase light into a room to make it dark…

    This article, however, actually sucked the intelligence out of me – thus leaving me dumber than before I read it.

  • nate

    “where a bit of information processed must cause a corresponding rise in temperature” -This is true, but it would be more accurate to say that ‘a bit of information ERASED must cause a corresponding rise in temperature”. Landauer showed that this was the only irreversible act in the process of computing, so it’s the only action that requires the generation of heat.

  • azbearhuntr

    My thighs are burning without a laptop, is that a problem?




80beats is DISCOVER's news aggregator, weaving together the choicest tidbits from the best articles covering the day's most compelling topics.
