Sunday, December 12, 2010

Uncertainty Principle Determines Nonlocality

This article was written by Florin Moldoveanu on Nov. 26, 2010 at http://www.fqxi.org/community/forum/topic/786.

In 1935, Einstein, Podolsky, and Rosen published their seminal paper aimed at proving that quantum mechanics is incomplete. Quantum mechanics prohibits the simultaneous measurement of certain properties, for example position and momentum. But is this due to the inherent clumsiness of us macroscopic objects? Or is this uncertainty an inherent property of nature?

Einstein, Podolsky, and Rosen thought that at its core nature is deterministic, and to prove it they devised a clever thought experiment: take a quantum system that splits into two parts. For example, let an unstable particle initially at rest split into two identical particles flying apart in opposite directions. Then measure the position of the left particle and the momentum of the right particle. Because of momentum conservation, the left particle's momentum must be equal and opposite to the right particle's, so by measuring position and momentum (each on a different particle) with arbitrary precision we would, in effect, know both for a single particle. This would violate the uncertainty principle, making quantum mechanics an incomplete theory. On the other hand, taking the point of view that quantum mechanics is indeed a complete theory would imply an unpalatable "spooky action at a distance," where correlations between two spatially separated systems occur in a way incompatible with any local classical description, as shown by John Bell.

It turns out, however, that this "spooky action at a distance," or nonlocality, is really how nature behaves, and this was experimentally settled by the Aspect experiment. But there is more: Einstein, Podolsky, and Rosen had wanted to prove the uncertainty principle wrong because of "unphysical" nonlocality, but in a recent Science paper Jonathan Oppenheim and Stephanie Wehner showed, in a bit of an ironic twist, that the "spooky action at a distance" is determined in part by the uncertainty principle.

Now, why is this important? Because as strange as quantum mechanics is, and as many counter-intuitive phenomena as it predicts, it is not as strange as the no-signaling condition of relativity would allow. In other words, there could be stronger correlations than those predicted by quantum mechanics between the two measurements, while still obeying the condition that whatever I do "over here" does not send any signal "over there." One example of this is the so-called Popescu-Rohrlich (PR) box, a hypothetical unphysical device able to achieve the maximum correlations between two spatially separated systems. So why is a PR box not allowed by nature? A physical implementation of a PR box would be a hacker's dream come true, because it would allow unrestricted eavesdropping on over-correlated data.
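To get a feel for what a PR box would do, here is a toy sketch (my own illustration, not anything from the paper): on binary inputs s and t it returns bits a and b that are locally random, yet always satisfy a XOR b = s AND t. Such a device wins the XOR game discussed further below with certainty, while each party's output, viewed on its own, carries no information about the other party's input.

```python
import random

def pr_box(s, t):
    """Toy Popescu-Rohrlich box: outputs are locally random bits,
    but always satisfy a XOR b = s AND t (maximal non-signaling correlation)."""
    a = random.randint(0, 1)      # Alice's output: uniformly random
    b = a ^ (s & t)               # Bob's output: arranged so that a ^ b == s*t
    return a, b

# Check the winning condition and the no-signaling property numerically.
trials = 100_000
wins = 0
alice_counts = {0: [0, 0], 1: [0, 0]}   # counts of Alice's output for each of Bob's inputs t
for _ in range(trials):
    s, t = random.randint(0, 1), random.randint(0, 1)
    a, b = pr_box(s, t)
    wins += (a ^ b) == (s & t)
    alice_counts[t][a] += 1

print("winning fraction:", wins / trials)                              # always 1.0
print("P(a=0 | t=0) ~", alice_counts[0][0] / sum(alice_counts[0]))     # ~0.5
print("P(a=0 | t=1) ~", alice_counts[1][0] / sum(alice_counts[1]))     # ~0.5
# Alice's marginal statistics do not depend on Bob's input: no signal is sent.
```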

But how can one reason meaningfully about unphysical situations? To what degree do we have to create "simulated realities," and how confident can we be that the conclusions reached are not just the author's fantasies? Fortunately there is a clear answer: discuss quantum mechanics and hypothetical theories in the language of information and game theory. This approach provides a platform that is guaranteed to reduce to quantum and classical mechanics in the appropriate limits, and that also generates meaningful conclusions about all other possible physical theories. Of course, when additional requirements of standard axiomatizations of quantum mechanics are imposed (like projective geometries of Jordan algebras, for example), this continuum of potential theories reduces to a handful of discrete possible cases. But casting quantum mechanics in the new framework can add new insights and clarifications for old puzzles.

So how do Oppenheim and Wehner go about proving that the uncertainty principle determines nonlocality? In a nutshell, it goes like this. First, the uncertainty principle is expressed as a Shannon entropy inequality and then as a Deutsch min-entropy inequality. Then the typical Alice-Bob pair is set to play an "XOR retrieval game," and it is shown that any violation of Tsirelson's bound implies a violation of the min-entropic uncertainty relations (for details, please see Oppenheim and Wehner's paper, arXiv:1004.2507v1). Now let's dig in and explain all this.
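Before digging in, here is what "the uncertainty principle as an entropy inequality" looks like concretely. The sketch below (my illustration, using the well-known Maassen-Uffink Shannon-entropy form for the mutually unbiased X and Z measurements on a qubit, not the paper's own min-entropic relation) checks numerically that the two outcome entropies can never both be small: their sum stays at or above 1 bit.

```python
# Minimal numerical check of a Shannon entropic uncertainty relation for a qubit:
# H(X) + H(Z) >= -2*log2(c) with c = 1/sqrt(2) for the X and Z bases, i.e. >= 1 bit.
import numpy as np

def shannon(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

plus = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)

lowest = np.inf
rng = np.random.default_rng(0)
for _ in range(10_000):
    # random pure qubit state |psi> = cos(th/2)|0> + e^{i phi} sin(th/2)|1>
    th, phi = rng.uniform(0, np.pi), rng.uniform(0, 2 * np.pi)
    psi = np.array([np.cos(th / 2), np.exp(1j * phi) * np.sin(th / 2)])
    p_z = np.abs(psi) ** 2                                          # Z-basis probabilities
    p_x = np.array([abs(np.vdot(plus, psi)) ** 2,
                    abs(np.vdot(minus, psi)) ** 2])                 # X-basis probabilities
    lowest = min(lowest, shannon(p_z) + shannon(p_x))

print("smallest H(Z) + H(X) over sampled states:", round(lowest, 3), "(bound: 1 bit)")
```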

John Bell proved that any local deterministic theory must obey what are now called Bell inequalities. Quantum mechanics violates those inequalities, and Tsirelson asked something more: what is the maximal possible amount of violation? Any violation of Bell's inequality beyond the Tsirelson bound, up to the maximal PR-box correlation, would represent an even spookier theory of nature.

In an XOR game, Alice and Bob receive questions "s" and "t" respectively, and they produce answers "a" and "b" (a and b are zero or one). Winning the game is determined by the XOR of the answers, a + b mod 2 (in the CHSH version of the game, the players win when this XOR equals the product of s and t). The players are allowed to choose any strategy they want to maximize the chance of winning, but they cannot communicate with one another during the actual game. Classically, the best chance of winning is 3/4, but using quantum mechanics the odds can be increased up to 1/2 + 1/(2sqrt(2)). The way to do it is for Alice to perform a measurement in a preferred "eigenstate basis," which "steers" the state of Bob to a maximally certain state. Discovered by Schrödinger, "steerability" allows Alice to influence the outcome of Bob's experiments in a non-trivial way while still transmitting no information.
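The numbers 3/4 and 1/2 + 1/(2sqrt(2)) can be checked directly. The sketch below (an illustration, not the authors' code) brute-forces every deterministic classical strategy for the CHSH-type XOR game with winning condition a XOR b = s AND t, and evaluates the standard quantum strategy on a maximally entangled pair, for which the correlator of measurements along angles in the X-Z plane is the cosine of the angle difference.

```python
import itertools
import numpy as np

# XOR (CHSH) game: Alice gets s, Bob gets t (uniform bits); they answer a, b
# and win when a XOR b = s AND t.

# --- Classical value: brute-force every deterministic strategy a(s), b(t). ---
best_classical = 0.0
for a in itertools.product([0, 1], repeat=2):        # a[s] is Alice's answer to question s
    for b in itertools.product([0, 1], repeat=2):    # b[t] is Bob's answer to question t
        wins = sum((a[s] ^ b[t]) == (s & t) for s in (0, 1) for t in (0, 1))
        best_classical = max(best_classical, wins / 4)

# --- Quantum value: shared state (|00> + |11>)/sqrt(2); both parties measure spin
# along angles in the X-Z plane, for which <A_s B_t> = cos(alpha_s - beta_t). ---
alice = [0.0, np.pi / 2]            # Alice's measurement angles for s = 0, 1
bob = [np.pi / 4, -np.pi / 4]       # Bob's measurement angles for t = 0, 1
p_quantum = 0.0
for s in (0, 1):
    for t in (0, 1):
        corr = np.cos(alice[s] - bob[t])             # <A_s B_t>
        # P(a XOR b = 0) = (1 + corr)/2 ; P(a XOR b = 1) = (1 - corr)/2
        p_win = (1 + corr) / 2 if (s & t) == 0 else (1 - corr) / 2
        p_quantum += p_win / 4

print("best classical winning probability:", best_classical)        # 0.75
print("quantum winning probability       :", round(p_quantum, 4))   # ~0.8536
print("1/2 + 1/(2*sqrt(2))               :", round(0.5 + 0.5 / np.sqrt(2), 4))
```

A PR box, as in the earlier sketch, would push the winning probability all the way to 1; the gap between 0.8536 and 1 is exactly what Tsirelson's bound forbids.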

Ultimately, the best chance of winning an XOR game is given by the interplay of steerability and uncertainty relations. Intuitively, the larger the uncertainty, the worse the odds. This seems to fly in the face of common quantum mechanical intuition, because the best quantum outcome, achieving the Tsirelson bound, occurs precisely when incompatible measurements are chosen; but that is only an artifact of the power of steerability. Quantum mechanics already achieves maximum steerability, and going above the Tsirelson bound would require LESS uncertainty (a PR box would have perfect steerability and no uncertainty). Classically, any hidden variable theory would have no uncertainty, but its steerability would be limited to the trivial case.

At this point the following problem presents itself. Quaternionic quantum mechanics can go over the Tsirelson bound and this representation does not have less uncertainty than standard quantum mechanics over the complex numbers because its physical predictions are the same. Since steerability is already at maximum in complex quantum mechanics, this means that either the Tsirelson bound is dependent on the particular number system representation of quantum mechanics, or there is another parameter at play determining nonlocality. The question is open at this time.

Oppenheim and Wehner suggest another open problem: clarify the relationship between uncertainty and complementarity, because one can imagine theories with fewer degrees of complementarity than quantum mechanics but with the same degree of nonlocality and uncertainty. Here complementarity measures the degree to which one measurement disturbs the next measurement.

Reasoning about hypothetical theories in an information theory and game approach opens the door to counterexamples and new fruitful insights which could demystify the spooky part of quantum mechanics. Two thousand years ago, people held contests of large-number multiplication using abacuses. To them, the fact that someday elementary school children would perform those multiplications with ease, and faster than they could ever do it, would have looked spooky. All this was possible because of the transition from Roman to Arabic numerals. Maybe in the distant future the only spooky thing about quantum mechanics will be how spooky it looked to us with our "primitive" tools like Hilbert spaces and non-commutative observables. So let's loosen up, play some games, and achieve a better intuition about nature:

“- Billy, your computer game time is up. Go upstairs and do your homework!”

“- But Mom, I AM doing my homework. I am developing a quantum mechanics intuition right now!”

Gathered by: Sh.Barzanjeh(shabirbarzanjeh@gmail.com)

No Evidence of Time before Big Bang

By Edwin Cartlidge | December 10, 2010 | Nature

Our view of the early Universe may be full of mysterious circles -- and even triangles -- but that doesn't mean we're seeing evidence of events that took place before the Big Bang. So says a trio of papers taking aim at a recent claim that concentric rings of uniform temperature within the cosmic microwave background--the radiation left over from the Big Bang--might, in fact, be the signatures of black holes colliding in a previous cosmic 'aeon' that existed before our Universe.

The provocative idea was posited by Vahe Gurzadyan of Yerevan Physics Institute in Armenia and celebrated theoretical physicist Roger Penrose of the University of Oxford, UK. In a recent paper, posted on the arXiv preprint server, Gurzadyan and Penrose argue that collisions between supermassive black holes from before the Big Bang would generate spherically propagating gravitational waves that would, in turn, leave characteristic circles within the cosmic microwave background.

To verify this claim, Gurzadyan examined seven years' worth of data from NASA's Wilkinson Microwave Anisotropy Probe (WMAP) satellite, calculating the change in temperature variance within progressively larger rings around more than 10,000 points in the microwave sky. And indeed, he identified a number of rings within the WMAP data that had a temperature variance that was markedly lower than that of the surrounding sky.
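To make the statistic concrete, here is a toy flat-sky version of the kind of analysis described above (an illustration only, on a mock noise map rather than the real WMAP sky or Gurzadyan's actual pipeline): pick a centre and compute the temperature variance of the pixels lying in annuli of increasing radius.

```python
# Toy ring-variance statistic on a mock "temperature" map (illustration only).
import numpy as np

rng = np.random.default_rng(1)
n = 256
sky = rng.normal(size=(n, n))                 # stand-in temperature fluctuations (mock data)

cy, cx = n // 2, n // 2                       # centre of the concentric rings
y, x = np.indices(sky.shape)
r = np.hypot(y - cy, x - cx)                  # radial distance of every pixel from the centre

ring_width = 2.0
for radius in np.arange(5, 60, 5):
    in_ring = (r >= radius) & (r < radius + ring_width)
    variance = sky[in_ring].var()
    print(f"radius {radius:4.0f}: variance in ring = {variance:.3f}")

# A "low-variance circle" would show up as a ring whose variance sits well below that
# of the surrounding rings; judging its significance requires comparing against
# simulations with the right statistical properties, which is exactly the point in dispute.
```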

Cosmic cycle

Most cosmologists believe that the Universe, and with it space and time, exploded into being some 13.7 billion years ago at the Big Bang, and that it has been expanding ever since. A crucial component of the standard cosmological model--needed to explain why the Universe is so uniform--is the idea that a fraction of a second after the Big Bang, the Universe underwent a brief period of extremely rapid expansion known as inflation.

Penrose, however, thinks that the Universe's great uniformity instead originates from before the Big Bang, from the tail end of a previous aeon that saw the Universe expand to become infinitely large and very smooth. That aeon in turn was born in a Big Bang that emerged from the end of a still earlier aeon, and so on, creating a potentially infinite cycle with no beginning and no end.

Now Gurzadyan and Penrose's idea is being challenged by three independent studies, all posted on the arXiv server within the past few days, by Ingunn Wehus and Hans Kristian Eriksen of the University of Oslo; Adam Moss, Douglas Scott and James Zibin of the University of British Columbia in Vancouver, Canada; and Amir Hajian of the Canadian Institute for Theoretical Astrophysics in Toronto, Ontario.

All three groups reproduced Gurzadyan's analysis of the WMAP data and all agree that the data do contain low-variance circles. Where they part company with the earlier work is in the significance that they attribute to these circles.

Circles of significance

To gauge this significance, Gurzadyan compared the observed circles with a simulation of the cosmic microwave background in which the temperature fluctuations were completely scale invariant, meaning that their abundance was independent of their size. In such a simulated sky, he found, patterns like the observed circles ought not to appear. But the groups critical of his work say that this is not what the cosmic microwave background is like.

They point out that the WMAP data clearly show that there are far more hot and cold spots at smaller angular scales, and that it is therefore wrong to assume that the microwave sky is scale invariant. All three groups searched for circular variance patterns in simulations of the cosmic microwave background that assume the basic properties of the inflationary Universe, and all found circles that are very similar to the ones in the WMAP data.

Moss and his colleagues even carried out a slight variation of the exercise and found that both the observational data and the inflationary simulations also contain concentric regions of low variance in the shape of equilateral triangles. "The result obtained by Gurzadyan and Penrose does not in any way provide evidence for Penrose's cyclical model of the Universe over standard inflation," says Zibin.

Gurzadyan dismisses the critical analyses as "absolutely trivial", arguing that there is bound to be agreement between the standard cosmological model and the WMAP data "at some confidence level" but that a different model, such as Penrose's, might fit the data "even better"--a point he makes in a response to the three critical papers also posted on arXiv. However, he is not prepared to state that the circles constitute evidence of Penrose's model. "We have found some signatures that carry properties predicted by the model," he says.

Gathered by: Sh.Barzanjeh(shabirbarzanjeh@gmail.com)

Tuesday, December 7, 2010

What Do You Think?! Decoherence & Quantum Darwinism


Quantum Darwinism is a theory explaining the emergence of the classical world from the quantum world as due to a process of Darwinian natural selection, in which the many possible quantum states are selected against in favor of a stable pointer state. It was proposed by Wojciech Zurek and a group of collaborators including Ollivier, Poulin, Paz, and Blume-Kohout. The development of the theory is due to the integration of a number of Zurek's research topics, pursued over the course of twenty-five years, including pointer states, einselection, and decoherence.

A study in 2010 provided preliminary supporting evidence for quantum Darwinism, with scars of a quantum dot "becoming a family of mother-daughter states," indicating that they could "stabilize into multiple pointer states."

Along with Zurek’s related theory of envariance, quantum Darwinism explains how the classical world emerges from the quantum world and proposes an answer to the quantum measurement problem, the main interpretational challenge for quantum theory. The measurement problem arises because the quantum state vector, the source of all knowledge concerning quantum systems, evolves according to the Schrödinger equation into a linear superposition of different states, predicting paradoxical situations such as “Schrödinger's cat”, situations never experienced in our classical world. Quantum theory has traditionally treated this problem as resolved by a non-unitary transformation of the state vector at the time of measurement into a definite state. The theory provides an extremely accurate means of predicting which definite state will be measured, in the form of a probability for each possible measurement value. The physical nature of the transition from the quantum superposition of states to the definite classical state measured is not explained by the traditional theory; it is usually assumed as an axiom, and it was at the basis of the debate between Bohr and Einstein concerning the completeness of quantum theory.

Quantum Darwinism explains the transition of quantum systems from the vast potentiality of superposed states to the greatly reduced set of pointer states[2] as a selection process, einselection, imposed on the quantum system through its continuous interactions with the environment. All quantum interactions, including measurements, but much more typically interactions with the environment (such as with the sea of photons in which all quantum systems are immersed), lead to decoherence: the manifestation of the quantum system in a particular basis dictated by the nature of the interaction in which the quantum system is involved. In the case of interactions with its environment, Zurek and his collaborators have shown that the preferred basis into which a quantum system decoheres is the pointer basis underlying predictable classical states. It is in this sense that the pointer states of classical reality are selected from quantum reality and exist in the macroscopic realm in a state able to undergo further evolution.
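A minimal numerical picture of decoherence into a pointer basis (a sketch, not Zurek's full model): repeated phase damping leaves the populations of a qubit untouched while its off-diagonal coherences decay, turning a superposition into a classical-looking mixture in the Z (pointer) basis.

```python
# Phase-damping channel: Z-basis populations are preserved while the off-diagonal
# coherences of the density matrix are exponentially suppressed.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())            # start in the superposition |+><+|

p = 0.2                                      # dephasing strength per interaction
P0 = np.diag([1.0, 0.0])                     # projectors onto the pointer (Z) basis;
P1 = np.diag([0.0, 1.0])                     # Kraus operators: sqrt(1-p) I, sqrt(p) P0, sqrt(p) P1

for step in range(1, 11):
    rho = (1 - p) * rho + p * (P0 @ rho @ P0 + P1 @ rho @ P1)
    print(f"step {step:2d}: populations {np.real(np.diag(rho)).round(3)}, "
          f"coherence {abs(rho[0, 1]):.4f}")

# The coherence shrinks by a factor (1 - p) at every step; the surviving diagonal
# entries in the Z basis play the role of the einselected pointer states.
```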

As a quantum system’s interactions with its environment result in the recording of many redundant copies of information regarding its pointer states, this information is available to numerous observers, who can reach consensual agreement about the state of the quantum system. This aspect of einselection, called by Zurek ‘Environment as a Witness’, results in the potential for objective knowledge. A toy numerical sketch of this idea follows.
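The sketch below is my own illustration, under the simplifying assumption that each environment qubit copies the system's pointer bit through a CNOT-like interaction: afterwards the system's reduced state is diagonal in the pointer basis, and every environment fragment, read separately, carries the same record of that pointer observable.

```python
import numpy as np

# A system qubit in |+> imprints its Z (pointer) value onto N environment qubits,
# producing (|0>|0...0> + |1>|1...1>)/sqrt(2). The system decoheres in the pointer
# basis and each environment qubit holds a perfectly correlated record of it.
N = 4                                                # number of environment qubits
zeros = np.zeros(2 ** N); zeros[0] = 1.0             # |0...0>
ones = np.zeros(2 ** N); ones[-1] = 1.0              # |1...1>
ket0 = np.array([1.0, 0.0]); ket1 = np.array([0.0, 1.0])
state = (np.kron(ket0, zeros) + np.kron(ket1, ones)) / np.sqrt(2)

def reduced(state, keep, n_qubits):
    """Reduced density matrix of one qubit (index `keep`) from an n-qubit pure state."""
    psi = state.reshape([2] * n_qubits)
    psi = np.moveaxis(psi, keep, 0).reshape(2, -1)   # kept qubit first, rest flattened
    return psi @ psi.conj().T

n = N + 1
print("system reduced state (decohered, diagonal):\n", reduced(state, 0, n).round(3))
print("environment qubit 1 reduced state:\n", reduced(state, 1, n).round(3))

# Redundancy check: the probability that the system and environment qubit 1 give the
# same Z outcome is 1, so many observers can independently read off the pointer state.
probs = np.abs(state) ** 2
same = sum(p for idx, p in enumerate(probs)
           if ((idx >> (n - 1)) & 1) == ((idx >> (n - 2)) & 1))
print("P(system and env qubit 1 agree in Z):", round(same, 3))
```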

Gathered by: Sh.Barzanjeh(shabirbarzanjeh@gmail.com)

Optomechanics: A New Concept in Quantum Optics


The concept that electromagnetic radiation can exert forces on material objects was predicted by Maxwell, and the radiation pressure of light was first observed experimentally more than a century ago. The force F exerted by a beam of power P retroreflecting from a mirror is F=2P/c. Because the speed of light is so large, this force is typically extremely feeble but does manifest itself in special circumstances (e.g., in the tails of comets and during star formation). Beginning in the 1970s, researchers were able to trap and manipulate small particles and even individual atoms with optical forces.
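To get a feel for how feeble this force is, here is a back-of-the-envelope sketch of F = 2P/c for a few beam powers (simple illustrative arithmetic).

```python
# Radiation-pressure force F = 2P/c on a perfectly retroreflecting mirror.
c = 2.998e8                      # speed of light, m/s

for power_W in (1e-3, 1.0, 100.0):
    force_N = 2 * power_W / c
    print(f"P = {power_W:7.3f} W  ->  F = {force_N:.2e} N")

# Even 100 W of retroreflected light pushes with less than a micronewton,
# which is why radiation pressure is so hard to notice in everyday circumstances.
```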

Recently there has been a great surge of interest in the application of radiation forces to manipulate the center-of-mass motion of mechanical oscillators covering a huge range of scales from macroscopic mirrors in the Laser Interferometer Gravitational Wave Observatory (LIGO) project to nano- or micromechanical cantilevers, vibrating microtoroids, and membranes. Positive radiation pressure damping permits cooling of the motion; negative damping permits parametric amplification of small forces. Cooling a mechanical system to its quantum ground state is a key goal of the new field of optomechanics. Radiation pressure also appears in the form of unavoidable random backaction forces accompanying optical measurements of position as the precision of those measurements approaches the limits set by quantum mechanics [18, 19]. The randomness is due to the photon shot noise, the observation of which is a second key goal of the field.
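As a rough sense of the shot noise mentioned above (a simple Poissonian estimate of my own, not a standard-quantum-limit calculation): the photon flux of even a modest laser beam is enormous, so the relative shot-noise fluctuation over a one-second measurement is tiny, which is part of why observing these quantum backaction effects is so demanding.

```python
# Order-of-magnitude estimate of photon shot noise for a laser beam.
import math

h = 6.626e-34                    # Planck constant, J*s
c = 2.998e8                      # speed of light, m/s
wavelength = 1064e-9             # a common laser wavelength, m (illustrative choice)
power = 1e-3                     # 1 mW of optical power
t_meas = 1.0                     # measurement time, s

photon_energy = h * c / wavelength
n_photons = power * t_meas / photon_energy        # mean number of photons detected
rel_fluctuation = 1 / math.sqrt(n_photons)        # Poissonian: delta N / N = 1/sqrt(N)

print(f"photons in {t_meas} s : {n_photons:.2e}")
print(f"relative shot noise   : {rel_fluctuation:.1e}")
```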

In pioneering work, Braginsky and collaborators first detected mechanical damping due to radiation in the decay of an excited oscillator. Very recently, both measurement and mechanical damping of (the much smaller) random thermal Brownian motion (i.e., cooling of the center-of-mass motion) was achieved by several groups using different techniques.

Gathered by: Sh.Barzanjeh(shabirbarzanjeh@gmail.com)

Really Great Books in Quantum Optics!

* L. Mandel and E. Wolf, Optical Coherence and Quantum Optics (Cambridge 1995)
* D. F. Walls and G. J. Milburn, Quantum Optics (Springer 1994)
* C. W. Gardiner and P. Zoller, Quantum Noise (Springer 2004)
* M. O. Scully and M. S. Zubairy, Quantum Optics (Cambridge 1997)
* W. P. Schleich, Quantum Optics in Phase Space (Wiley 2001)
* W. Vogel and D. Welsch, Lectures on Quantum Optics
* C. C. Gerry and P. L. Knight, Introductory Quantum Optics

What Is Quantum Optics?

History of quantum optics
Light is made up of particles called photons and hence is inherently "grainy" (quantized). Quantum optics is the study of the nature and effects of light as quantized photons. The first indication that light might be quantized came from Max Planck in 1899, when he correctly modeled blackbody radiation by assuming that energy is exchanged between light and matter only in discrete amounts. Bohr later showed that atoms are also quantized, in the sense that they can only emit discrete amounts of energy. The understanding of the interaction between light and matter following these developments not only formed the basis of quantum optics but was also crucial for the development of quantum mechanics as a whole. However, the subfields of quantum mechanics dealing with matter-light interaction were principally regarded as research into matter rather than into light; hence one rather spoke of atomic physics and quantum electronics in the 1960s. Laser science, i.e., research into the principles, design, and application of these devices, became an important field, and the quantum mechanics underlying the laser's principles was then studied with more emphasis on the properties of light, and the name quantum optics became customary.

As laser science needed good theoretical foundations, and because research into these foundations soon proved very fruitful, interest in quantum optics rose. Following the work of Dirac in quantum field theory, George Sudarshan, Roy J. Glauber, and Leonard Mandel applied quantum theory to the electromagnetic field in the 1950s and 1960s to gain a more detailed understanding of photodetection and the statistics of light (see degree of coherence). This led to the introduction of the coherent state as a quantum description of laser light and the realization that some states of light could not be described with classical waves. In 1977, Kimble et al. demonstrated the first source of light which required a quantum description: a single atom that emitted one photon at a time. This was the first conclusive evidence that light was made up of photons. Another quantum state of light with certain advantages over any classical state, squeezed light, was soon proposed. At the same time, the development of short and ultrashort laser pulses, created by Q-switching and mode-locking techniques, opened the way to the study of extremely fast ("ultrafast") processes. Applications in solid-state research (e.g. Raman spectroscopy) were found, and the mechanical forces of light on matter were studied. The latter led to the levitation and positioning of clouds of atoms, or even small biological samples, in an optical trap or optical tweezers by a laser beam. This, along with Doppler cooling, was the crucial technology needed to achieve the celebrated Bose-Einstein condensation.
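The coherent state mentioned above has a simple photon-number signature that can be written down directly (a small sketch under textbook assumptions): its photon-number distribution is Poissonian, P(n) = exp(-|alpha|^2) |alpha|^(2n) / n!, with mean and variance both equal to |alpha|^2, whereas nonclassical states such as squeezed or single-photon light deviate from this.

```python
# Photon-number statistics of a coherent state |alpha>: a Poisson distribution
# with mean and variance both equal to |alpha|^2 (quick numerical check).
import math

alpha_sq = 4.0                    # |alpha|^2, the mean photon number
p = [math.exp(-alpha_sq) * alpha_sq ** n / math.factorial(n) for n in range(60)]

mean = sum(n * pn for n, pn in enumerate(p))
var = sum((n - mean) ** 2 * pn for n, pn in enumerate(p))
print(f"sum of P(n) = {sum(p):.6f}")
print(f"mean photon number = {mean:.4f}, variance = {var:.4f}")   # both ~ 4.0

# Mandel Q = (variance - mean)/mean is 0 for a coherent state; Q < 0 signals
# nonclassical, sub-Poissonian light such as the single-photon source of Kimble et al.
```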

Other remarkable results are the demonstration of quantum entanglement, quantum teleportation, and quantum logic gates (first demonstrated in 1995). The latter are of much interest in quantum information theory, a subject which partly emerged from quantum optics and partly from theoretical computer science.

Today's fields of interest among quantum optics researchers include parametric down-conversion, parametric oscillation, even shorter (attosecond) light pulses, use of quantum optics for quantum information, manipulation of single atoms, Bose-Einstein condensates, their application, and how to manipulate them (a sub-field often called atom optics), coherent perfect absorbers, and much more.

Research into quantum optics that aims to bring photons into use for information transfer and computation is now often called photonics, to emphasize the claim that photons and photonics will take over the role that electrons and electronics now have.