Monday, November 28, 2011

Coherent Schrödinger's cat still confounds



The famous paradox of Schrödinger's cat starts from principles of quantum physics and ends with the bizarre conclusion that a cat can be simultaneously in two physical states – one in which the cat is alive and the other in which it is dead. In real life, however, large objects such as cats clearly don't exist in a superposition of two or more states and this paradox is usually resolved in terms of quantum decoherence. But now physicists in Canada and Switzerland argue that even if decoherence could be prevented, the difficulty of making perfect measurements would stop us from confirming the cat's superposition.

Erwin Schrödinger, one of the fathers of quantum theory, formulated his paradox in 1935 to highlight the apparent absurdity of the quantum principle of superposition – that an unobserved quantum object is simultaneously in multiple states. He envisaged a black box containing a radioactive nucleus, a Geiger counter, a vial of poison gas and a cat. The Geiger counter is primed to release the poison gas, killing the cat, if it detects any radiation from a nuclear decay. The grisly game is played out according to the rules of quantum mechanics because nuclear decay is a quantum process.

If the apparatus is left for a period of time and then observed, you may find either that the nucleus has decayed or that it has not decayed, and therefore that the poison has or has not been released, and that the cat has or has not been killed. However, quantum mechanics tells us that, before the observation has been made, the system is in a superposition of both states – the nucleus has both decayed and not decayed, the poison has both been released and not been released, and the cat is both alive and dead.
Mixing micro and macro

Schrödinger's cat is an example of "micro-macro entanglement", whereby quantum mechanics allows (in principle) a microscopic object such as an atomic nucleus and a macroscopic object such as a cat to have a much closer relationship than permitted by classical physics. However, it is clear to any observer that microscopic objects obey quantum physics, while macroscopic things obey the classical physics rules that we experience in our everyday lives. But if the two are entangled, they cannot each be governed by a different set of physical rules.

The most common way to avoid this problem is to appeal to quantum decoherence, whereby multiple interactions between an object and its surroundings destroy the coherence of superposition and entanglement. The result is that the object appears to obey classical physics, even though it is actually following the rules of quantum mechanics. It is impossible for a large system such as a cat to remain completely isolated from its surroundings, and therefore we do not perceive it as a quantum object.
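Decoherence can be illustrated with a toy two-level model. The sketch below is a minimal illustration (the decay rate and time units are assumptions for the example, not figures from the article): coupling to the environment leaves the populations of "alive" and "dead" untouched but exponentially damps the off-diagonal coherences of the density matrix, so the superposition becomes indistinguishable from a classical 50/50 mixture.

```python
import numpy as np

# Toy decoherence model: a "cat" qubit starts in the equal superposition
# (|alive> + |dead>)/sqrt(2). Environmental coupling leaves the populations
# (diagonal entries) untouched but exponentially damps the coherences
# (off-diagonal entries) at an assumed, illustrative rate gamma.
gamma = 1.0

def rho(t):
    c = 0.5 * np.exp(-gamma * t)   # decaying coherence
    return np.array([[0.5, c], [c, 0.5]])

for t in [0.0, 1.0, 10.0]:
    r = rho(t)
    print(f"t={t:>4}: populations {r[0,0]:.2f}/{r[1,1]:.2f}, coherence {r[0,1]:.3f}")

assert abs(rho(0.0)[0, 1] - 0.5) < 1e-12   # fully coherent at t = 0
assert rho(10.0)[0, 1] < 1e-4              # effectively classical later
```

Once the coherence has decayed, no measurement on the cat alone can distinguish the original superposition from an ordinary statistical mixture.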

While not disputing this explanation, Christoph Simon and colleagues at the University of Calgary and the University of Geneva have asked what would happen if decoherence did not affect the cat. In a thought experiment backed up by computer simulations, the physicists consider pairs of photons (A and B) generated from the same source with equal and opposite polarizations, travelling in opposite directions. For each pair, photon A is sent directly to a detector, but photon B is duplicated many times by an amplifier to make a macroscopic light beam that stands in for the cat. The polarizations of the photons in this light beam are then measured.
Two types of amplifier

They consider two different types of amplifier. The first measures the state of photon B, which has the effect of destroying the entanglement with A, before producing more photons with whatever polarization it measures photon B to have. This is rather like the purely classical process of observing the Geiger counter to see whether it has detected any radiation, and then using the information to decide whether or not to kill the cat. The second amplifier copies photon B without measuring its state, thus preserving the entanglement with A.

The researchers ask how the measured polarizations of the photons in the light beam will differ depending on which amplifier is used. They find that, if perfect resolution can be achieved, the results look quite different. However, with currently available experimental techniques, the differences cannot be seen. "If you have a big system and you want to see quantum features like entanglement in it, you have to make sure that your precision is extremely good," explains Simon. "You have to be able to distinguish a million photons from a million plus one photons, and there is no current technology that would allow you to do that."
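Simon's point about precision can be made concrete with a quick Monte Carlo sketch (the photon numbers and trial counts here are illustrative assumptions, not parameters from the paper): even an ideal photon counter is subject to Poissonian shot noise of roughly √N photons, which swamps a one-photon difference between two hypotheses.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000        # mean photon number of the macroscopic beam (illustrative)
trials = 100_000     # repeated measurements (illustrative)

# Two hypotheses that differ by a single photon on average.
counts_a = rng.poisson(N, size=trials)
counts_b = rng.poisson(N + 1, size=trials)

# Poissonian shot noise: counts fluctuate with std ~ sqrt(N) ~ 1000 photons,
# so the one-photon offset between the hypotheses is hopelessly buried.
shot_noise = counts_a.std()
separation = abs(counts_a.mean() - counts_b.mean())
print(f"shot noise per measurement: ~{shot_noise:.0f} photons")
print(f"separation of the means:    ~{separation:.1f} photons")
```

The fluctuation in a single measurement is about a thousand photons, three orders of magnitude larger than the one-photon signal one would need to resolve.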

Quantum-information theorist Renato Renner of ETH Zurich is impressed: "Even if there was no decoherence, this paper would explain why we do not see quantum effects and why the world appears classical to us, which is a very fundamental question of course." But, he cautions, "The paper raises a very fundamental question and gives us an answer in an interesting special case, but whether it is general remains to be seen."

The research will be published in Physical Review Letters.

Friday, October 14, 2011

New twist on Brownian motion seen for the first time

An important aspect of Brownian motion predicted decades ago has been observed for the first time by researchers in Europe. The team has measured how micrometre-sized spheres interact with a surrounding fluid and has shown that the spheres "remember" their previous motion. Their experimental technique, the researchers claim, could be used as a biophysical sensor.

Famously explained by Albert Einstein in 1905, Brownian motion describes the erratic motion of a tiny particle in a fluid. It is caused by the many small "kicks" that the particle receives as a result of the thermal motion of the fluid. Initially, Einstein and other physicists believed these kicks to be independent of the motion of the particle and to be characterized by white noise.

Remembering motion
In the mid-20th century, however, physicists began to realize that when the densities of the particle and fluid are similar, the kicks are not completely random. Instead, "persistent correlations" are predicted between the motions of the fluid and the particle. These arise because particles moving through a fluid will cause the surrounding fluid to move, which in turn will affect the motion of the particle and so on. For example, a person swimming at a constant speed will pull some of the surrounding water with them. But if they stop suddenly, they will feel a push forward from the moving water. Researchers refer to this as "hydrodynamic memory", but its observation has remained elusive for the tiny single particles that undergo Brownian motion.

Now, Sylvia Jeney at EPFL in Switzerland and colleagues in Switzerland and Germany claim to have seen clear evidence for this effect in the Brownian motions of particles. Their measurements are based on the idea that this hydrodynamic "memory" gives rise to the power spectrum of the particle being described by "coloured noise", rather than white noise. In the context of Brownian motion, white noise means that the particle fluctuates with the same magnitude (or power) regardless of the frequency of the fluctuation. Jeney's experiments, however, show that higher frequencies actually have higher magnitudes of fluctuation – which means that the noise is no longer white but is coloured.
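The distinction between white and coloured noise can be sketched numerically. The "coloured" process below is a deliberately crude stand-in (a simple difference filter, chosen only for illustration; the real hydrodynamic-memory spectrum has a much gentler frequency dependence): its power spectrum rises with frequency, while the white-noise spectrum stays flat.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1 << 16
dt = 1.0

white = rng.standard_normal(n)
# Toy "coloured" noise: differencing white noise high-passes it, so its
# power spectrum rises with frequency. This is only a qualitative stand-in
# for the hydrodynamic-memory spectrum discussed in the article.
coloured = np.diff(white, prepend=0.0)

def psd(x):
    f = np.fft.rfftfreq(len(x), dt)
    p = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return f, p

f, p_white = psd(white)
_, p_col = psd(coloured)

# Band-averaged power: flat for white noise, rising for coloured noise.
low = slice(1, len(f) // 8)
high = slice(7 * len(f) // 8, len(f))
ratio_white = p_white[high].mean() / p_white[low].mean()
ratio_col = p_col[high].mean() / p_col[low].mean()
print(f"white-noise high/low power ratio:    {ratio_white:.2f}")
print(f"coloured-noise high/low power ratio: {ratio_col:.2f}")
```

A ratio near 1 means the spectrum is flat (white); a ratio well above 1 means high-frequency fluctuations carry more power (coloured).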

Specialized trap
Jeney's group made the measurement by trapping a single micrometre-sized melamine sphere in optical tweezers created by a tightly focused laser beam. Although the apparatus is similar to commercial set-ups already used by biophysicists, the researchers spent several years optimizing it. In particular, they improved the time resolution of the system by a factor of 1000 and boosted its spatial resolution so it can measure distances of less than a nanometre.

The experiments involved single particles trapped by the tweezers and immersed in liquid. The parameters of the experiment were chosen so that the time it takes for the fluid to diffuse over the diameter of the particle is about one-sixth of the time it takes for the sphere to reach its equilibrium position in the tweezers. This diffusion time is the timescale on which the hydrodynamic memory is expected to occur, and the set-up therefore allowed the researchers to study the correlated behaviour.

"Currently, there are two, maybe three, labs in the world that have similar high-precision set-ups," explains Jeney. She says that the team wants to establish the optical-trapping technique as an advanced biophysical tool.
Source: IOP/Physics
Gathered by: Sh.Barzanjeh(shabirbarzanjeh@gmail.com)

Tuesday, September 13, 2011

Fine-tuning lasers to find waves in gravity

SYDNEY: 'Squeezing' laser light could significantly improve the accuracy of detectors searching for Einstein's elusive gravitational waves.


Gravitational waves were predicted by Einstein but have long remained undetected. To look for them, scientists use devices called laser interferometers, which measure the time it takes a split beam of laser light to travel between suspended mirrors. The waves are expected to distort the laser's travel time - but so far scientists can't measure this accurately enough.

The accuracy of these devices is limited by a quantum phenomenon of light called 'shot noise' - the random fluctuations in the number of photons arriving at the detector.

Using a new kind of laser light, which fluctuates far less than conventional laser light, researchers reported yesterday in Nature Physics that they have curbed this interference and improved the measuring accuracy of their detectors by roughly 50%.

"Squeezed light is a completely new approach," said physicist and lead author Roman Schnabel, from the Max Planck Institute for Gravitational Physics in Germany.

"One can say that for the first time a 'technology' is based on one of the distinct features of quantum physics itself. We were able to leave the stage of laboratory experiments and realize a real application."

Travel time weakens waves

The findings are an exciting step forward for the Laser Interferometer Gravitational-Wave Observatory (LIGO) project in its quest to observe gravitational waves using Earth-based detectors.

Albert Einstein first predicted the existence of gravitational waves in 1916 in his theory of general relativity. The presence of large amounts of mass or energy can distort the space-time fabric causing it to curve, and when they move suddenly, this curvature ripples outward - like the ripples in a pond after a fish jumps.

Violent astronomical events such as black hole collisions and supernovae can cause gravitational waves. In the immediate vicinity of these objects, gravitational waves would be immensely strong, said Schnabel.

However, after travelling billions of light years to reach the Earth they are significantly weakened, making them incredibly difficult to detect. So far they have eluded scientists.

Theoretical predictions based on Einstein's theory indicate current detectors must be improved by another factor of about three to 10 to reach a high probability of successful detection, said Schnabel.

"Squeezed light is a new technology, which has now proven to significantly contribute to realising this last factor," Schnabel said. "[And] the improvement factor of 1.5 is just the beginning. A factor of three due to squeezed light is possible with today's technology."

Lasers superimposed

In a laser interferometer, a laser beam is split into two beams that are shot down long, perpendicular vacuum tubes before reflecting off mirrors back to where they started.

If the distances travelled by the two beams are exactly the same, all the light will be directed back to the original source, but if there is any difference in the distance, some light will be redirected to a photodetector for further analysis.

The idea is that space-time ripples caused by gravitational waves will cause the distance measured by the light beams to change, and the amount of light falling on the photodetector to vary.

"We now feed the squeezed light into the interferometer, in addition to our normal laser light," explained Schnabel. "If the two light fields then superimpose, the resulting laser beam has a much more uniform intensity, compared to the original signal beam.

"We thus smooth out the irregularities caused by quantum physical effects in the detector signal," he added.
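Both ideas can be sketched with a back-of-the-envelope model. The numbers below are illustrative assumptions (the photon count is made up, and the factor-of-1.5 squeezing improvement is simply taken from the article): the power at the interferometer's "dark" output port depends on the phase difference between the arms, and squeezed light divides the shot-noise-limited phase sensitivity by the quoted improvement factor.

```python
import numpy as np

# Toy Michelson model: at the "dark" output port the detected power goes
# as sin^2 of half the phase difference between the two arms, so a tiny
# gravitational-wave phase shift moves light onto the photodetector.
def dark_port_power(dphi, p_in=1.0):
    return p_in * np.sin(dphi / 2.0) ** 2

# Shot-noise-limited phase sensitivity scales as 1/sqrt(N) for N detected
# photons; squeezed light divides this by an improvement factor (here the
# 1.5 reported in the article, used as an assumption of the toy model).
def phase_sensitivity(n_photons, squeeze_factor=1.0):
    return 1.0 / (squeeze_factor * np.sqrt(n_photons))

n = 1e20  # photons per measurement interval (illustrative)
plain = phase_sensitivity(n)
squeezed = phase_sensitivity(n, squeeze_factor=1.5)
print(f"shot-noise limit:  {plain:.2e} rad")
print(f"with squeezing:    {squeezed:.2e} rad")

assert dark_port_power(0.0) == 0.0            # perfectly balanced arms: dark
assert abs(plain / squeezed - 1.5) < 1e-9     # squeezing buys the full factor
```

The sin² response also shows why the instrument is operated near the dark fringe: there, any light reaching the photodetector signals a phase change.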

New view on the universe

"This is the first time this technology has been used outside of a test laboratory anywhere in the world," said David McClelland, a physicist at the Australian National University in Canberra and a key investigator for LIGO-Australia.

"The detection of gravitational waves would open a new window for astronomy and create a completely new way of sensing the Universe, akin to being able to hear for the very first time," he added.

The LIGO collaboration is in the process of testing a squeezed-light source built at the ANU on the 4 km long LIGO interferometer in Washington State in the US. According to Schnabel, testing on this observatory, and on Europe's envisaged 10 km Einstein Telescope, could further improve detection capabilities.

Source: Cosmos Online

Sunday, August 21, 2011

Quantum mechanics rule 'bent' in classic experiment

Researchers have bent one of the most basic rules of quantum mechanics, a counterintuitive branch of physics that deals with atomic-scale interactions.

Its "complementarity" rule asserts that it is impossible to observe light behaving as both a wave and a particle, though it is strictly both.

In an experiment reported in Science, researchers have now done exactly that.
They say the feat "pulls back the veil" on quantum reality in a way that was thought to be prohibited by theory.

Quantum mechanics has spawned and continues to fuel spirited debates about the nature of what we can see and measure, and what nature keeps hidden - debates that often straddle the divide between the physical and the philosophical.

For instance, a well-known rule called the Heisenberg uncertainty principle maintains that for some pairs of measurements, high precision in one necessarily reduces the precision that can be achieved in the other.

One embodiment of this idea lies in a "two-slit interferometer", in which light can pass through one of two slits and is viewed on a screen.

Let a number of the units of light called photons through the slits, and an interference pattern develops, like waves overlapping in a pond. However, keeping a close eye on which photons went through which slits - what may be termed a "strong measurement" - destroys the pattern.

Young's two-slit experiment

- A central idea in quantum mechanics is that light and matter can behave as both particle and wave
- However, the idea of "complementarity" prevents observation of both behaviours simultaneously
- In the two-slit experiment, light is passed through two tiny holes and is then viewed on a screen
- The two beams interfere with each other, forming a rippled "diffraction pattern" - as if the light were made of a number of waves adding or cancelling
- However, if one of the holes is blocked, the light can be seen as a single beam on the screen - as if light were made of particles
- The new work, for the first time, observes both kinds of behaviour at the same time


Now, Aephraim Steinberg of the University of Toronto and his colleagues have sidestepped this limitation by undertaking "weak measurements" of the photons' momentum.

The team allowed the photons to pass through a thin sliver of the mineral calcite, which gave each photon a tiny nudge in its path, with the amount of deviation dependent on which slit it passed through.

By averaging over a great many photons passing through the apparatus, and only measuring the light patterns on a camera, the team was able to infer what paths the photons had taken.

While they were able to easily observe the interference pattern indicative of the wave nature of light, they were able also to see from which slits the photons had come, a sure sign of their particle nature.

The trajectories of the photons within the experiment - forbidden in a sense by the laws of physics - have been laid bare.

On one level, the experiment appears to violate a central rule of quantum mechanics, but Professor Steinberg said this was not the case.

He explained to BBC News that "while the uncertainty principle does indeed forbid one from knowing the position and momentum of a particle exactly at the same time, it turns out that it is possible to ask 'what was the average momentum of the particles which reached this position?'"

"You can't know the exact value for any single particle, but you can talk about the average."
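Steinberg's "average momentum at a position" has a simple numerical counterpart. In the toy model below (a superposition of two Gaussian packets given opposite transverse momenta, standing in for the two slits; every parameter is an illustrative assumption, not a detail of the actual experiment), the weak-value momentum ħ·Im(ψ′/ψ) is well defined at every position even though single-particle position and momentum are not simultaneously knowable, and near each packet it points back toward the slit the photons came from.

```python
import numpy as np

hbar, k = 1.0, 2.0      # illustrative units; packets carry momenta +/- hbar*k
d, sigma = 4.0, 1.0     # "slit" separation and packet width (illustrative)
x = np.linspace(-12, 12, 4801)
dx = x[1] - x[0]

# Two-slit state: one Gaussian packet per slit, each nudged with an opposite
# transverse momentum (a stand-in for the calcite's slit-dependent nudge).
right = np.exp(-(x - d / 2) ** 2 / (4 * sigma**2)) * np.exp(1j * k * x)
left = np.exp(-(x + d / 2) ** 2 / (4 * sigma**2)) * np.exp(-1j * k * x)
psi = right + left
psi /= np.sqrt((np.abs(psi) ** 2).sum() * dx)   # normalize on the grid

# Weak-value ("average") momentum conditioned on finding the photon at x:
#   p_w(x) = hbar * Im[ psi'(x) / psi(x) ]
p_avg = hbar * np.imag(np.gradient(psi, dx) / psi)

i_r = np.argmin(np.abs(x - 4.0))    # deep inside the right packet
i_l = np.argmin(np.abs(x + 4.0))    # deep inside the left packet
print(f"average momentum near right slit: {p_avg[i_r]:+.3f}")
print(f"average momentum near left slit:  {p_avg[i_l]:+.3f}")
```

Meanwhile |ψ|² still shows interference fringes in the overlap region, so the same state exhibits both the wave-like pattern and position-conditioned "which-path" momentum averages.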
Philosophical beginnings

Marlan Scully of Texas A&M University, a quantum physicist who has published on the idea of sneaking around this quantum limit before, said: "It's a beautiful series of measurements by an excellent group, the likes of which I've not seen before.

"This paper is probably the first that has really put this weak measurement idea into a real experimental realisation, and it also gave us the trajectories."

He said that the work would - inevitably - raise philosophical issues as well.

"The exact way to think about what they're doing will be researched for some time, and the weak measurement concept itself will be a matter of controversy - but now we have a very pretty experiment with these weak measurements," he added.

For his part, Professor Steinberg believes that the result reduces a limitation not on quantum physics but on physicists themselves.

"I feel like we're starting to pull back a veil on what nature really is," he said.

"The trouble with quantum mechanics is that while we've learned to calculate the outcomes of all sorts of experiments, we've lost much of our ability to describe what is really happening in any natural language.

"I think that this has really hampered our ability to make progress, to come up with new ideas and see intuitively how new systems ought to behave."


Thursday, July 28, 2011

Generation of motional nonlinear coherent states and their superpositions via an intensity-dependent coupling of a cavity field to a micromechanical membrane

This paper is available in J. Phys. B: At. Mol. Opt. Phys. 44 (2011) 105504 (14pp)

In this paper, we have introduced a physical scheme that allows one to generate and control the nonclassical properties of motional nonlinear coherent states and their superpositions for an undamped vibrating micromechanical membrane inside an optical cavity. We have shown that if the cavity field is initially prepared in a Fock state, the motional state of the membrane may evolve to a family of nonlinear coherent states. We have been interested in analysing the nonclassical properties of the generated state of the membrane, including the quadrature squeezing and the sub-Poissonian statistics. In particular, we have found that the Lamb–Dicke parameter and the membrane's reflectivity lead to an enhancement of the nonclassical properties. As we have seen, with increasing the Lamb–Dicke parameter and the membrane's reflectivity, the sub-Poissonian behaviour and quadrature squeezing of the motional state of the membrane are considerably strengthened. In addition, the scheme offers the possibility of generating various types of the so-called nonlinear multicomponent Schrödinger cat states of the membrane. We have shown that the separation between nonlinear coherent components is increased by increasing the parameters η and rc.


We have also extended our treatment to a more realistic situation in which the photon leakage from the cavity, as a relevant source of decoherence, is included, and examined its influence on the nonclassical characteristics of the generated motional states of the membrane. We have shown that it is possible to control the effect of the cavity-field damping on the nonclassical behaviour of the motional state of the membrane via the Lamb–Dicke parameter and the membrane's reflectivity. In particular, we have found that the generated motional NLSCSs of the membrane can be more robust against decoherence than the usual Schrödinger cat states.


Quantum Entanglement and Information

Edited by Sh.Barzanjeh from "http://www.physics.org/"

Quantum entanglement is a physical resource, like energy, associated with the peculiar nonclassical correlations that are possible between separated quantum systems. Entanglement can be measured, transformed, and purified. A pair of quantum systems in an entangled state can be used as a quantum information channel to perform computational and cryptographic tasks that are impossible for classical systems. The general study of the information-processing capabilities of quantum systems is the subject of quantum information theory.

1. Quantum Entanglement
2. Exploiting Entanglement: Quantum Teleportation
3. Quantum Information
4. Quantum Cryptography
5. Quantum Computation
6. Interpretative Remarks

1. Quantum Entanglement

In 1935 and 1936, Schrödinger published a two-part article in the Proceedings of the Cambridge Philosophical Society in which he discussed and extended a remarkable argument by Einstein, Podolsky, and Rosen. The Einstein-Podolsky-Rosen (EPR) argument was, in many ways, the culmination of Einstein's critique of the orthodox Copenhagen interpretation of quantum mechanics, and was designed to show that the theory is incomplete. (See The Einstein-Podolsky-Rosen Argument in Quantum Theory and Copenhagen Interpretation of Quantum Mechanics.)

In classical mechanics the state of a system is essentially a list of the system's properties — more precisely, it is the specification of a set of parameters from which the list of properties can be reconstructed: the positions and momenta of all the particles comprising the system (or similar parameters in the case of fields). The dynamics of the theory specifies how properties change in terms of a law of evolution for the state. Pauli characterized this mode of description of physical systems as a ‘detached observer’ idealization. See Pauli's letter to Born in The Born-Einstein Letters (Born, 1992; p. 218).

On the Copenhagen interpretation, such a description is not possible for quantum systems. Instead, the quantum state of a system should be understood as a catalogue of what an observer has done to the system and what has been observed, and the import of the state then lies in the probabilities that can be inferred (in terms of the theory) for the outcomes of possible future observations on the system. Einstein rejected this view and proposed a series of arguments to show that the quantum state is simply an incomplete characterization of the system. The missing parameters are sometimes referred to as ‘hidden parameters’ or ‘hidden variables’ (although Einstein did not use this terminology, presumably because he did not want to endorse any particular ‘hidden variable’ theory).

It should not be supposed that Einstein's definition of a complete theory included the requirement that it be deterministic. Rather, he required certain conditions of separability and locality for composite systems consisting of separated component systems: each component system separately should be characterized by its own properties (even if these properties manifest themselves stochastically), and it should be impossible to alter the properties of a distant system instantaneously (or the probabilities of these properties) by acting on a local system. In later analyses — notably in Bell's extension of the EPR argument — it became apparent that these conditions, suitably formulated as probability constraints, are equivalent to the requirement that statistical correlations between separated systems should be reducible to probability distributions over common causes (deterministic or stochastic) in the sense of Reichenbach. (See Bell's Theorem and Reichenbach's Common Cause Principle.)

In the original EPR article, two particles are prepared from a source in a certain quantum state and then move apart. There are ‘matching’ correlations between both the positions of the two particles and their momenta: a measurement of either position or momentum on a particular particle will allow the prediction, with certainty, of the outcome of a position measurement or momentum measurement, respectively, on the other particle. These measurements are mutually exclusive: either a position measurement can be performed, or a momentum measurement, but not both simultaneously. Either correlation can be observed, but the subsequent measurement of momentum, say, after establishing a position correlation, will no longer yield any correlation in the momenta of the two particles. It is as if the position measurement disturbs the correlation between the momentum values. The puzzle is that the assumption of the completeness of the quantum state of the particle pair is inconsistent with the assignment of labels to the particles separately that could be associated with appropriately correlated values for the outcomes of position and momentum measurements. These labels would be the common causes of the correlations, and would provide an explanation of the correlations in terms of the initial correlations between the properties of the two systems at their source. EPR concluded that the quantum state was incomplete.

Here is how Schrödinger put the puzzle in the first part of his two-part article (Schrödinger, 1935; p. 559):

Yet since I can predict either x1 or p1 without interfering with the system No. 1 and since system No. 1, like a scholar in an examination, cannot possibly know which of the two questions I am going to ask first: it so seems that our scholar is prepared to give the right answer to the first question he is asked, anyhow. Therefore he must know both answers; which is an amazing knowledge; quite irrespective of the fact that after having given his first answer our scholar is invariably so disconcerted or tired out, that all the following answers are ‘wrong.’

What Schrödinger showed was that if two particles are prepared in a quantum state such that there is a matching correlation between two ‘canonically conjugate’ dynamical quantities — quantities like position and momentum whose values suffice to specify all the properties of a classical system — then there are infinitely many dynamical quantities of the two particles for which there exist similar matching correlations: every function of the canonically conjugate pair of the first particle matches with the same function of the canonically conjugate pair of the second particle. Thus (Schrödinger, p. 559) system No. 1 ‘does not only know these two answers but a vast number of others, and that with no mnemotechnical help whatsoever, at least with none that we know of.’

Schrödinger coined the term ‘entanglement’ to describe this peculiar connection between quantum systems (Schrödinger, 1935; p. 555):
When two systems, of which we know the states by their respective representatives, enter into temporary physical interaction due to known forces between them, and when after a time of mutual influence the systems separate again, then they can no longer be described in the same way as before, viz. by endowing each of them with a representative of its own. I would not call that one but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought. By the interaction the two representatives [the quantum states] have become entangled.

He added (Schrödinger, 1935; p. 555):

Another way of expressing the peculiar situation is: the best possible knowledge of a whole does not necessarily include the best possible knowledge of all its parts, even though they may be entirely separate and therefore virtually capable of being ‘best possibly known,’ i.e., of possessing, each of them, a representative of its own. The lack of knowledge is by no means due to the interaction being insufficiently known — at least not in the way that it could possibly be known more completely — it is due to the interaction itself.

Attention has recently been called to the obvious but very disconcerting fact that even though we restrict the disentangling measurements to one system, the representative obtained for the other system is by no means independent of the particular choice of observations which we select for that purpose and which by the way are entirely arbitrary. It is rather discomforting that the theory should allow a system to be steered or piloted into one or the other type of state at the experimenter's mercy in spite of his having no access to it.

In the second part of the paper, Schrödinger showed that, in general, a sophisticated experimenter can, by a suitable choice of operations carried out on one system, ‘steer’ the second system into any chosen mixture of quantum states. That is, the second system cannot be steered into any particular quantum state at the whim of the experimenter, but the experimenter can constrain the quantum state into which the second system evolves to lie in any chosen set of states, with a probability distribution fixed by the entangled state. He found this conclusion sufficiently unsettling to suggest that the entanglement between two separating systems would persist only for distances small enough that the time taken by light to travel from one system to the other could be neglected, compared with the characteristic time periods associated with other changes in the composite system. He speculated that for longer distances each of the two systems might in fact be in a state associated with a certain mixture, determined by the precise form of the entangled state.

Most physicists attributed the puzzling features of entangled quantum states to Einstein's inappropriate ‘detached observer’ view of physical theory, and regarded Bohr's reply to the EPR argument (Bohr, 1935) as vindicating the Copenhagen interpretation. This was unfortunate, because the study of entanglement was ignored for thirty years until John Bell's reconsideration and extension of the EPR argument (Bell, 1964). Bell looked at entanglement in simpler systems than the EPR case: matching correlations between two-valued dynamical quantities, such as polarization or spin, of two separated systems in an entangled state. What Bell showed was that the statistical correlations between the measurement outcomes of suitably chosen different quantities on the two systems are inconsistent with an inequality derivable from Einstein's separability and locality assumptions — in effect from the assumption that the correlations have a common cause.
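Bell's result is usually quoted in its CHSH form (a later refinement by Clauser, Horne, Shimony, and Holt of Bell's original inequality): any common-cause, local hidden variable model bounds the combination S below by 2, whereas the singlet-state correlation E(a, b) = -cos(a - b) reaches 2√2 at suitably chosen angles. A minimal check:

```python
import numpy as np

# Singlet-state correlation for spin measurements along angles a and b:
#   E(a, b) = -cos(a - b).
# Under Bell's locality/separability assumptions, the CHSH combination
# below cannot exceed 2; quantum mechanics reaches 2*sqrt(2).
def E(a, b):
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2            # Alice's two measurement settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement settings

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(f"CHSH value S = {S:.4f}  (local bound: 2, quantum maximum: {2*np.sqrt(2):.4f})")
assert S > 2.0   # the quantum correlations violate the local bound
```

The violation of the bound is exactly what rules out an explanation of the correlations in terms of common causes at the source.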

Bell's investigation generated an ongoing debate on the foundations of quantum mechanics. One important feature of this debate was confirmation that entanglement can persist over long distances (see Aspect et al.), thus falsifying Schrödinger's supposition of the spontaneous decay of entanglement as two entangled particles separate. But it was not until the 1980s that physicists, computer scientists, and cryptographers began to regard the non-local correlations of entangled quantum states as a new kind of non-classical resource that could be exploited, rather than an embarrassment to be explained away. For further discussion of entanglement as a physical resource, including measuring entanglement, and the manipulation and purification of entanglement by local operations, see “The Joy of Entanglement” by Popescu and Rohrlich in Lo, Popescu, and Spiller 1998, or Nielsen and Chuang 2000.
2. Exploiting Entanglement: Quantum Teleportation

Consider again Schrödinger's realization that an entangled state could be used to steer a distant particle into one of a set of states, with a certain probability. In fact, this possibility of ‘remote steering’ is even more dramatic than Schrödinger demonstrated. Suppose Alice and Bob share an entangled state of the sort considered by Bell, say two photons in an entangled state of polarization. That is, Alice has in her possession one of the entangled photons, and Bob the other. Suppose that Alice has an additional photon in an unknown state of polarization |u>, where the notation ‘| >’ denotes a quantum state. It is possible for Alice to perform an operation on the two photons in her possession that will transform Bob's photon into one of four states, depending on the four possible (random) outcomes of Alice's operation: either the state |u>, or a state that is related to |u> in a definite way. Alice's operation entangles the two photons in her possession, and disentangles Bob's photon, steering it into a state |u*>. After Alice communicates the outcome of her operation to Bob, Bob knows either that |u*> = |u>, or how to transform |u*> to |u> by a local operation. This phenomenon is known as ‘quantum teleportation.’
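The protocol can be checked with a small numerical simulation. The following sketch (an illustration added here, not part of the original discussion; it uses the numpy library, and the function and variable names are our own) represents qubit states as complex vectors and verifies that, for each of Alice's four equally likely measurement outcomes, Bob's corrected qubit is exactly the unknown state |u>:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit flip
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # phase flip

# The four Bell states (Alice's measurement basis) and the local
# correction Bob applies for each of the four possible outcomes
BELL = [
    (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2),  # -> I
    (np.kron(ket0, ket1) + np.kron(ket1, ket0)) / np.sqrt(2),  # -> X
    (np.kron(ket0, ket0) - np.kron(ket1, ket1)) / np.sqrt(2),  # -> Z
    (np.kron(ket0, ket1) - np.kron(ket1, ket0)) / np.sqrt(2),  # -> ZX
]
CORRECTION = [I, X, Z, Z @ X]

def teleport(u, outcome):
    """Return Bob's qubit after Alice's Bell measurement gave `outcome`
    (0..3) and Bob applied the matching local correction."""
    psi = np.kron(u, BELL[0])  # u tensored with the shared pair (|00>+|11>)/sqrt(2)
    # Contract Alice's two qubits with the measured Bell state; what is
    # left is Bob's steered state |u*> (up to normalization)
    u_star = BELL[outcome].conj() @ psi.reshape(4, 2)
    u_star = u_star / np.linalg.norm(u_star)
    # Two classical bits (the outcome) tell Bob which correction recovers u
    return CORRECTION[outcome] @ u_star
```

Note that the unknown amplitudes of |u> never travel over the classical channel; only the two-bit outcome does.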

What is extraordinary about this phenomenon is that Alice and Bob have managed to use their shared entangled state as a quantum communication channel to destroy the state |u> of a photon in Alice's part of the universe and recreate it in Bob's part of the universe. Since the state of a photon requires specifying a direction in space (essentially the value of an angle that can vary continuously), without a shared entangled state Alice would have to convey an infinite amount of classical information to Bob for Bob to be able to reconstruct the state |u> precisely. To see why this is so, consider that the decimal expansion of an angle variable represented by a real number is represented by a potentially infinite sequence of digits between 0 and 9. The binary expansion is represented by a potentially infinite sequence of 0's and 1's. Ever since Shannon formalized the notion of classical information, the amount of classical information associated with a binary alternative (represented as 0 or 1), where each alternative has equal probability, is measured as one binary digit or ‘bit’. So to specify the value of an arbitrary angle variable requires an infinite number of bits. To specify the outcome of Alice's operation, which has four possible outcomes with equal probabilities, requires two bits of classical information. Remarkably, Bob can reconstruct the state |u> on the basis of just two bits of classical information communicated by Alice, apparently by exploiting the entangled state as a quantum communication channel to transfer the remaining information. For further discussion of quantum teleportation, see Nielsen and Chuang 2000, or Richard Josza's article “Quantum Information and its Properties” in Lo, Popescu, and Spiller 1998.
3. Quantum Information

Formally, the amount of classical information we gain, on average, when we learn the value of a random variable (or, equivalently, the amount of uncertainty in the value of a random variable before we learn its value) is represented by a quantity called the Shannon entropy, measured in bits (Shannon and Weaver, 1949). A random variable is defined by a probability distribution over a set of values. In the case of a binary random variable, with equal probability for each of the two possibilities, the Shannon entropy is 1 bit, representing maximal uncertainty. For all other probabilities — intuitively, representing some information about which alternative is more likely — the Shannon entropy is less than 1. For the case of maximal knowledge or zero uncertainty about the alternatives, where the probabilities are 0 and 1, the Shannon entropy is zero. (Note that the term ‘bit’ is used to refer to the basic unit of classical information in terms of Shannon entropy, and to an elementary two-state classical system considered as representing the possible outputs of an elementary classical information source.)
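As a concrete illustration (a minimal sketch, using only Python's standard math module), the Shannon entropy of a probability distribution can be computed directly from its definition:

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p = 0 contribute nothing (the limit p*log2(p) -> 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

For the binary random variable discussed above, `shannon_entropy([0.5, 0.5])` gives 1 bit, `shannon_entropy([1.0, 0.0])` gives 0, and any biased distribution such as `[0.9, 0.1]` falls strictly between the two.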

Since information is always embodied in the state of a physical system, we can also think of the Shannon entropy as quantifying the physical resources required to store classical information. Suppose Alice wishes to communicate some classical information to Bob over a classical communication channel such as a telephone line, say an email message. A relevant question concerns the extent to which the message can be compressed without loss of information, so that Bob can reconstruct the original message accurately from the compressed version. According to Shannon's source coding theorem or noiseless coding theorem (assuming a noiseless telephone line with no loss of information), the minimal physical resource required to represent the message (effectively, a lower bound on the possibility of compression) is given by the Shannon entropy of the source.

What happens if we use the quantum states of physical systems to store information, rather than classical states? It turns out that quantum information is radically different from classical information. The unit of quantum information is the ‘qubit’, representing the amount of quantum information that can be stored in the state of the simplest quantum system, for example, the polarization state of a photon. The term is due to Schumacher (1995), who proved a quantum analogue of Shannon's noiseless coding theorem. (By analogy with the term ‘bit,’ the term ‘qubit’ refers to the basic unit of quantum information in terms of the von Neumann entropy, and to an elementary two-state quantum system considered as representing the possible outputs of an elementary quantum information source.) As we have seen, an arbitrarily large amount of classical information can be encoded in a qubit. This information can be processed and communicated but, because of the peculiarities of quantum measurement, at most one bit can be accessed! According to a theorem by Holevo, the accessible information in a probability distribution over a set of alternative qubit states is limited by the von Neumann entropy, which is equal to the Shannon entropy only when the states are orthogonal in the space of quantum states, and is otherwise less than the Shannon entropy.
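Holevo's bound can be illustrated numerically. The sketch below (illustrative only; it uses numpy, and the helper names are ours) computes the von Neumann entropy of the density matrix of an equal mixture of two qubit states: for orthogonal states it equals the Shannon entropy of 1 bit, while for nonorthogonal states it is strictly smaller:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log2 rho], computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    return float(-sum(p * np.log2(p) for p in evals if p > 1e-12))

def ensemble_rho(states, probs):
    """Density matrix of a source emitting the given pure states
    with the given probabilities."""
    return sum(p * np.outer(s, s.conj()) for p, s in zip(probs, states))

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
ket_plus = (ket0 + ket1) / np.sqrt(2)  # non-orthogonal to |0>
```

An equal mixture of |0> and |1> has von Neumann entropy 1 bit, matching the Shannon entropy of the source, whereas an equal mixture of |0> and the non-orthogonal state (|0>+|1>)/sqrt(2) has entropy strictly below 1 bit, capping the accessible information below the Shannon entropy.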

While classical information can be copied or cloned, the quantum ‘no cloning’ theorem (Dieks, 1982; Wootters and Zurek, 1982) asserts the impossibility of cloning an unknown quantum state. To see why, consider how we might construct a classical copying device. A NOT gate is a device that takes a bit as input and produces as output either a 1 if the input is 0, or a 0 if the input is 1. In other words, a NOT gate is a 1-bit gate that flips the input bit. A controlled-NOT gate, or CNOT gate, takes two bits as inputs, a control bit and a target bit, and flips the target bit if and only if the control bit is 1, while reproducing the control bit. (So there are two inputs, the control and target, and two outputs: the control, and either the target or the flipped target, depending on the value of the control.) A CNOT gate functions as a copying device for the control bit if the target bit is set to 0, because the output of the target bit is then a copy of the control bit (i.e., the input 00 produces output 00, and the input 10 produces output 11). Insofar as we can think of a measurement as simply a copying operation, a CNOT gate is the paradigm of a classical measuring device. (Imagine Alice equipped with such a device, with input and output control and target wires, measuring the properties of an unknown classical world. The input control wire is a probe for the presence or absence of a property, represented by a 1 or a 0. The target wire functions as the pointer, which is initially set to 0. The output of the target is a 1 or a 0, depending on the presence or absence of the property.)

Suppose we attempt to use our CNOT gate to copy an unknown qubit state. Since we are now proposing to regard the CNOT gate as a device for processing quantum states, the evolution from input states to output states must be effected by a physical quantum transformation. Now quantum transformations are linear on the linear state space of qubits. Linearity of the state space means that for any two qubit states — call them |0> and |1> — that are orthogonal in the space of qubit states, there are qubit states that are represented by linear superpositions or sums of |0> and |1>, with certain coefficients. Such superpositions — e.g., a superposition with coefficients c0, c1 represented symbolically as c0|0> + c1|1> — are non-orthogonal to |0> and to |1>. Linearity of the transformation means that any transformation must take a qubit state represented by the sum of two orthogonal qubits to a new qubit state that is the sum of the transformed orthogonal qubits. If the CNOT gate succeeds in copying two orthogonal qubits, it cannot succeed in copying a linear superposition of these qubits. Since the gate functions linearly, it must instead produce a state that is a linear superposition of the outputs obtained for the two orthogonal qubits. That is to say, the output of the gate will be represented by a quantum state that is a sum of two terms, where the first term represents the output of the control and target for the first orthogonal qubit, and the second term represents the output of the control and target for the second orthogonal qubit. This could be expressed as c0|0>|0> + c1|1>|1>. This is an entangled state and not the output that would be required by a successful copying operation, where the control and target each outputs the superposed qubit, expressed as (c0|0> + c1|1>)(c0|0> + c1|1>).
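The linearity argument can be made concrete with a small numpy calculation (an illustrative sketch, not the original derivation; the helper `is_product` is our own): the CNOT matrix copies the orthogonal inputs |0> and |1>, but by linearity it maps a superposition to the entangled state c0|0>|0> + c1|1>|1> rather than to a product of two copies:

```python
import numpy as np

# CNOT on two qubits (first qubit = control, second = target)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
ket_plus = (ket0 + ket1) / np.sqrt(2)  # superposition with c0 = c1 = 1/sqrt(2)

def is_product(state):
    """A two-qubit state is unentangled iff its 2x2 amplitude matrix has rank 1."""
    return np.linalg.matrix_rank(state.reshape(2, 2), tol=1e-10) == 1

# CNOT copies the orthogonal basis states (target initialized to |0>)...
copy0 = CNOT @ np.kron(ket0, ket0)   # stays |0>|0>
copy1 = CNOT @ np.kron(ket1, ket0)   # becomes |1>|1>
# ...but drives a superposed control into (|0>|0> + |1>|1>)/sqrt(2),
# not the product (c0|0> + c1|1>)(c0|0> + c1|1>) that copying requires
copied = CNOT @ np.kron(ket_plus, ket0)
```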
4. Quantum Cryptography

Linearity prevents the cloning of an unknown quantum state, and hence the possibility of identifying such a state by measurement. Similarly, it can be shown that if Alice sends Bob one of two nonorthogonal qubits, Bob can obtain information about which of these qubits was sent only at the expense of disturbing the state. In general, for quantum information there is no information gain without disturbance. The impossibility of copying an unknown quantum state, or a state that is known to belong to a set of nonorthogonal states with a certain probability, and the existence of a trade-off relation between information gain and state disturbance, is the basis of the application of quantum information to cryptography. There are quantum protocols involving the exchange of classical and quantum information that Alice and Bob can exploit to share a secret random key, which they can then use to communicate privately. (See Lo's article “Quantum Cryptology” in Lo, Popescu, and Spiller, 1998.) Any attempt by an eavesdropper, Eve, to monitor the communication between Alice and Bob will be detectable, in principle, because Eve cannot gain any quantum information without some disturbance to the quantum communication channel. Moreover, the ‘no cloning’ theorem prohibits Eve from copying the quantum communications and processing them off-line, so to speak, after she monitors the classical communication between Alice and Bob.
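The detectability of eavesdropping can be illustrated with a toy simulation of an intercept-resend attack on a BB84-style key-distribution protocol (BB84 is not named in the text above; the simulation and its parameter names are our own illustrative assumptions). Whenever Eve measures in the wrong basis she disturbs the state she resends, producing an error rate of about 25% on the rounds where Alice's and Bob's bases match:

```python
import random

def intercept_resend_error_rate(n_rounds, seed=1):
    """Toy BB84-style run: Alice sends bits in random bases, Eve intercepts,
    measures in a random basis and resends, Bob measures in a random basis.
    Returns the error rate on rounds where Bob's basis matches Alice's."""
    rng = random.Random(seed)
    errors = matched = 0
    for _ in range(n_rounds):
        bit = rng.randint(0, 1)
        basis_a = rng.randint(0, 1)        # 0 = rectilinear, 1 = diagonal
        # Eve's measurement: the right basis reads the bit faithfully; the
        # wrong basis yields a random outcome and disturbs the resent state
        basis_e = rng.randint(0, 1)
        bit_e = bit if basis_e == basis_a else rng.randint(0, 1)
        basis_b = rng.randint(0, 1)
        if basis_b != basis_a:
            continue                        # discarded in basis reconciliation
        # Bob measures Eve's resent state: random outcome if bases differ
        bit_b = bit_e if basis_b == basis_e else rng.randint(0, 1)
        matched += 1
        errors += int(bit_b != bit)
    return errors / matched
```

Without Eve the matched-basis error rate would be zero, so by comparing a random sample of their key bits Alice and Bob can detect her presence.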

While the difference between classical and quantum information can be exploited to achieve successful key distribution, there are other cryptographic protocols that are thwarted by quantum entanglement. Bit commitment is a key cryptographic protocol that can be used as a subroutine in a variety of important cryptographic tasks. In a bit commitment protocol, Alice supplies an encoded bit to Bob. The information available in the encoding should be insufficient for Bob to ascertain the value of the bit, but sufficient, together with further information supplied by Alice at a subsequent stage when she is supposed to reveal the value of the bit, to convince Bob that the protocol did not allow Alice to cheat by encoding the bit in a way that left her free to reveal either 0 or 1 at will.

To illustrate the idea, suppose Alice claims the ability to predict advances or declines in the stock market on a daily basis. To substantiate her claim without revealing valuable information (perhaps to a potential employer, Bob) she suggests the following demonstration: She proposes to record her prediction, before the market opens, by writing a 0 (for ‘decline’) or a 1 (for ‘advance’) on a piece of paper, which she will lock in a safe. The safe will be handed to Bob, but Alice will keep the key. At the end of the day's trading, she will announce the bit she chose and prove that she in fact made the commitment at the earlier time by handing Bob the key. Of course, the key-and-safe protocol is not provably secure from cheating by Bob, because there is no principle of classical physics that prevents Bob from opening the safe and closing it again without leaving any trace. The question is whether there exists a quantum analogue of this procedure that is unconditionally secure: provably secure by the laws of physics against cheating by either Alice or Bob. Bob can cheat if he can obtain some information about Alice's commitment before she reveals it (which would give him an advantage in repetitions of the protocol with Alice). Alice can cheat if she can delay actually making a commitment until the final stage when she is required to reveal her commitment, or if she can change her commitment at the final stage with a very low probability of detection.

It turns out that unconditionally secure two-party bit commitment, based solely on the principles of quantum or classical mechanics (without exploiting special relativistic signalling constraints, or principles of general relativity or thermodynamics) is impossible. See Mayers 1997, Lo and Chau 1997, and Lo's article “Quantum Cryptology” in Lo, Popescu, and Spiller 1998 for further discussion. (Note that Kent (1999) has shown that one can implement a secure classical bit commitment protocol by exploiting relativistic signalling constraints in a timed sequence of communications between verifiably separated sites for both Alice and Bob.) Roughly, the impossibility arises because at any step in the protocol where either Alice or Bob is required to make a determinate choice (perform a measurement on a particle in the quantum channel, choose randomly and perhaps conditionally between a set of alternative actions to be implemented on the particle in the quantum channel, etc.), the choice can be delayed by entangling one or more ‘ancilla’ particles with the channel particle in an appropriate way. By suitable operations on the ancillas, the channel particle can be ‘steered’ so that this cheating strategy is undetectable. In effect, if Bob can obtain no information about the bit in the safe, then entanglement will allow Alice to ‘steer’ the bit to either 0 or 1 at will.
5. Quantum Computation

Quantum information can be processed, but the accessibility of this information is limited by the Holevo bound (mentioned in Section 3). David Deutsch (1985) first showed how to exploit quantum entanglement to perform a computational task that is impossible for a classical computer. Suppose we have a black box or oracle that evaluates a function f. The arguments of f (inputs) are either 0 or 1. The values (outputs) of f (which are also 0 or 1) are either the same for both arguments (in which case f is constant), or different for the two arguments (in which case f is said to be ‘balanced’). We are interested in determining whether f is constant or balanced. Now, classically, the only way to do this is to run the black box or query the oracle twice, for both arguments 0 and 1, and to pass the values (outputs of f) to a circuit that determines whether they are the same (for ‘constant’) or different (for ‘balanced’). Deutsch showed that if we use quantum states and quantum gates to store and process information, then we can determine whether f is constant or balanced in one evaluation of the function f. The trick is to design the circuit (the sequence of gates) to produce the answer to a global question about the function (‘constant’ or ‘balanced’) in an output qubit register that can then be read out or measured.

Consider again the quantum CNOT gate, with two orthogonal qubits |0> and |1> as possible inputs for the control, and |0> as the input for the target. One can think of the input control and output target qubits, respectively, as the argument and associated value of a function. This CNOT function associates the value 0 with the argument 0 and the value 1 with the argument 1. For a linear superposition of the orthogonal qubits with equal coefficients as input to the control, represented as |0> + |1> (ignoring the coefficients, for simplicity), and the qubit |0> as the input to the target, the output is the entangled state |0>|0> + |1>|1>, a linear superposition in which the first term represents the argument 0 and associated value (0) of the CNOT function, and the second term represents the argument 1 and associated value (1) of the CNOT function. The entangled state represents all possible arguments and corresponding values of the function as a linear superposition, but this information is not accessible. What can be shown to be accessible, by a suitable choice of quantum gates, is information about whether or not the function has certain global properties. This information is obtainable without reading out the evaluation of any individual arguments and values. (Indeed, accessing information in the entangled state about a global property of the function will typically require losing access to all information about individual arguments and values.)

The situation is analogous for Deutsch's function f. Here the output of f can be represented as either |0>|0> + |1>|0> or |0>|1> + |1>|1> (in the ‘constant’ case), or |0>|0> + |1>|1> or |0>|1> + |1>|0> (in the ‘balanced’ case). The two entangled states in the ‘constant’ case are orthogonal in the 4-dimensional two-qubit state space and span a plane. Call this the ‘constant’ plane. Similarly, the two entangled states in the ‘balanced’ case span a plane, the ‘balanced’ plane. These planes are orthogonal in the 4-dimensional state space, except for an overlap: a line, representing a (non-entangled) two-qubit state. It is therefore possible to design a measurement to distinguish the two global properties of f, ‘constant’ or ‘balanced,’ with a certain probability (actually, 1/2) of failure, when the measurement yields an outcome corresponding to the overlap state, which is common to the two cases. Nevertheless, only one query of the function is required when the measurement succeeds in identifying the global property. With a judicious choice of quantum gates, it is even possible to design a quantum circuit that always succeeds in distinguishing the two cases in one run.
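The always-successful circuit is Deutsch's algorithm in its standard form, which can be simulated directly (a sketch using numpy; the oracle construction and function names are our own illustration): one query of f suffices to read ‘constant’ or ‘balanced’ off the first qubit:

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard gate

def U_f(f):
    """Oracle matrix for |x>|y> -> |x>|y XOR f(x)> on two qubits."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
    return U

def deutsch(f):
    """Decide whether f: {0,1} -> {0,1} is constant or balanced
    with a single application of the oracle U_f."""
    psi = np.kron([1.0, 0.0], [0.0, 1.0])   # start in |0>|1>
    psi = np.kron(H, H) @ psi               # superpose both arguments
    psi = U_f(f) @ psi                      # ONE query of the function
    psi = np.kron(H, np.eye(2)) @ psi       # interfere the two branches
    p0 = psi[0] ** 2 + psi[1] ** 2          # P(first qubit reads 0)
    return 'constant' if p0 > 0.5 else 'balanced'
```

The measurement outcome depends only on the global property of f; no individual value f(0) or f(1) is ever read out.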

Deutsch's example shows how quantum information, and quantum entanglement, can be exploited to compute a global property of a function in one step that would take two steps classically. While Deutsch's problem is rather trivial, there now exist several quantum algorithms with interesting applications, notably Shor's factorization algorithm for factoring large composite integers in polynomial time (with direct application to ‘public key’ cryptography, a widely used classical cryptographic scheme) and Grover's database search algorithm. Shor's algorithm achieves an exponential speed-up over any known classical algorithm. For algorithms that are allowed access to oracles (whose internal structure is not considered), the speed-up can be shown to be exponential over any classical algorithm in some cases, e.g., Simon's algorithm. See Nielsen and Chuang 2000, Barenco's article “Quantum Computation: An Introduction” in Lo, Popescu, and Spiller 1998, Bub 2006 (Section 6), as well as the entry on quantum computing.

Note that there is currently no proof that a quantum algorithm can solve an NP-complete problem in polynomial time (the factorization problem is not NP-complete), so the efficiency of quantum computers relative to classical computers might turn out to be illusory. If there is indeed a speed-up, it would seem to be due to the phenomenon of entanglement. The amount of information required to describe a general entangled state of n qubits grows exponentially with n. The state space (Hilbert space) has 2^n dimensions, so a general entangled state is a superposition of 2^n n-qubit states. In classical mechanics there are no entangled states: a general n-bit composite system can be described with just n times the amount of information required to describe a single bit system. So the classical simulation of a quantum process would involve an exponential increase in the classical informational resource required to represent the quantum state, as the number of qubits that become entangled in the evolution grows linearly, and there would be a corresponding exponential slowdown in calculating the evolution, compared to the actual quantum computation performed naturally by the system. Nevertheless, there is no consensus in the literature as to what exactly explains the apparent speed-up. For a discussion, see Bub 2007, 2010.
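The counting argument amounts to a one-line calculation (an illustrative sketch; the function names are ours):

```python
def amplitudes_for_qubits(n):
    """A general (possibly entangled) n-qubit state is a superposition of
    2**n basis states, so its classical description needs 2**n amplitudes."""
    return 2 ** n

def values_for_classical_bits(n):
    """A classical n-bit state involves no entanglement: n bit values suffice."""
    return n
```

For instance, 300 entangled qubits would require more amplitudes (about 2 x 10^90) than there are atoms in the observable universe, while 300 classical bits are specified by just 300 values.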
6. Interpretative Remarks

Deutsch (1997) has argued that the exponential speed-up in quantum computation, and in general the way a quantum system processes information, can only be properly understood within the framework of Everett's ‘many-worlds’ interpretation (see Everett's Relative-State Formulation of Quantum Mechanics and Many-Worlds Interpretation of Quantum Mechanics). The idea, roughly, is that an entangled state of the sort that arises in the quantum computation of a function, which represents a linear superposition over all possible arguments and corresponding values of the function, should be understood as something like a massively parallel classical computation, for all possible values of a function, in parallel worlds. For an insightful critique of this idea of ‘quantum parallelism’ as explanatory, see Steane 2003.

An alternative view, not much discussed in the literature in this connection, is the quantum logical approach, which emphasizes the non-Boolean structure of properties of quantum systems. (The properties of a classical system form a Boolean algebra, essentially the abstract characterization of a set-theoretic structure. This is reflected in the Boolean character of classical logic, and the Boolean gates in a classical computer.) From this perspective, the picture is entirely different. Rather than ‘computing all values of a function at once,’ a quantum algorithm achieves an exponential speed-up over a classical algorithm by avoiding the computation of any values of the function at all. A crucial difference between quantum and classical information is the possibility of selecting an exclusive disjunction, representing a global property of a function, among alternative possible disjunctions — for example, the ‘constant’ disjunction asserting that the value of the function (for both arguments) is either 0 or 1, or the ‘balanced’ disjunction asserting that the value of the function (for both arguments) is either the same as the argument or different from the argument — without determining the truth values of the disjuncts. Classically, an exclusive disjunction is true if and only if one of the disjuncts is true, so the truth values of the disjuncts are essential information classically, but redundant information in a quantum computation. In effect, Deutsch's quantum circuit achieves its speed-up by exploiting the non-Boolean structure of quantum properties to efficiently distinguish between two disjunctive properties, without determining the truth values of the relevant disjuncts (representing the association of individual arguments with corresponding function values).
The point of the procedure is to avoid the evaluation of the function in the determination of the global property, in the sense of producing a value in the range of the function for a value in its domain, and it is this feature — impossible in the Boolean logic of classical computation — that leads to the speed-up relative to classical algorithms. For some recent work by Giuntini and others on logics associated with quantum gates, see under ‘quantum computational logics’ in the Other Internet Resources. (For quantum logic not specifically in relation to quantum computation, see the entry on quantum logic and quantum probability).

Some researchers in quantum information and quantum computation have argued for an information-theoretic interpretation of quantum mechanics. In his review article on quantum computation, Andrew Steane (1998, p. 119) makes the following remark:

Historically, much of fundamental physics has been concerned with discovering the fundamental particles of nature and the equations which describe their motions and interactions. It now appears that a different programme may be equally important: to discover the ways that nature allows, and prevents, information to be expressed and manipulated, rather than particles to move.

Steane concludes his review with the following radical proposal (1998, p. 171):

To conclude with, I would like to propose a more wide-ranging theoretical task: to arrive at a set of principles like energy and momentum conservation, but which apply to information, and from which much of quantum mechanics could be derived. Two tests of such ideas would be whether the EPR-Bell correlations thus became transparent, and whether they rendered obvious the proper use of terms such as ‘measurement’ and ‘knowledge’.

In line with this proposal, Clifton, Bub, and Halvorson 2003 showed that one can derive the basic kinematic features of a quantum description of physical systems from three fundamental information-theoretic constraints:

‘no signaling,’ i.e., no information should be available in the marginal probabilities of measurement outcomes in one region about alternative choices made by an agent in a separated region
the impossibility of perfectly broadcasting the information contained in an unknown physical state (which, for pure states, amounts to ‘no cloning’)
the impossibility of communicating information so as to implement a bit commitment protocol with unconditional security

The analysis is carried out in an algebraic framework (C*-algebras) which allows a mathematically abstract characterization of a physical theory including, as special cases, all classical mechanical theories of both wave and particle varieties, and all variations on quantum theory, including quantum field theories (plus any hybrids of these theories, such as theories with superselection rules). Within this framework, the three information-theoretic constraints are shown to jointly entail three physical conditions that are taken as definitive of what it means to be a quantum theory in the most general sense, specifically that:

the algebras of observables pertaining to distinct physical systems commute (a condition usually called microcausality or kinematic independence)
any individual system's algebra of observables is noncommutative
the physical world is nonlocal, in that spacelike separated systems can occupy entangled states that persist as the systems separate

As pointed out by Barnum, Dahlsten, Leifer, and Toner 2008, in spite of the apparent generality of the Clifton-Bub-Halvorson theorem, C*-algebraic theories in finite dimensions are essentially classical theories or quantum theories with superselection rules. Barnum et al. considered a framework of generalized probabilistic theories broad enough to include not only quantum and classical mechanics, but also a wide variety of other ‘superquantum’ theories that can serve as foils, and they proved similar results in this framework; specifically, they showed that for any nonclassical ‘no signaling’ theory that does not permit entanglement between systems, there is a bit-commitment protocol that is exponentially secure in the number of systems involved. See Barrett 2007, Barnum, Barrett, Leifer, and Wilce 2007, and Barnum, Barrett, Leifer, and Wilce 2006 and 2008 (Other Internet Resources). Other researchers have considered the problem of what constraints in the class of ‘no signaling’ theories would characterize quantum theories. See Brassard 2005, van Dam 2005 (Other Internet Resources), Skrzypczyk, Brunner, and Popescu 2009 (Other Internet Resources), Pawlowski et al. 2009, Allcock et al. 2009, and Navascues and Wunderlich 2009 for interesting results along these lines. For the relation between the generalized probabilistic theory approach of Barnum et al. and the category-theoretic approach of Coecke et al., see Barnum and Wilce 2008a and 2008b (Other Internet Resources). See Brukner and Zeilinger 2002 for a different information-theoretic approach to quantum mechanics, and Fuchs 2002 (Other Internet Resources) for a radically Bayesian information-theoretic perspective. For an insightful analysis and critique of the Brukner-Zeilinger position, see Timpson 2003.
Bibliography

Allcock, J., Brunner, N., Pawlowski, M., Scarani, V. (2009) “Recovering Part of the Quantum Boundary from Information Causality,” Physical Review A, 80: 040103; arXiv e-print quant-ph/0906.3464.
Aspect, A., Grangier, P., Roger, G. (1982) “Experimental Tests of Bell's Inequalities Using Time-Varying Analyzers,” Physical Review Letters, 49: 1804–1807.
Barnum, H. (2003) “Quantum Information Processing, Operational Quantum Logic, Convexity, and the Foundations of Physics,” Studies in the History and Philosophy of Modern Physics, 34: 343–379; arXiv e-print quant-ph/0304159.
Barnum, H., Barrett, J., Leifer, M., and Wilce, A. (2007) “A Generalized No-Broadcasting Theorem,” Physical Review Letters, 99: 240501; arXiv e-print quant-ph/0707.0620.
Barnum, H., Dahlsten, O., Leifer, M., and Toner, B. (2008) “Nonclassicality Without Entanglement Enables Bit Commitment,” Proceedings of the IEEE Information Theory Workshop (ITW 2008), Porto, May 5–9, 2008, Joao Barros and Steven W. McLaughlin (eds.); arXiv e-print quant-ph/0803.1264.
Barrett, J. (2007) “Information Processing in Generalized Probabilistic Theories,” Physical Review A, 75: 032304; arXiv e-print quant-ph/0508211.
Bell, J.S. (1964) “On the Einstein-Podolsky-Rosen Paradox,” Physics, 1: 195–200.
Bennett, C.H., DiVincenzo, D.P. (2000) “Quantum Information and Computation,” Nature, 404: 247–255.
Bohr, N. (1935) “Can Quantum-Mechanical Description of Physical Reality be Considered Complete?,” Physical Review, 48: 696–702.
Born, M. (ed.) (1992) The Born-Einstein Letters, Dordrecht: Reidel.
Brassard, G. (2005) “Is Information the Key?,” Nature Physics, 1: 2–4.
Brukner, C., Zeilinger, A. (2002) “Information and Fundamental Elements of the Structure of Quantum Theory,” Festschrift for C. F. von Weizsaecker on the Occasion of his 90th Birthday, arXiv e-print quant-ph/0212084.
Bub, J. (2006) “Quantum Information and Computation,” in John Earman and Jeremy Butterfield (eds.), Philosophy of Physics (Handbook of Philosophy of Science), Amsterdam: North Holland, pp. 555–660; arXiv e-print quant-ph/0512125.
Bub, J. (2007) “Quantum Computation from a Quantum Logical Perspective,” Quantum Information and Computation, 7: 281–296.
Bub, J. (2008) “Quantum Computation and Pseudotelepathic Games,” Philosophy of Science, 75: 458–472.
Bub, J. (2010) “Quantum Computation: Where Does the Speed-Up Come from?,” in A. Bokulich and G. Jaeger (eds.), Philosophy of Quantum Information and Entanglement, Cambridge: Cambridge University Press, pp. 231–246.
Bub, J. (2010) “Quantum Probabilities: An Information-Theoretic Interpretation,” in S. Hartmann and C. Beisbart (eds.), Probabilities in Physics, Oxford: Oxford University Press.
Clifton, R., Bub, J., Halvorson, H. (2003) “Characterizing Quantum Theory in Terms of Information-Theoretic Constraints,” Foundations of Physics, 33: 1561–1591.
Deutsch, D. (1985) “Quantum Theory, the Church-Turing Principle and the Universal Quantum Computer,” Proceedings of the Royal Society (London), A400: 97–117.
Deutsch, D. (1997) The Fabric of Reality, London: Penguin.
Dieks, D. (1982) “Communication by EPR Devices,” Physics Leters A, 92: 271–272.
Einstein, A., Podolsky, B., Rosen, N. (1935) “Can Quantum-Mechanical Description of Physical Reality be Considered Complete?,” Physical Review, 47: 777–780.
Everett, H. (1957) “'Relative State' Formulation of Quantum Mechanics,” Reviews of Modern Physics, 29: 454–462.
Feynman, R. (1996) Feynman Lectures on Computation, edited by J.G. Hey and R.W. Allen, Reading, MA: Addison-Wesley Publishing Company.
Fuchs, C.A. (2001) “Quantum Foundations in the Light of Quantum Information,” in Proceedings of the NATO Advanced Research Workshop on Decoherence and its Implications in Quantum Computation and Information Transfer, A. Gonis (ed.); arXiv e-print quant-ph/0106166.
Holevo, A.S. (1973) “Statistical Problems in Quantum Physics,” in G. Murayama and J.V. Prokhorov (eds.) Proceedings of the Second Japan-USSR Symposium on Probability Theory, Berlin: Springer, pp. 104–109.
Kent, A. (1999) “Unconditionally Secure Bit Commitment” Physical Review Letters, 83: 1447–1450.
Lo, H.-K., Chau, H.F. (1997) “Is Quantum Bit Commitment Really Possible?,” Physical Review Letters, 78: 3410–3413.
Lo, H.-K., Popescu, S., Spiller, T. (1998) Introduction to Quantum Computation and Information, Singapore: World Scientific.
Mayers, D. (1997) “Unconditionally Secure Quantum Bit Commitment is Impossible,” Physical Review Letters, 78: 3414–3417.
Navascues, M. and Wunderlich, H. (2009)“A Glance Beyond the Quantum Model,” Proceedings of the Royal Society A, November 2009, 1–10; arXiv e-print quant-ph/0907.0372.
Nielsen, M.A., Chuang, I.L. (2000) Quantum Computation and Quantum Information, Cambridge: Cambridge University Press.
Pawlowski, M., Patarek, T., Kaszlikowski, D., Scarani, V., Winter, A.,and Zukowski, M. (2009) “A New Physical Principle: Informaiton Causality,” Nature, 461: 1101; arXiv e-print quant-ph/0905.229.
Schrödinger, E. (1935) “Discussion of Probability Relations Between Separated Systems,,” Proceedings of the Cambridge Philosophical Society, 31: 555–563; 32 (1936): 446–451.
Schumacher, B. (1995) “Quantum Coding,” Physical Review A, 51: 2738–2747.
Shannon, C.E., Weaver, W. (1949) The Mathematical Theory of Communication, Urbana: University of Illinois Press.
Steane, A.M. (1998) “Quantum Computing,” Reports on Progress in Physics, 61: 117–173.
Steane, A.M. (2003) “A Quantum Computer Needs Only One Universe” Studies in History and Philosophy of Modern Physics, 34B: 469–478; arXiv e-print quant-ph/0003084.
Timpson, C.G. (2003) “On a Supposed Conceptual Inadequacy of the Shannon Information in Quantum Mechanics” Studies in History and Philosophy of Modern Physics, 33: 441–468; arXiv e-print quant-ph/0112178.
van Fraassen, B. (1982) “The Charybdis of Realism: Epistemological Implications of Bell's Inequality,” Synthese, 52: 25–38.
Wootters, W.K., Zurek, W.H. (1982) “A Single Quantum Cannot be Cloned,” Nature, 299: 802–803.

Other Internet Resources
Preprints


Barnum, H., Barrett, J., Leifer, M., and Wilce, A. (2008) “Teleportation in General Probabilistic Theories.”
Barnum, H., Barrett, J., Clark, L.O., Leifer, M., Spekkens, R., Stepanik, N., Wilce, A., and Wilke, R. (2009) “Entropy and Information Causality in General Probabilistic Theories.”
Barnum, H., and Wilce, A. (2008a) “Information Processing in Convex Operational Theories.”
Barnum, H., and Wilce, A. (2008b) “Ordered Linear Spaces and Categories as Frameworks for Information-Processing Characterizations of Quantum and Classical Theory.”
Fuchs, C.A. (2002) “Quantum Mechanics as Quantum Information (and Only a Little More).”
Skrzypczyk, P., Brunner, N., and Popescu, S. (2009) “Emergence of Quantum Correlations from Non-Locality Swapping.”
Timpson, C. (2000) “Quantum Information and the Foundations of Quantum Mechanics.”
van Dam, W. (2005) “Implausible Consequences of Superstrong Nonlocality.”

Other Resources


arXiv E-print Archive for Quantum Physics.
Todd Brun's Lecture Notes in Quantum Information Processing.
John Preskill's Course on Quantum Information and Computation.
Cambridge Centre for Quantum Computation.
Quantum Information Processing at Oxford University.
Home Pages of Researchers on Quantum Information


Gathered by: Sh.Barzanjeh(shabirbarzanjeh@gmail.com)

Tuesday, July 26, 2011

Drumming to a cooler quantum beat



Colourized micrograph of NIST's aluminium drum

Physicists in the US have developed a new method to cool a tiny quantum drum, putting it into a long-lived quantum ground state. The tiny device could be used as a new way of storing quantum information or as a motion sensor. It could also help advance the field of quantum acoustics, which explores the quantum nature of mechanical vibrations.

"We are currently just at the cusp of making engineered massive objects that obey the rules of quantum mechanics, which are normally observed only at the atomic scale," explains John Teufel, a research affiliate who designed the drum and carried out the experiment with his team at the National Institute of Standards and Technology (NIST) in Colorado, US.
Quantum percussions

NIST's superconducting circuit containing a micro-drum placed on a sapphire chip
Superconducting circuit

The micro-drum is a quantum-mechanical resonator made from aluminium. It is 100 nm thick and 15 µm wide and is incorporated into a superconducting cavity in the set-up. The team cools the drum to microkelvin temperatures, bringing it to its quantum ground state with its range of motion approaching zero. In other words, the amplitude of its "beats" approaches zero. The circuit is designed so that the drum motion can influence the microwaves inside an electromagnetic cavity. The researchers created strong interactions between microwave light oscillating at 7.5 GHz and the drum vibrating at radio frequencies of 11 MHz.
Cool tool

The researchers use a technique similar to laser cooling of atoms – sideband cooling – only they use microwaves instead of laser light. The microwaves can be used to measure and control the drum vibrations, and vice versa. What is different about the Teufel team's cooling technique is that the entire apparatus is mounted in a traditional cryostat before the sideband cooling is started. "This gives us a big head-start because most laser cooling experiments begin at room temperature. Here the combination of low temperature pre-cooling and microwave sideband cooling is the niche that gives us our advantage," explains Teufel. The cryogenic cooling reduces the drum energy to about 30 quanta. Sideband cooling then reduces the drum temperature from 20 mK to below 400 µK, steadily lowering the drum energy to just one-third of one quantum. This means that for two-thirds of the time the drum is in its ground state (with no quanta) and for about one-third of the time there is some energy.
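These occupancy figures follow from the Bose–Einstein distribution, n̄ = 1/(e^(ħω/kBT) − 1). A quick sketch using the article's 11 MHz drum frequency (the physical constants are standard CODATA values; the ≈ 37 quanta this gives at 20 mK is in the same range as the ≈ 30 quoted above):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J s
KB = 1.380649e-23       # Boltzmann constant, J/K

def thermal_occupancy(freq_hz: float, temp_k: float) -> float:
    """Mean phonon number of a harmonic mode (Bose-Einstein distribution)."""
    x = HBAR * 2 * math.pi * freq_hz / (KB * temp_k)
    return 1.0 / math.expm1(x)

def temperature_for_occupancy(freq_hz: float, n: float) -> float:
    """Temperature at which a mode of the given frequency holds n quanta on average."""
    return HBAR * 2 * math.pi * freq_hz / (KB * math.log(1.0 + 1.0 / n))

print(thermal_occupancy(11e6, 20e-3))        # ≈ 37 quanta after cryogenic pre-cooling
print(temperature_for_occupancy(11e6, 1/3))  # ≈ 3.8e-4 K, i.e. ≈ 0.38 mK
```

Inverting the same formula shows that an average occupancy of one-third of a quantum corresponds to roughly 0.38 mK, consistent with the "below 400 µK" figure in the text.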
Information and sensors

The drum motion will persist for hundreds of microseconds, much longer than previously recorded. "Because of these long time scales we could potentially encode quantum information into the motion of the drum, where it can be temporarily stored before using microwave light to retrieve it," explains Teufel. This would be very useful in quantum computing. He also wants to combine the new circuit with superconducting quantum bits to create and manipulate the motion of relatively large objects at quantum scales. "For me it is the fundamental level of this research that is most exciting," he claims.


But on a more practical level, Teufel says that because the system measures the minute drum beats, it could also serve as a motion sensor – sensing minuscule position changes in quantum systems. He also points out that drum movements are universal and could easily be integrated into quantum circuits. "I would really like to exploit the 'quantumness' of this circuit," says Teufel.

The first engineered object to be coaxed into the quantum ground state was developed by researchers last year and won third place in the Physics World 2010 Breakthrough of the Year award. Compared with that first device, the NIST drum has a higher quality factor, so it can hold a beat longer, and it beats at a much lower frequency.

The research has been published in a paper in Nature.

Quantum memory works at room temperature

How to store a photon at room temperature


A quantum memory for photons that works at room temperature has been created by physicists in the UK. The breakthrough could help researchers to develop a quantum repeater, a device that would allow quantum information to be transmitted over long distances.

Quantum bits (or qubits) of information can be transmitted using photons and put to use in a number of applications, including cryptography. These schemes rely on the fact that photons can travel relatively long distances without interacting with their environment. This means that photon qubits are able, for example, to remain in entangled states with other qubits – something that is crucial for many quantum-information schemes.

However, the quantum state of a photon will be gradually changed (or degraded) due to scattering as it travels hundreds of kilometres in a medium such as air or an optical fibre. As a result, researchers are keen on developing quantum repeaters, which take in the degraded signal, store it briefly, and then re-emit a fresh signal. This way, says Ian Walmsley of the University of Oxford, "you can build up entanglement over much longer distances".
Difficult to repair

A quantum memory, which stores and re-emits photons, is the critical component of a quantum repeater. Those made so far in laboratories must be maintained at extremely cold temperatures or under vacuum conditions. They also only tend to work over very narrow wavelength ranges of light and store the qubit for very short periods of time. Walmsley and his colleagues argue that it isn't feasible to use such finicky systems in intercontinental quantum communication – these links will need to cross oceans and other remote areas, where it's difficult to send a repair person to fix a broken cryogenic or vacuum system.

Moreover, practical quantum memories should also absorb a broad range of frequencies of light and store data for periods much longer than the length of a signal pulse. Walmsley calls this combination a "key enabling step for building big networks". The broad range of frequencies means the memory can handle larger volumes of data, while a long storage time makes it easier to accumulate multiple photons with desired quantum states.

Working towards this goal, Walmsley and his team made a cloud of caesium atoms into a quantum memory that operates at an easy-to-achieve temperature of about 62 °C. Unlike previous quantum memories, the photons stored and re-emitted do not have to be tuned to a frequency that caesium electrons would like to absorb. Instead, a pulse from an infrared control laser converts the photon into a "spin wave", encoding it in the spins of the caesium electrons and nuclei.
Paint it black

Walmsley compares the cloud of caesium atoms to a pane of glass – transparent, so it allows light through. The first laser in effect paints the glass black, allowing it to absorb all the light that reaches it. However, instead of being dissipated as heat, as it would be in the darkened glass, the light that passes into the caesium cloud is stored in the spin wave.

Up to 4 µs later, a second laser pulse converts the spin wave back into a photon and makes the caesium transparent to light again. The researchers say that the caesium's 30% efficiency in absorbing and re-emitting photons could increase with more energetic pulses from the control laser, while the storage time could be improved with better shielding from stray magnetic fields, which disturb the spins in the caesium atoms.

Even at 30% efficiency, Ben Buchler of the Australian National University in Canberra calls the device "a big deal" because it absorbs a wide band of photon frequencies. Due to Heisenberg's uncertainty principle, the ultra-short single-photon pulses from today's sources don't have well defined energies, so an immediately useful quantum memory must be able to absorb a wide range of frequencies – which Buchler says high-efficiency memories can't yet do.
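The trade-off Buchler describes follows from the Fourier (time–bandwidth) limit: the shorter a pulse, the wider its spectrum must be. A minimal sketch, where the 0.441 time–bandwidth product applies to Gaussian pulses and the 300 ps duration is an illustrative assumption, not a figure from the article:

```python
def fourier_limited_bandwidth(pulse_fwhm_s: float, tbp: float = 0.441) -> float:
    """Minimum spectral width (FWHM, in Hz) of a transform-limited pulse.
    tbp is the time-bandwidth product; 0.441 is the value for Gaussian pulses."""
    return tbp / pulse_fwhm_s

# A hypothetical 300 ps single-photon pulse spans roughly 1.5 GHz of spectrum,
# so a memory must absorb at least that bandwidth to store it faithfully.
print(fourier_limited_bandwidth(300e-12))
```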
Noise not a problem

Background noise, or extra photons generated in the caesium clouds that are unrelated to the signal photons, was a major concern for room-temperature memories. "People thought that if you started using room-temperature gases in storage mode, you'd just have a lot of noise," says Walmsley.

Temperatures near absolute zero suppress these extra photons in other memories. But because the control and signal pulses in the Oxford team's set-up are far from caesium's favoured frequencies, the cloud was less susceptible to photon-producing excitations and the noise level remained small even at room temperature.

Hugues de Riedmatten of the Institute of Photonic Sciences in Barcelona, Spain, says that the researchers showed that the remaining noise is fundamental to the system, not caused by their set-up. If improvements cannot further reduce the noise, it will be challenging to maintain the integrity of the signal across a large, complex network, he explains.

Nevertheless, he says, "This approach is potentially very interesting because it may lead to a quantum memory for single photon qubits at room temperature, which would be a great achievement for quantum-information science."

The work is described in a paper to be published in Physical Review Letters and a preprint on arXiv.


Friday, March 25, 2011

Quantum probe beats Heisenberg limit



A group of physicists in Spain has shown how to make a quantum measurement that overcomes a limit related to Werner Heisenberg's uncertainty principle. The researchers confirmed a theoretical prediction of how to beat the Heisenberg limit by using interacting photons to measure atomic spin, and they say that their approach could lead to more sensitive searches for the ripples in space–time known as gravitational waves and perhaps also to improved brain imaging.

The standard limit on the precision with which a quantum measurement can be carried out is due to the statistical error associated with counting discrete particles rather than continuous quantities. So, for example, when measuring the phase difference between the waves sent down two arms of an interferometer, the error in this quantity will scale with the square root of the total number of photons measured, N. Since the signal scales with N, the signal-to-noise ratio scales as N^(1/2). Put another way, the sensitivity of the measurement – the minimum signal that can be measured with a given level of noise – scales as 1/N^(1/2).

It is possible to improve on this scaling, however, by entangling the photons, because this correlates what would otherwise be independent sources of noise from the individual particles. Such entanglement allows measurements to approach the so-called Heisenberg limit, which means that sensitivity scales with 1/N. Until recently it was thought that this scaling represented an absolute limit on the sensitivity of quantum measurements.
Caught in a trap

However, in 2007 a group led by Carlton Caves at the University of New Mexico in the US predicted that the Heisenberg limit could be beaten by introducing nonlinear interactions between the measuring particles. That prediction has now been shown to be true, thanks to an experiment carried out by Morgan Mitchell and colleagues at the Institute of Photonic Sciences in Barcelona. Mitchell's group fired laser pulses into a sample of ultracold rubidium atoms held in an optical trap and measured how the atoms' spin angular momentum caused the polarization axis of the photons to rotate.

In a linear measurement, each photon would interact separately with the atoms, resulting in a relatively weak signal. But what the researchers did was to carry out nonlinear measurements, ramping up the intensity of the laser pulses enough so that each photon, as well as registering the magnetic state of an atom, also altered the electronic structure of that atom. This in turn left its mark on the polarization of the next photon, so amplifying the signal. "We have a signal that is not dependent just on the thing we are aiming at, but also on what we send in," explains team member Mario Napolitano.

According to Napolitano, it wasn't clear that a signal could in practice be amplified in this way, because it was reckoned that the nonlinearity would increase the noise as well as the signal. But his team was able to tailor the nonlinearity accordingly, by concentrating the interaction between atoms and photons into a very tiny region of space and by tuning the frequency of the laser very precisely to match the atoms' electronic structure. Then, by measuring the rotation in the photons' polarization with an interferometer, measuring the noise and the number of photons, and repeating the process for different photon numbers, the researchers were able to show that the sensitivity scales with photon number better than the Heisenberg limit. In fact, they achieved a sensitivity that scaled as 1/N^(3/2).
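The practical meaning of these scalings can be seen by comparing the phase uncertainty δφ ∝ N^(−k) for the three exponents. The photon number below is an illustrative choice, not a figure from the experiment:

```python
def phase_uncertainty(n_photons: float, exponent: float) -> float:
    """Sensitivity scaling delta-phi ~ N**(-exponent):
    1/2 = shot-noise limit, 1 = Heisenberg limit, 3/2 = nonlinear scheme."""
    return n_photons ** (-exponent)

N = 1e6  # hypothetical photon number
shot = phase_uncertainty(N, 0.5)     # 1e-3
heisenberg = phase_uncertainty(N, 1.0)  # 1e-6
nonlinear = phase_uncertainty(N, 1.5)   # 1e-9

# The nonlinear scheme beats the shot-noise limit by a factor of N itself.
print(shot / nonlinear)
```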
Clocks and brains could benefit

Napolitano is keen to point out that this result does not imply that the Heisenberg uncertainty principle is wrong, but rather it shows that we do not properly understand how to scale that principle up to multiple-particle systems. He also believes that the work could ultimately have significant practical applications, such as improving atomic clocks, given that such devices rely on interferometers. What's more, several research groups are investigating the possibility of measuring electrical changes in the brain by using light to probe the magnetic properties of atoms placed close to the brain, and the latest work could enhance this technique.

Jonathan Dowling, a theoretical physicist at Louisiana State University in the US, says that the latest work could also help in the search for gravitational waves. Researchers hope to register gravitational waves' distortion of space time by measuring the difference in path length experienced by laser beams travelling in the two orthogonal pipes of an interferometer. Dowling says that if the American LIGO detector could operate with a sensitivity that scales as 1/N^(3/2) rather than as 1/N^(1/2) then either its sensitivity could be greatly increased or its laser power enormously reduced, which would avoid potential heating and deformation of the facilities' optics. "This opens up a whole new ball game in nonlinear interferometry," he adds.

However, Barry Sanders, a quantum physicist at the University of Calgary in Canada, urges caution. "The experiment demonstrates that the Heisenberg limit can be beaten in the real world," he says. "But practical applications are not likely in the near future because of the technical challenges that need to be overcome, especially noise. We are still exploring the basic physics of using quantum resources for precise measurements."

The research is published in Nature.
Ref: PhysicsWorld.com


Tuesday, March 8, 2011

Quantum computers a step closer to reality


In recent years, quantum computers have lost some of their lustre. However, a new quantum algorithm, which shows how a quantum computer could be used to simulate a complex system of interacting particles, raises hopes that some of the barriers blocking the wider application of quantum computing could soon be overcome.

The study, presented in the journal Nature, was partly supported by the EU through the QUERG ('Quantum entanglement and the renormalization group') and QUEVADIS ('Quantum engineering via dissipation') projects. QUERG clinched more than EUR 1.2 million from the European Research Council (ERC) under the Ideas Programme of the Seventh Framework Programme (FP7), while QUEVADIS has been allocated EUR 10 million under FP7's 'Information and communication technologies' Theme.


Quantum technology exploits the weird properties of matter at extremely small scales. Where a bit in a classical computer can represent either a '1' or a '0,' a quantum bit - or qubit - can represent '1' and '0' at the same time. Two qubits can represent four values simultaneously, three qubits eight, and so on.
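The doubling described above is easy to make concrete: an n-qubit state vector carries 2^n amplitudes. A minimal pure-Python sketch (not tied to any real quantum-computing library) that puts n qubits into an equal superposition over all 2^n bit strings:

```python
import math

def hadamard_all(n_qubits: int) -> list:
    """Apply a Hadamard gate to each of n qubits, starting from |00...0>.
    The state vector needs 2**n amplitudes - the exponential growth that
    lets n qubits represent all 2**n bit strings at once."""
    dim = 2 ** n_qubits
    state = [0.0] * dim
    state[0] = 1.0  # the all-zeros basis state
    for q in range(n_qubits):
        mask = 1 << q
        new = [0.0] * dim
        for i in range(dim):
            if i & mask == 0:  # pair up basis states differing in qubit q
                a, b = state[i], state[i | mask]
                new[i] = (a + b) / math.sqrt(2)
                new[i | mask] = (a - b) / math.sqrt(2)
        state = new
    return state

for n in (1, 2, 3):
    print(n, len(hadamard_all(n)))  # 2, 4 and 8 amplitudes respectively
```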

Under the right circumstances, performing computations with quantum bits is the equivalent of carrying out multiple classical computations in parallel. But the right circumstances are much rarer than was first anticipated by scientists.

'The original motivation to build a quantum computer came from Richard Feynman, who imagined a machine capable of simulating generic quantum mechanical systems - a task that is believed to be intractable for classical computers,' the researchers write.

Over the past decade, quantum computers with some 12 or 16 qubits have been built in the laboratory; but quantum computation is such a young field, and the physics of it are so counterintuitive, that researchers are still developing the theoretical tools for thinking about it.

To better understand the physics of a quantum system of interacting particles, the researchers, from Austria, Canada and Germany, tried to work out how the changes a quantum system undergoes could be reproduced on a universal quantum computer. To do this, they looked for a quantum version of the classical Metropolis algorithm.

Named after the physicist Nicholas Metropolis, who was part of the group that came up with it, the Metropolis algorithm appeared in 1953 but didn't find practical use until the first computers arrived. The classical version of the Metropolis algorithm used stochastic maps that converged (over many iterations) to the equilibrium state.
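A minimal version of that classical algorithm, here targeting a standard normal distribution purely for illustration:

```python
import math
import random

def metropolis_normal(n_samples: int, step: float = 1.0, seed: int = 0) -> list:
    """Random-walk Metropolis sampling of a standard normal distribution.
    Each proposed move is accepted with probability min(1, p(x')/p(x));
    over many iterations the chain converges to the target (equilibrium)
    distribution, exactly as the 1953 algorithm prescribes."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.uniform(-step, step)
        # log of the acceptance ratio for p(x) proportional to exp(-x**2 / 2)
        log_ratio = (x * x - proposal * proposal) / 2.0
        if log_ratio >= 0 or rng.random() < math.exp(log_ratio):
            x = proposal
        samples.append(x)
    return samples

chain = metropolis_normal(20000)[1000:]  # discard burn-in
mean = sum(chain) / len(chain)
var = sum((s - mean) ** 2 for s in chain) / len(chain)
print(mean, var)  # close to the target's mean 0 and variance 1
```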

For the quantum version of the Metropolis algorithm, the team used completely positive maps of probability amplitudes instead; although this did introduce a few problems along the way, notably the introduction of quantum phase transitions that may lead to inaccurate computations.

Nonetheless, the implementation of the new quantum algorithm could have far-reaching applications in the fields of chemistry, condensed matter and high energy physics, where until today the Schrödinger equation remains unsolved for complex systems of many interacting particles.

'Even though an implementation of this algorithm for full-scale quantum many-body problems may be out of reach with today's technological means, the algorithm is scalable to system sizes that are interesting for actual physical simulations,' claim the researchers.

Document of reference:
Temme, K., et al. (2011) "Quantum Metropolis sampling," Nature, 471: 87–90. DOI: 10.1038/nature09770.


Thursday, February 24, 2011

Invisibility (IOP/Topic of the moment)

After another news story about a “Harry Potter invisibility cloak”, we take a look at the science behind metamaterials.
A metamaterial cloak bends light around an object


Invisibility has long been employed in works of science fiction and fantasy, from “cloaking devices” on spaceships in the various Star Trek series to Harry Potter’s magic cloak. But physicists are beginning to think they can actually make devices with just these properties.

To achieve the feat of “cloaking” an object, they have developed what are known as “metamaterials”, some of which can bend electromagnetic radiation, such as light, around an object, giving the appearance that it isn’t there at all.

The first examples only worked with long-wavelength radiation such as microwaves.

A small device that made tiny objects invisible to near-infrared radiation, and worked in three dimensions, was unveiled by physicists from the UK and Germany earlier this year.

Its creators claimed there was nothing stopping them from scaling their invention up to hide larger objects from visible light – although others have pointed out a flaw in their design.

Now, researchers at Boston University and Tufts University claim that they have come up with an invisibility cloak that works within the terahertz band – the radiation between infrared and radio wavelengths – but could be modified to work with visible light. Intriguingly, it is made out of silk.

Metamaterials
Such invisibility cloaks rely on metamaterials, which are a class of material engineered to produce properties that don’t occur naturally.

Light is electromagnetic radiation, made up of perpendicular vibrations of electric and magnetic fields. Natural materials usually affect only the electric component – this is what underlies the optics we are all familiar with, such as ordinary refraction.

But metamaterials can affect the magnetic component too, expanding the range of interactions that are possible.

The metamaterials used in attempts to make invisibility cloaks are made up of a lattice with the spacing between elements less than the wavelength of the light we wish to ‘bend’.

The silk-based cloak recently announced uses “split-ring resonators” – concentric pairs of rings with splits at opposite ends. Initially, 10,000 gold resonators were attached to a one-centimetre-square piece of silk.

As silk is not rejected by the human body, it is thought that they could be used to coat internal organs so that surgeons can easily see what lies behind them.
A metamaterial array – split-ring resonators mounted on fiberglass circuit board. Testing on an array like this showed negative refraction of microwave radiation
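The negative refraction measured on arrays like this can be illustrated with Snell's law, n₁ sin θ₁ = n₂ sin θ₂: for a negative-index material the refracted ray emerges on the same side of the normal as the incident ray. The index values below are illustrative choices, not measured values:

```python
import math

def refraction_angle_deg(theta1_deg: float, n1: float, n2: float) -> float:
    """Snell's law: return the refraction angle in degrees. A negative n2
    gives a negative angle - the ray bends to the *same* side of the normal,
    the signature behaviour of a negative-index metamaterial."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    return math.degrees(math.asin(s))

print(refraction_angle_deg(30.0, 1.0, 1.5))   # ≈ +19.5° in ordinary glass
print(refraction_angle_deg(30.0, 1.0, -1.0))  # −30°: refracted on the opposite side
```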



Superlens
Another use for metamaterials, potentially with greater scientific applications, is in building a superlens.

Ordinary lenses are restricted by their “diffraction limit”. As David R Smith of the University of California, San Diego, explained in Physics World, this means that “the best resolution that is possible corresponds to about half of the incident wavelength of the light that is used to produce the image”.
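Smith's rule of thumb amounts to a one-line estimate (taking “about half of the incident wavelength” literally; the 500 nm green-light example is ours, not his):

```python
def diffraction_limit_m(wavelength_m: float) -> float:
    """Abbe-style estimate: best resolution is about half the wavelength."""
    return wavelength_m / 2.0

# 500 nm (green) light can resolve features no smaller than about 250 nm,
# far larger than a virus or a strand of DNA - hence the appeal of a superlens.
print(diffraction_limit_m(500e-9))
```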

In 2000, Sir John Pendry of Imperial College London suggested that a metamaterial with a negative refractive index might get around problems such as wave decay and allow imaging of objects only nanometers in size.

Among the first practical applications would likely be using metamaterial lenses to view live viruses and maybe even bits of DNA. In 2005, a thin slab of silver was used to image objects just 60 nm across – just over one hundredth the size of a red blood cell.
