The Decoherence Interpretation of Quantum Mechanics

from “The New Quantum Universe” by Hey and Walters (2009)

A less extravagant (than the Copenhagen and Many Worlds Interpretations) and rather more mundane attempt to solve the measurement problem goes by the name of “decoherence”. This approach argues that quantum systems can never be totally isolated from the larger environment and that Schrödinger’s equation must be applied not only to the quantum system but also to the coupled quantum environment. In real life, the “coherence” of a quantum state – the delicate phase relations between the different parts of a quantum superposition – is rapidly affected by interactions with the rest of the world outside the quantum system. Wojciech Zurek is one of the most prominent advocates of this “decoherence” approach to the measurement problem, and he speaks of the quantum coherence as “leaking out” into the environment. Zurek claims that recent years have seen a growing consensus that it is interactions of quantum systems with the environment that randomize the phases of quantum superpositions. All we have left is an ordinary non-quantum choice between states with classical probabilities and no funny interference effects. This seems a very prosaic end to the quantum measurement problem! How does this come about? Does decoherence by the environment really supply an answer to all the problems? Let us look at an experiment that claims to see decoherence of “Schrödinger cat” states in action.

Serge Haroche and Jean-Michel Raimond, working in Paris with their research group, have recently performed some exciting experiments that give support to this decoherence picture. There are three different parts to an experiment that can all interact – the quantum system, the “classical” measurement apparatus, and the environment. In their experiment the quantum system consists of an atom that can be prepared in one of two states. They measure the quantum state of the atom by injecting the atom into a cavity and using the electromagnetic field of the “cavity” as a classical “pointer.” What happens if we prepare the atom in a quantum superposition of the two states? If we treat the cavity as a second quantum system in its own right, we find that the supposedly classical pointer is now predicted to be in a “Schrödinger cat” state – a quantum superposition of two classical states of the pointer. Schrödinger’s thought experiment just highlighted the peculiarity of this situation by using his cat as a classical pointer. How do we escape from this apparent paradox? According to the decoherence picture, we must include the unavoidable coupling of the pointer to the environment. The pointer – or cavity – is under constant bombardment from random photons, air molecules and so on that constitute the “environment.” Models of this random process as a third quantum system show that all phase information between the two original atomic states with their corresponding pointer positions is very rapidly lost. For the usual classical pointer fields with many photons, this decoherence is predicted to take place in an immeasurably short time. Remarkably, by using pointer cavity fields consisting of only a few photons, Haroche and Raimond have been able to observe and measure the decoherence time of this system.
They do this by sending a second atom into the cavity at varying times after the first atom and measuring interference effects that depend on the continued coherence of the wavefunction of the first atom. By observing how fast these interference effects fall off with the time delay between the traversals through the cavity of the first and second atoms, they claim to have “caught decoherence in the act”!
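In the standard cavity-QED notation (a sketch of my own, not the authors’ formulas), the atom–cavity “cat” state produced before decoherence can be written as

$$|\Psi\rangle = \tfrac{1}{\sqrt{2}}\big(|e\rangle\,|\alpha_e\rangle + |g\rangle\,|\alpha_g\rangle\big),$$

where $|e\rangle$ and $|g\rangle$ are the two atomic states and $|\alpha_e\rangle$, $|\alpha_g\rangle$ are the corresponding coherent “pointer” fields of the cavity. Coupling to the environment destroys the fixed phase relation between the two branches on the measured decoherence timescale.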

Einstein’s problem with the Moon can be “explained” by using a similar decoherence argument. The Moon is not an inert system – not only are its individual molecules constantly interacting with their neighbors but also its surface is under constant bombardment by particles and radiation, mainly from the Sun. The coherence of any Schrödinger cat state involving the Moon would rapidly be destroyed by these constant interactions. According to such decoherence arguments, we can rest assured that the Moon is really there after all, even when we are not looking at it. Bombardment by solar photons is enough to constitute a measurement and to destroy any quantum coherence.

Would these decoherence arguments have satisfied John Bell as an explanation of the measurement problem? Probably not! We have described not only the quantum system under observation but also the measuring apparatus as a quantum system. The quantum wavefunction for the combined system will be in a superposition of states corresponding to different classical states of the measuring apparatus, as in the experiment of Haroche and Raimond. The decoherence argument says we must include the environment as a third quantum system interacting with our measuring apparatus. As a result, phase randomization rapidly sets in and the quantum superposition is effectively reduced to a sum of different possible outcomes with classical probabilities. Bell had two problems with this approach. Firstly, all quantum states – for system, measuring apparatus, and environment – evolve according to the Schrödinger equation. It is mathematically impossible for such evolution to turn a coherent quantum superposition into an incoherent probabilistic sum. Although it is certainly true that the particular measurements one usually chooses to make display little or no quantum coherence, Bell argues that there is nothing “in principle” to stop us considering different types of measurements for which this will not be true. As Bell has said:
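In density-matrix language (a standard textbook sketch, not the book’s own notation), this phase randomization shows up as the decay of the off-diagonal “interference” terms of the combined state:

$$\rho(t) = \frac{1}{2}\begin{pmatrix} 1 & e^{-t/\tau_d} \\ e^{-t/\tau_d} & 1 \end{pmatrix} \;\longrightarrow\; \frac{1}{2}\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \quad (t \gg \tau_d),$$

where $\tau_d$ is the decoherence time. Note that under pure Schrödinger evolution the off-diagonal terms become exponentially small but never exactly zero, which is precisely the opening Bell exploits in his first objection.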

“So long as nothing, in principle, forbids consideration of such arbitrarily complicated observables, it is not permitted to speak of wave packet reduction. While for any given observable one can find a time for which the unwanted interference is as small as you like, for any given time one can find an observable for which it is as big as you do *not* like.”

In Bell’s view, any mechanism for the collapse should also be applicable to small systems and should not be dependent on “the laws of large numbers.” His second problem concerned the actual measurement itself. Even if one accepts that decoherence reduces the problem to a probabilistic choice between outcomes, nowhere does decoherence say how any particular outcome is achieved. Bell did not disagree about the practicality of measurements in quantum mechanics, but he felt strongly that unless we know “exactly when and how it [wavefunction reduction] takes over from the Schrödinger equation, we do not have an exact and unambiguous formulation of our most fundamental physical theory.”

## 10 comments:

Hi Steve,

You know what I think, which is that decoherence is the right idea yet applied to the wrong theory with respect to its ontological foundations. This would have been the same perspective Bell would have had, which for some reason the authors seem to have either ignored or been ignorant of. In either case it indicates to me they truly don’t have a full grasp of the subject.

Best,

Phil

Interesting. I think biology follows this view. A living body is in a coherent state, and we need decoherence, as interactions with the surroundings, to stay grounded, here and now. The living state is like a new equilibrium state, governed by entropy, like an allostatic phase, as seen in addiction for instance. Homeostasis tries to bring down the entropy; environmental noise increases the tension.

What exactly makes up the interaction from the environment? It is not just the electromagnetic force.

Thanks for the third quantum state/well.

Note, coherence requires a many-degrees-of-freedom state. Coherence can also be created as a new state.

An interesting guy. http://www.cirs.net/researchers/researchers.php?id=811

for instance, L. Davidovich, N. Zagury, M. Brune, J.M. Raimond and S. Haroche, "Teleportation of an atomic state between two cavities using nonlocal microwave fields", Phys. Rev. A 50, R895 (1994).

http://www.nature.com/nature/journal/v455/n7212/abs/nature07288.html

Regarding the argument quoted here in particular, we find this, for example:

"As a result, phase randomization rapidly sets in and the quantum superposition is effectively reduced to a sum of different possible outcomes with classical probabilities."

Well, *no*. The quantum superposition is only reduced to the disordered wave distribution that *would show* classical probabilities if something could intervene to make any kind of probabilities at all out of whatever kind of waves we start with. Those possible outcomes (plural), however disordered the configurations representing them, still need something *further* to be picked out. See the sneak into the circular argument up there?

Interference just lets us *prove* that we are dealing with wave states. Its not being available in a given case wouldn't by itself lead to conversion of "distributions" into discrete localizations or outcomes.

And, if both states continue as in MWI, that violates conservation laws since (surprise!) each wave packet also represents a given amount of mass-energy, not just some abstract notion of location per se etc.

BTW, I need to read the original, but it looks like Serge Haroche and Jean-Michel Raimond are just measuring the decoherence itself, not showing or arguing why the decoherence somehow makes all but one of the superposed states go away. They still have to measure to find that, which of course is just the intrusion of a "measurement device/process" all over again. The problem is how to interpret their result, no? Well, I have a specific way to recover the elements of superposition *after* supposed decoherence, which puts DI out to pasture.

Well, I think I need to break up my giant "original comment" which keeps failing here. So:

Many of the participants here know that I strongly disapprove of the idea that decoherence actually reduces the states to a mix of the alternatives. (Decoherence: dephasing of coherent phase relations between quantum states, each of which represents a possible outcome of measurement - but it's not that simple anyway.)

As I have said (more later): the DI is basically a circular argument that never justifies the probabilistic "classical behavior" from the disordered wave states. Think: we have nice "orderly waves" coming from a double slit, which make a varying amplitude pattern on a screen. If we could see or measure amplitudes directly, then we'd literally see the pattern as such, not in terms of "hits" when, say, one atom grabs up some energy. There's nothing intrinsic to explain why they don't just stay "waves" in that distribution, as would e.g. classical EM waves.

Well, then we introduce decoherence by e.g. a varying phase change at one of the slits (akin to what the "confuser" does in my experiment [reminder to send JLN a copy, delayed since I stewed over the abstract etc., sorry]). Well, that means that the wave amplitude distribution at the screen is disorderly – now it doesn't make those classic rippled variations called "interference." But there's nothing logically intrinsic to that condition that undoes its being *a spread out region of amplitudes* and makes it localize, with the mass-energy of an entire previously localized particle being "right here" and not elsewhere or "all over the place."
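This double-slit picture can be sketched numerically (my own illustration, not the proposed experiment): averaging the screen intensity over a random phase kick at one slit washes out the fringes, yet the amplitude stays spread over the whole screen rather than localizing anywhere.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 201)    # screen positions (arbitrary units)
phase_diff = 20 * x            # path-length phase difference across the screen

# Coherent case: fixed relative phase between the slits -> interference fringes
coherent = np.abs(1 + np.exp(1j * phase_diff)) ** 2

# "Decohered" case: average the intensity over a random phase kick at one slit
kicks = rng.uniform(0, 2 * np.pi, 5000)
decohered = np.mean(
    np.abs(np.exp(1j * kicks[:, None]) + np.exp(1j * phase_diff)) ** 2, axis=0)

print(coherent.min(), coherent.max())    # fringes swing from ~0 up to ~4
print(decohered.min(), decohered.max())  # flat: ~2 everywhere, still spread out
```

The decohered intensity is nearly uniform (no fringes), but it is still a distribution of wave amplitude over the whole screen; nothing in the averaging itself picks out one localized "hit."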

...

The DI cheats by having classical probabilities already put in "by hand" (if you know what that phrase implies), so sure, you can find classical *probabilities* when changing conditions change the probabilities. But the problem was moving from coherent or incoherent "waves" (amplitudes spread out over space, of whatever relationship) to coherent-type statistics or classical statistics. IOW, the big division and problem is between "amplitudes" versus "statistics" per se, not "orderly" versus "disorderly" waves and the supposed connection of either to statistics. (And mathematically, waves do *not* have any intrinsic connection to statistics, one reason why MWI in concert with DI has so much trouble *explaining* the Born rule probabilities, as opposed to simply having them in hand as fodder for the circular argument.)

Just to see the big contradiction: if decoherence is why we can see "classical outcomes" like one detector firing and not both, then how come we still see specific localizations when the states *are* still coherent – as in the quantum operation of the double-slit (DS) experiment, where the interference pattern is still built up out of the distribution of discrete hits anyway! The whole DI is a category mistake, a semantic confusion; it isn't even a legitimate "worthy effort."

But of course, logical arguments aren't as good in physics as real experiments, so I proposed a way to test DI. It pits the predictions of standard QM (follow wave amplitudes all the way, until some final detectors must reduce to Born-rule (BR) statistics) against the DI claim, as in Chad Orzel's illustration, that decoherence reduces coherent superpositions into statistical mixtures.

See my name linked blog "Paradoxer", the FQXi article linked from there; and we await the appearance of a more concise and improved piece in "The Quantum Times."

Neil, thanks for all your great comments.
