How is the double slit experiment modeled in contemporary physical theories?

An important contemporary approach to the two-slit situation you ask about is decoherence. Many contributors to this forum are fans of decoherence; I am not, but it is very important and worth attention. I couldn't help noticing that you still had queries about it even after the explanations of these «fans».

Since one should always be able to include more of the outside world in the box of quantum analysis, that is, always be able to push the boundary out, you are basically asking whether any progress has been made by including the slits and detectors in the unitary quantum picture, or whether instead any progress has been made by changing the unitary picture even a little. The decoherence approach does not change the unitarity of the evolution, and neither does QFT. (Some have wondered whether gravity or other non-linearities will, but we will not go into that right now.)

Brief sketch of decoherence

For simplicity, assume there are only two ammeters behind the slits. As always with unitary evolutions, the electron, after being diffracted by the slits, and when it reaches the plane of the ammeters, is in a superposition of two states belonging to the locations of the two ammeters on that plane (normally it would be a superposition of many states, but we can simplify here too), say $c_1\psi_1 + c_2\psi_2$. (We will simply neglect the occurrences where neither ammeter fires.) The state space of the electron is effectively $\mathbb{C}^2$.

Now, the ammeters are being modelled quantum mechanically too, so the set of two ammeters has a Hamiltonian and a Hilbert space $H_{amm}$ for the state vectors (wave functions) of the two-ammeter system. The state space for the combined system of electron-being-measured plus measurement apparatus (the two ammeters) is then $\mathbb{C}^2\otimes H_{amm}$. The decoherence approach makes more or less the same key assumption that Mott, London, Wigner, and many others have made down to today (but which I query elsewhere; see my other posts and links): whatever dictionary or correspondence there is between the macroscopic idea «ammeter 1 fires» or, in parallel, «ammeter 2 fires» and the quantum description is to be modelled by a collection of quantum states in $H_{amm}$ (this is what I usually complain about, but not here). Since we can pass to their closed span, we simplify further and assume there is one such state for each outcome; label them $\phi_1$ and $\phi_2$. (A key point in this regard will be expanded upon in a minute.)

Now, this means that $\psi_1 \otimes \phi_0$ evolves unitarily to $\psi_1 \otimes \phi_1$ (here $\phi_0$ is the initial state of the ammeter-lattice while it waits to discharge or fire), while $\psi_2 \otimes \phi_0$ evolves to $\psi_2 \otimes \phi_2$. By linearity (no progress has been made, IMHO, by tinkering with the assumption of linearity), what actually happens is that our electron and ammeters, as a combined system, evolve to $$c_1 \psi_1 \otimes \phi_1 + c_2 \psi_2 \otimes \phi_2,$$ which is an entangled state: neither the ammeters nor the electron can be considered as a separate system anymore.
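To make this last step concrete, here is a minimal numerical sketch (my own toy illustration, not taken from any paper discussed here): it builds the entangled state in the simplified case where the ammeter space is truncated to two dimensions, and checks that the electron's reduced density matrix is mixed, which is the operational meaning of "neither subsystem can be considered separately".

```python
import numpy as np

# Toy Hilbert spaces: electron in C^2, and the two-ammeter system also truncated
# to C^2 (a drastic simplification of H_amm, just to exhibit the entanglement).
psi1 = np.array([1.0, 0.0])          # electron headed for ammeter 1
psi2 = np.array([0.0, 1.0])          # electron headed for ammeter 2
phi1 = np.array([1.0, 0.0])          # "ammeter 1 fired"
phi2 = np.array([0.0, 1.0])          # "ammeter 2 fired"

c1, c2 = 1/np.sqrt(2), 1/np.sqrt(2)  # amplitudes from the diffraction pattern

# The state the unitary evolution produces, by linearity:
Psi = c1 * np.kron(psi1, phi1) + c2 * np.kron(psi2, phi2)

rho = np.outer(Psi, Psi.conj())      # density matrix of the combined pair (a pure state)
# Trace out the ammeter factor to get the electron's reduced density matrix:
rho_electron = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(np.trace(rho_electron @ rho_electron))  # purity = 0.5 < 1: the electron alone is not in a pure state
```

Nothing depends on the particular amplitudes: any $c_1, c_2$ with both nonzero gives a purity strictly below 1.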

At this point decoherence, depending on which flavour is administered, points out something undeniable: in fact there are many more degrees of freedom within the ammeters, and our simplification (which is the same as that of Wigner, EPR, and many others) overlooks something important. It is, according to this theory, important that the ammeters are at least weakly coupled to the environment. The combined system is not quite closed, so the analysis above is only approximate.

All decoherence approaches (as far as I know) use density matrices. (I would be interested in a current reference to one that works only with pure states and not density matrices.) It can be shown rigorously that this coupling with the environment leads to a further, thermodynamic-like evolution to a density matrix which is very nearly diagonal and so can be regarded as a classical (or Bayesian) probability distribution on the two states $$\psi_1 \otimes \phi_1 \quad\text{and}\quad \psi_2 \otimes \phi_2,$$ each of which is obviously a separable (product) state.
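To spell the standard claim out in a formula (the labels $\chi_1, \chi_2$ for the environment states are mine, introduced only for this sketch): after the environment has become correlated with the two outcomes, the full state is $c_1\,\psi_1\otimes\phi_1\otimes\chi_1 + c_2\,\psi_2\otimes\phi_2\otimes\chi_2$, and tracing out the environment gives the electron-plus-ammeter density matrix $$\rho \;=\; |c_1|^2\,P_{\psi_1\otimes\phi_1} \;+\; |c_2|^2\,P_{\psi_2\otimes\phi_2} \;+\; \Big(c_1\bar c_2\,\langle\chi_2|\chi_1\rangle\;|\psi_1\otimes\phi_1\rangle\langle\psi_2\otimes\phi_2| + \text{h.c.}\Big),$$ where $P_v$ denotes the projection onto the vector $v$. The decoherence claim is that the overlap $\langle\chi_2|\chi_1\rangle$ decays extremely fast, so only the first two terms survive for all practical purposes, and those two terms look like a classical probability distribution $\{|c_1|^2, |c_2|^2\}$ on the two product states.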

The Coleman--Hepp model and Bell's response

In my opinion, the grand-daddy of all decoherence theories is the so-called Coleman--Hepp model. I learned it from Bell's famous paper, in which he translates it from the language of QFT in C*-algebras into the Schroedinger and Heisenberg pictures; a freely available copy is here: http://www.mast.queensu.ca/~jjohnson/Bellagainst.html . It is not on the Los Alamos archive. Of course, Coleman and Hepp are two of the most distinguished physicists. Briefly, the criticism, which I agree with, is that a density matrix is not a state, so this is really no better from a logical or foundational point of view than the open-system approach and suffers from the same question-begging. (And that criticism is more than thirty years old now, so this is not progress.)

The Physics of these models

These models all use a kind of thermodynamics, and this is surely right: as Peter Morgan (hi Peter) said in a previous post, the possible detections are thermodynamic events, and the way to make that precise is to take some sort of limit as the number of degrees of freedom goes to infinity. But none of these models reflects, within the model, the fact that measurement is a kind of amplification; none of the thermodynamic limits involved uses negative temperature, and this is surely wrong. Feynman's opinion was that this point was decisive; see Feynman and Hibbs, Quantum Mechanics and Path Integrals, New York, 1965, p. 22. So these models do not incorporate Feynman's insight. The models of Balian et al. cited above do incorporate this, and study phase transitions induced by tiny disturbances from an unstable equilibrium, which is indeed the physics of bubble chambers and photographic emulsions. Bohr thought that the apparatus had to be classical, and decoherence models do not use a limiting procedure that introduces a classical approximation, so they do not incorporate Bohr's insight either.
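For a feel of why that limit does so much work, here is a toy numerical illustration (mine, in the spirit of the standard spin-environment models, not taken from Bell's paper or from Balian et al.): if each of $N$ environmental degrees of freedom imperfectly records which ammeter fired, with a per-mode overlap $\epsilon$ close to 1, the factor multiplying the off-diagonal (coherence) terms is $\epsilon^N$, and it is the limit $N\to\infty$ that makes it vanish.

```python
import numpy as np

# Each environment mode ends up in one of two single-mode states whose overlap is eps.
# The coherence (off-diagonal) term is multiplied by the product of all N overlaps.
eps = 0.99          # even an almost negligible per-mode disturbance...
for N in [10, 10**3, 10**6]:
    overlap = eps ** N
    print(f"N = {N:>8}: residual coherence factor ~ {overlap:.3e}")
# ...gives ~4e-5 already at N = 10^3; at N = 10^6 the true value is about 1e-4365,
# which underflows to 0.0 in double precision.
```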

There is experimental evidence that interaction with the environment induces decoherence, but not yet in a way relevant to measurement. There is also experimental evidence, from the so-called spin echoes, that mesoscopic systems can recover their coherence after losing it, which is what Wigner always assumed.


Dear sklivvz, the very same question was asked a few days ago. Quantum field theory, string theory, or any other viable theory that may supersede quantum mechanics directly reduces to non-relativistic quantum mechanics in the non-relativistic limit and changes nothing about the basic postulates of quantum mechanics.

It means that you may find the corresponding low-energy, low-speed (multi-)particle states in the Hilbert space of QFT or string theory or anything else - essentially creation operators acting on the relevant vacuum - and you may prove that the QFT or stringy Hamiltonian acts on these states exactly as the non-relativistic Hamiltonian does, plus corrections that go like positive powers of $1/c$ and may be neglected in the non-relativistic limit.
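The simplest illustration of the kind of expansion being described (a standard textbook formula, added here only as an illustration) is the energy of a single free particle: $$\sqrt{p^2c^2 + m^2c^4} \;=\; mc^2 + \frac{p^2}{2m} - \frac{p^4}{8m^3c^2} + \cdots$$ The constant $mc^2$ only shifts the zero of energy, the second term is the usual non-relativistic kinetic energy, and the remaining corrections are suppressed by powers of $1/c^2$, which is the sense in which the relativistic description reduces to the non-relativistic one.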

So nothing changes about quantum mechanics and the double-slit experiment in QFT or string theory and it's likely that those things will never change. Cheers, LM


The answer is "yes and no". Important research is being done on modelling, with a Hamiltonian, the joint interaction between a microscopic particle being measured and the macroscopic measurement apparatus doing the measuring. Allahverdyan, Balian, et al. have done the best and latest work. H.S. Green did a very stylised model long ago, and Hannabuss has been doing important work on this.
Bibliographic information for their papers, and for others by Collet, Milburn, et al. in quantum optics (there is also C. Gardiner and P. Zoller, Quantum Noise), can be found in the bibliography of my published paper Thermodynamic Limits, Non-commutative Probability, and Quantum Entanglement, available for free at http://arxiv.org/abs/quant-ph/0507017 , and in my longer paper at http://arxiv.org/abs/0705.2554

That was the "yes" part. The cited authors are important mainstream researchers.

But. It doesn't do much to change the big picture.

In particular, it does not resolve the controversy about decoherence. The work of these authors is valid whether or not there is decoherence, and they do not address the question of what would produce it or when.
Instead, they address the measurement process directly as a unitary, quantum mechanical process. The electron is still entangled with the slits and the ammeter and everything else, but they can say something about the precise way in which that entanglement is "negligible for all practical purposes". So it is important work.

More precisely, to answer the last parts of your question: it is not even a good idea to think of the "particle" as sometimes being examined in a wave picture and sometimes in a particle picture. That is just sloppy undergraduate thinking; you won't find it in the axioms of QM and you won't find it in these papers. There is no "switch between wave-like and particle-like models of the electron" in their careful analyses of the measurement process, nor in the axioms of QM. In the axioms there is a switch from using the unitary evolution of the wave function to using the axioms for a measurement process, but this has nothing to do with modelling the electron.

These cited papers succeed in modelling the measurement process as a unitary, deterministic evolution of a combined quantum system. The resulting entangled superposition of states is not the same thing as what the measurement axioms predict, because the entangled state is a superposition of quantum states whereas the "result of a measurement process" is supposed to be a probability distribution on separable states. But they can show that the difference between these two different things is practically negligible, because the "coherence" in the superposition is very, very low. If at this step you, dear reader, take the point of view of the "decoherence" crowd, apply it, and go a little further than these results do, you can then say that since the coherence is negligible, this superposition of entangled states will appear the same as a mixed state, and a mixed state is, « as we all know », a probability distribution on its different components.
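As a concrete (toy) check of that "negligible for all practical purposes" claim, again my own illustration and not taken from the cited papers: one can compare the electron's reduced density matrix obtained from the entangled superposition with the one the measurement axioms prescribe (a mixture of $\psi_1$ and $\psi_2$ with weights $|c_1|^2, |c_2|^2$). The trace distance between the two is proportional to the residual overlap of the apparatus/environment states.

```python
import numpy as np

def electron_rho(coherence, c1=1/np.sqrt(2), c2=1/np.sqrt(2)):
    """Reduced density matrix of the electron, in the {psi1, psi2} basis, when the
    apparatus/environment states have overlap <phi2|phi1> = coherence (taken real
    for simplicity; normalisation corrections of order `coherence` are ignored)."""
    return np.array([[abs(c1)**2,                c1*np.conj(c2)*coherence],
                     [np.conj(c1)*c2*coherence,  abs(c2)**2             ]])

rho_entangled = electron_rho(coherence=1e-12)   # after decoherence: tiny residual overlap
rho_measured  = electron_rho(coherence=0.0)     # what the measurement axioms prescribe

# Trace distance = (1/2) * sum of singular values of the difference.
diff = rho_entangled - rho_measured
trace_distance = 0.5 * np.sum(np.linalg.svd(diff, compute_uv=False))
print(trace_distance)   # ~ 5e-13: no feasible experiment distinguishes the two states
```

That numerical smallness is exactly what "for all practical purposes" means here; it is not, and this is the point of the next paragraph, a logical identity between the two descriptions.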

So yes, progress has been made; but J.S. Bell would never have accepted passing from a quantum superposition of states to even a diagonal density matrix, since logically these are distinct conceptions, and so the controversy continues.