Decoherence
The "decoherence program" of H. Dieter Zeh, Erich Joos, Wojciech Zurek, John Wheeler, Max Tegmark, and others has multiple aims -
  1. to show how classical physics emerges from quantum physics. They call this the "quantum to classical transition."

  2. to explain the lack of macroscopic superpositions of quantum states (e.g., Schrödinger's Cat as a superposition of live and dead cats).

  3. in particular, to identify the mechanism that suppresses ("decoheres") interference between states as something involving the "environment" beyond the system and measuring apparatus.

  4. to explain the appearance of particles following paths (they say there are no "particles," and maybe no paths).

  5. to explain the appearance of discontinuous transitions between quantum states (there are no "quantum jumps" either).

  6. to champion a "universal wave function" (as a superposition of states) that evolves in a "unitary" fashion (i.e., deterministically) according to the Schrödinger equation.

  7. to clarify and perhaps solve the measurement problem, which they define as the lack of macroscopic superpositions.

  8. to explain the "arrow of time."

  9. to revise the foundations of quantum mechanics by changing some of its assumptions, notably challenging the "collapse" of the wave function or "projection postulate."

    Decoherence theorists say that they add no new elements to quantum mechanics (such as "hidden variables"), but they do deny one of its three basic assumptions - namely, Dirac's projection postulate. This is the rule used to calculate the probabilities of the various outcomes, probabilities that are confirmed to several significant figures by the statistics of large numbers of identically prepared experiments.

    They accept (even overemphasize) Dirac's principle of superposition. Some also accept the axiom of measurement, although some of them question the link between eigenstates and eigenvalues.

The decoherence program hopes to offer insights into several other important phenomena:

  1. What Zurek calls the "einselection" (environment-induced superselection) of preferred states (the so-called "pointer states") in a measurement apparatus.

  2. The role of the observer in quantum measurements.

  3. Nonlocality and quantum entanglement (which is used to "derive" decoherence).

  4. The origin of irreversibility (by "continuous monitoring").

  5. The approach to thermal equilibrium.

The decoherence program finds the following aspects of standard quantum theory unacceptable:

  1. Quantum "jumps" between energy eigenstates.

  2. The "apparent" collapse of the wave function.

  3. In particular, explanation of the collapse as a "mere" increase of information.

  4. The "appearance" of "particles."

  5. The "inconsistent" Copenhagen Interpretation - quantum "system," classical "apparatus."

  6. The "insufficient" Ehrenfest Theorems.

Decoherence theorists admit that some problems remain to be addressed:

  1. The "problem of outcomes." Without the collapse postulate, it is not clear how definite outcomes are to be explained.

As Tegmark and Wheeler put it:

The main motivation for introducing the notion of wave-function collapse had been to explain why experiments produced specific outcomes and not strange superpositions of outcomes...it is embarrassing that nobody has provided a testable deterministic equation specifying precisely when the mysterious collapse is supposed to occur.

Some of the controversial positions in decoherence theory, including the denial of collapses and particles, come straight from the work of Erwin Schrödinger, for example in his 1952 essays "Are There Quantum Jumps?" (Part I and Part II), where he denies the existence of "particles," claiming that everything can be understood as waves.

Other sources include: Hugh Everett III and his "relative state" or "many worlds" interpretations of quantum mechanics; Eugene Wigner's article on the problem of measurement; and John Bell's reprise of Schrödinger's arguments on quantum jumps.

Decoherence advocates therefore look to other attempts to formulate quantum mechanics. Also called "interpretations," these are more often reformulations, with different basic assumptions about the foundations of quantum mechanics. Most begin from the "universal" applicability of the unitary time evolution that results from the Schrödinger wave equation. They include:

  • The de Broglie-Bohm "pilot-wave" or "hidden variables" formulation.
  • The Everett-DeWitt "relative-state" or "many worlds" formulation.
  • The Ghirardi-Rimini-Weber "spontaneous collapse" formulation.

Note that these "interpretations" are often in serious conflict with one another. Where Erwin Schrödinger thinks that waves alone can explain everything (there are no particles in his theory), David Bohm thinks that particles not only exist but that every particle has a definite position that is a "hidden parameter" of his theory. H. Dieter Zeh, the founder of decoherence, sees

one of two possibilities: a modification of the Schrödinger equation that explicitly describes a collapse (also called "spontaneous localization") or an Everett type interpretation, in which all measurement outcomes are assumed to exist in one formal superposition, but to be perceived separately as a consequence of their dynamical autonomy resulting from decoherence.
It was John Bell who called Everett's many-worlds picture "extravagant."
While this latter suggestion has been called "extravagant" (as it requires myriads of co-existing quasi-classical "worlds"), it is similar in principle to the conventional (though nontrivial) assumption, made tacitly in all classical descriptions of observation, that consciousness is localized in certain semi-stable and sufficiently complex subsystems (such as human brains or parts thereof) of a much larger external world. Occam's razor, often applied to the "other worlds", is a dangerous instrument: philosophers of the past used it to deny the existence of the interior of stars or of the back side of the moon, for example. So it appears worth mentioning at this point that environmental decoherence, derived by tracing out unobserved variables from a universal wave function, readily describes precisely the apparently observed "quantum jumps" or "collapse events."

The Information Interpretation of quantum mechanics also has explanations for the measurement problem, the arrow of time, and the emergence of adequately (i.e., statistically) determined classical objects. However, I-Phi does it while accepting the standard assumptions of orthodox quantum physics. See below.

We briefly review the standard theory of quantum mechanics and compare it to the "decoherence program," with a focus on the details of the measurement process. We divide measurement into several distinct steps, in order to clarify the supposed "measurement problem" (mostly the lack of macroscopic state superpositions) and perhaps "solve" it.


The most famous example of probability-amplitude-wave interference is the two-slit experiment. The interference is between the probability amplitudes, whose absolute squares give the probability of finding the particle at various locations on the screen behind the barrier with the two slits in it.
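
To make this concrete, here is a minimal numerical sketch in Python (the geometry and wavelength are illustrative assumptions, not data from any actual experiment): each slit contributes a complex amplitude at a screen point, and the detection probability |ψ1 + ψ2|2 contains an interference (cross) term that the classical sum |ψ1|2 + |ψ2|2 lacks.

```python
import numpy as np

# Illustrative two-slit geometry (all values are assumptions, in SI units)
wavelength = 500e-9                 # light wavelength
d = 20e-6                           # slit separation
L = 1.0                             # distance from the slits to the screen
x = np.linspace(-0.05, 0.05, 7)     # a few detection points on the screen

# Path lengths from each slit to each screen point
r1 = np.sqrt(L**2 + (x - d / 2)**2)
r2 = np.sqrt(L**2 + (x + d / 2)**2)
k = 2 * np.pi / wavelength

# Complex probability amplitudes contributed by the two slits
psi1 = np.exp(1j * k * r1)
psi2 = np.exp(1j * k * r2)

# Amplitudes add; probabilities do not.
p_with_interference = np.abs(psi1 + psi2)**2                 # fringes (0 to 4)
p_without_interference = np.abs(psi1)**2 + np.abs(psi2)**2   # constant 2

print(np.round(p_with_interference, 3))
print(np.round(p_without_interference, 3))
```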

Finding the particle at a specific location is said to be a "measurement."

In standard quantum theory, a measurement is made when the quantum system is "projected" or "collapsed" or "reduced" into a single one of the system's allowed states. If the system was "prepared" in one of these "eigenstates," then the measurement will find it in that state with probability one (that is, with certainty).

However, if the system is prepared in an arbitrary state ψa, it can be represented as being in a linear combination of the system's basic energy states φn.

| ψa > = Σn cn | φn >,

where

cn = < φn | ψa >.

It is said to be in a "superposition" of those basic states. The probability Pn of its being found in state φn is

Pn = |< φn | ψa >|2 = |cn|2 .

Between measurements, the time evolution of a quantum system in such a superposition of states is described by a unitary transformation U (t, t0) that preserves the same superposition of states as long as the system does not interact with another system, such as a measuring apparatus. As long as the quantum system is completely isolated from any external influences, it evolves continuously and deterministically in an exactly predictable (causal) manner.
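
A minimal numerical sketch of these two paragraphs, assuming an arbitrary three-level system with a randomly chosen Hermitian Hamiltonian (all numbers illustrative, with ℏ = 1): the prepared state is expanded in energy eigenstates, the Born probabilities |cn|2 sum to one, and the unitary evolution exp(-iHt) merely rotates the phases of the cn, leaving the superposition and its probabilities intact until an interaction occurs.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 3-level Hermitian Hamiltonian (illustrative), in units with hbar = 1
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
H = (A + A.conj().T) / 2
E, phi = np.linalg.eigh(H)            # energy eigenvalues E_n, eigenstates phi[:, n]

# An arbitrarily prepared, normalized state psi_a
psi_a = rng.normal(size=3) + 1j * rng.normal(size=3)
psi_a /= np.linalg.norm(psi_a)

# Expansion coefficients c_n = <phi_n | psi_a> and Born probabilities P_n = |c_n|^2
c = phi.conj().T @ psi_a
P = np.abs(c)**2
print(np.round(P, 4), P.sum())        # the P_n sum to 1

# Unitary evolution U(t) = exp(-iHt), built from the eigenbasis
t = 2.5
U = phi @ np.diag(np.exp(-1j * E * t)) @ phi.conj().T
psi_t = U @ psi_a

# The superposition persists: only the phases of the c_n change, not the probabilities
c_t = phi.conj().T @ psi_t
print(np.round(np.abs(c_t)**2, 4))    # identical to P
```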

Whenever the quantum system does interact, however, with another particle or an external field, its behavior ceases to be causal and it evolves discontinuously and indeterministically. This acausal behavior is uniquely quantum mechanical. Nothing like it is possible in classical mechanics. Most attempts to "reinterpret" or "reformulate" quantum mechanics are attempts to eliminate this discontinuous acausal behavior and replace it with a deterministic process.

We must clarify what we mean by "the quantum system" and "it evolves" in the previous two paragraphs. This brings us to the mysterious notion of "wave-particle duality." In the wave picture, the "quantum system" refers to the deterministic time evolution of the complex probability amplitude or quantum state vector ψa, according to the "equation of motion" for the probability amplitude wave ψa, which is the Schrödinger equation,

iℏ ∂ψa/∂t = H ψa.

The probability amplitude looks like a wave and the Schrödinger equation is a wave equation. But the wave is an abstract quantity whose absolute square is the probability of finding a quantum particle somewhere. It is distinctly not the particle, whose exact position is unknowable while the quantum system is evolving deterministically. It is the probability amplitude wave that interferes with itself. Particles, as such, never interfere (although they may collide).

Note that we never "see" the superposition of particles in distinct states. There is no microscopic superposition in the sense of the macroscopic superposition of live and dead cats (See Schrödinger's Cat).

When the particle interacts, with the measurement apparatus for example, we always find the whole particle. It suddenly appears. For example, an electron "jumps" from one orbit to another, absorbing or emitting a discrete amount of energy (a photon). When a photon or electron is fired at the two slits, its appearance at the photographic plate is sudden and discontinuous. The probability wave instantaneously becomes concentrated at the location of the particle.

There is now unit probability (certainty) that the particle is located where we find it to be. This is described as the "collapse" of the wave function. Where the probability amplitude might have evolved under the unitary transformation of the Schrödinger equation to have significant non-zero values in a very large volume of phase space, all that probability suddenly "collapses" (faster than the speed of light, which deeply bothered Albert Einstein) to the location of the particle.
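
The discontinuous step can be mimicked numerically. This is only a sketch of the standard projection postulate (the basis states and amplitudes are illustrative assumptions): one outcome is sampled with the Born probabilities and the state is replaced by the corresponding eigenstate, after which that outcome has probability one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Basis states |0>, |1>, |2> and an illustrative prepared superposition
psi = np.array([0.6, 0.0, 0.8], dtype=complex)
psi /= np.linalg.norm(psi)

P = np.abs(psi)**2                     # Born probabilities: [0.36, 0.00, 0.64]
n = rng.choice(len(psi), p=P)          # the indeterministic "quantum jump"

# "Collapse": the state is replaced by the n-th eigenstate,
# so a repeated measurement now finds outcome n with probability one.
psi_after = np.zeros_like(psi)
psi_after[n] = 1.0
print(n, np.abs(psi_after)**2)
```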

Einstein said that some mysterious "spooky action-at-a-distance" must act to prevent the appearance of a second particle at a distant point where a finite probability of appearing had existed just an instant earlier.

[Animation of a wave function collapsing]

Whereas the abstract probability amplitude moves continuously and deterministically throughout space, the concrete particle moves discontinuously and indeterministically to a particular point in space.

For this collapse to be a "measurement," the new information about which location (or state) the system has collapsed into must be recorded somewhere in order for it to be "observable" by a scientist. But the vast majority of quantum events - e.g., particle collisions that change the particular states of quantum particles before and after the collision - do not leave an indelible record of their new states anywhere (except implicitly in the particles themselves).

We can imagine that a quantum system initially in state ψa has interacted with another system and as a result is in a new state φn, without any macroscopic apparatus around to record this new state for a "conscious observer."

H. D. Zeh describes how quantum systems may be "measured" without the recording of information.

It is therefore a plausible experimental result that the interference disappears also when the passage [of an electron through a slit] is "measured" without registration of a definite result. The latter may be assumed to have become a "classical fact" as soon as the measurement has irreversibly "occurred". A quantum phenomenon may thus "become a phenomenon" without being observed. This is in contrast to Heisenberg's remark about a trajectory coming into being by its observation, or a wave function describing "human knowledge". Bohr later spoke of objective irreversible events occurring in the counter. However, what precisely is an irreversible quantum event? According to Bohr this event can not be dynamically analyzed.

Analysis within the quantum mechanical formalism demonstrates nonetheless that the essential condition for this "decoherence" is that complete information about the passage is carried away in some objective physical form. This means that the state of the environment is now quantum correlated (entangled) with the relevant property of the system (such as a passage through a specific slit). This need not happen in a controllable way (as in a measurement): the "information" may as well form uncontrollable "noise", or anything else that is part of reality. In contrast to statistical correlations, quantum correlations characterize real (though nonlocal) quantum states - not any lack of information. In particular, they may describe individual physical properties, such as the non-additive total angular momentum J2 of a composite system at any distance.
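
A toy model of the mechanism Zeh describes, sketched in Python (the two-level system, the "environment" qubits, and the interaction angle are all illustrative assumptions): as more environment particles become correlated with the system, the off-diagonal (interference) terms of the system's reduced density matrix are suppressed, while the diagonal probabilities are untouched.

```python
import numpy as np

def reduced_density_matrix(theta, n_env):
    """System qubit prepared in (|0> + |1>)/sqrt(2); each environment qubit is
    rotated by theta only when the system is in |1> (a crude 'monitoring')."""
    e0 = np.array([1.0, 0.0])                        # env qubit if system is |0>
    e1 = np.array([np.cos(theta), np.sin(theta)])    # env qubit if system is |1>

    E0 = np.array([1.0])                             # environment branch for |0>
    E1 = np.array([1.0])                             # environment branch for |1>
    for _ in range(n_env):
        E0, E1 = np.kron(E0, e0), np.kron(E1, e1)

    # Total state (|0>|E0> + |1>|E1>)/sqrt(2), stored as amplitudes psi[s, env]
    psi = np.stack([E0, E1]) / np.sqrt(2)

    # Tracing out (summing over) the environment gives the system's 2x2 matrix
    return psi @ psi.conj().T

for n_env in (0, 1, 5, 20):
    rho = reduced_density_matrix(theta=0.4, n_env=n_env)
    print(n_env, np.round(rho.real, 4))
# The diagonal terms (the probabilities) stay at 0.5; the off-diagonal
# interference terms shrink as cos(theta)**n_env -- that suppression is
# what the decoherence program calls the loss of coherence.
```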

The Measurement Process
In order to clarify the measurement process, we separate it into several distinct stages, as follows:
  • A particle collides with another microscopic particle or with a macroscopic object (which might be a measuring apparatus).

  • In this scattering problem, we ignore the internal details of the collision and say that the incoming initial state ψa has changed asymptotically (discontinuously, and randomly = wave-function collapse) into the new outgoing final state φn.

  • [Note that if we prepare a very large number of identical initial states ψa, the fraction of those ending up in the final state φn is just the probability |< φn | ψa >|2.]

  • The information that the system was in state ψa has been lost (its path information has been erased; it is now "noise," as Zeh describes it). New information exists (implicitly in the particle, if not stored anywhere else) that the particle is in state φn.

  • If the collision is with a large enough (macroscopic) apparatus, it might be capable of recording the new system state information, by changing the quantum state of the apparatus into a "pointer state" correlated with the new system state.

    "Pointers" could include the precipitated silver-bromide molecules of a photographic emulsion, the condensed vapor of a Wilson cloud chamber, or the cascaded discharge of a particle detector.

  • But this new information will not be indelibly recorded unless the recording apparatus can transfer away entropy greater than the negative entropy equivalent of the new information (to satisfy the second law of thermodynamics). This is the second requirement in every two-step creation of new information in the universe. (A numerical sketch of this requirement follows this list.)

  • The new information could be useful (it is negative entropy) to an information processing system, for example, a biological cell like a brain neuron.

    The collision of a sodium ion (Na+) with a sodium/potassium pump (an ion channel) in the cell wall could result in the sodium ion being transported outside the cell, resetting conditions for the next firing of the neuron's action potential, for example.

  • The new information could be meaningful to an information processing agent who could not only observe it but understand it. Now neurons would fire in the mind of the conscious observer that John von Neumann and Eugene Wigner thought was necessary for the measurement process to occur at all.

    Von Neumann (perhaps influenced by the mystical thoughts of Niels Bohr about mind and body as examples of his "complementarity") saw three levels in a measurement:

    1. the system to be observed, including light up to the retina of the observer.
    2. the observer's retina, nerve tracts, and brain
    3. the observer's abstract "ego."

  • John Bell asked tongue-in-cheek whether no wave function could collapse until a scientist with a Ph.D. was there to observe it. He drew a famous diagram of what he called von Neumann's "shifty split."

    Bell shows that one could place the arbitrary "cut" (Heisenberg called it the "Schnitt") at various levels without making any difference.

    But an "objective" observer-independent measurement process ends when irreversible new information has been indelibly recorded (in the photographic plate of Bell's drawing).

    Von Neumann's physical and mental levels are better discussed as the mind-body problem, not the measurement problem.
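
To put a number on the thermodynamic requirement mentioned in the list above, here is a back-of-the-envelope sketch using the Landauer bound (my illustration, with an assumed temperature; it is not a calculation taken from the text): stably recording one bit requires exporting at least kB ln 2 of entropy, i.e. dissipating at least kB T ln 2 of heat, to the surroundings.

```python
import math

k_B = 1.380649e-23                    # Boltzmann constant, J/K
T = 300.0                             # temperature of the surroundings, K (assumed)

entropy_per_bit = k_B * math.log(2)   # minimum entropy to export: ~9.6e-24 J/K
heat_per_bit = T * entropy_per_bit    # minimum heat to dissipate: ~2.9e-21 J

print(entropy_per_bit, heat_per_bit)
```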

The Measurement Problem

So what exactly is the "measurement problem?"

For decoherence theorists, the unitary transformation of the Schrödinger equation cannot alter a superposition of microscopic states. Why then, when microscopic states are time evolved into macroscopic ones, don't macroscopic superpositions emerge? According to H. D. Zeh:

Because of the dynamical superposition principle, an initial superposition
Σ cn | n > does not lead to definite pointer positions (with their empirically observed frequencies). If decoherence is neglected, one obtains their entangled superposition Σ cn | n > | Φn >, that is, a state that is different from all potential measurement outcomes.

And according to Erich Joos, another founder of decoherence:

It remains unexplained why macro-objects come only in narrow wave packets, even though the superposition principle allows far more "nonclassical" states (while micro-objects are usually found in energy eigenstates). Measurement-like processes would necessarily produce nonclassical macroscopic states as a consequence of the unitary Schrödinger dynamics. An example is the infamous Schrödinger cat, steered into a superposition of "alive" and "dead".

The fact that we don't see superpositions of macroscopic objects is the "measurement problem," according to Zeh and Joos.

An additional problem is that decoherence is a completely unitary process (Schrödinger dynamics) which implies time reversibility. What then do decoherence theorists see as the origin of irreversibility? Can we time reverse the decoherence process and see the quantum-to-classical transition reverse itself and recover the original coherent quantum world?

To "relocalize" the superposition of the original system, we need only have complete control over the environmental interaction. This is of course not practical, just as Ludwig Boltzmann found in the case of Josef Loschmidt's reversibility objection.

Does irreversibility in decoherence have the same rationale - "not possible for all practical purposes" - as in classical statistical mechanics?

According to more conventional thinkers, the measurement problem is the failure of the standard quantum mechanical formalism (Schrödinger equation) to completely describe the nonunitary "collapse" process. Since the collapse is irreducibly indeterministic, the time of the collapse is completely unpredictable and unknowable. Indeterministic quantum jumps are one of the defining characteristics of quantum mechanics, both in the "old" quantum theory, where Bohr wanted radiation to be emitted and absorbed discontinuously when his atom jumped between stationary states, and in the modern standard theory with the Born-Jordan-Heisenberg-Dirac "projection postulate."

To add new terms to the Schrödinger equation in order to control the time of collapse is to misunderstand the irreducible chance at the heart of quantum mechanics, as first seen clearly, in 1917, by Albert Einstein. When he derived his A and B coefficients for the emission and absorption of radiation, he found that an outgoing light particle must impart momentum hν/c to the atom or molecule, but the direction of the momentum can not be predicted! Neither can the theory predict the time when the light quantum will be emitted.

Such a random time was not unknown to physics. When Ernest Rutherford derived the law for radioactive decay of unstable atomic nuclei in 1900, he could only give the probability of decay time. Einstein saw the connection with radiation emission:

It speaks in favor of the theory that the statistical law assumed for [spontaneous] emission is nothing but the Rutherford law of radioactive decay.
But the inability to predict both the time and direction of light particle emissions, said Einstein in 1917, is "a weakness in the theory..., that it leaves time and direction of elementary processes to chance (Zufall, ibid.)." It is only a weakness for Einstein, of course, because his God does not play dice. Decoherence theorists too appear to have what William James called an "antipathy to chance."


In the original "old" quantum mechanics, Niels Bohr made two assumptions. One was that atoms could only be found in what he called stationary energy states, later called eigenstates. The second was that the observed spectral lines were due to discontinuous, sudden transitions of the atom between these states. The emitted or absorbed quantum of light has energy equal to the energy difference between the states (or energy levels), with frequency ν given by the formula

E2 - E1 = h ν,

where h is Planck's constant, derived from his radiation law that quantized the allowed values of energy.
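
For a concrete number (an illustrative example, not taken from the text): the hydrogen n = 2 → n = 1 transition, with E2 - E1 ≈ 10.2 eV, gives the Lyman-alpha line.

```python
h = 6.62607015e-34        # Planck's constant, J*s
c = 2.99792458e8          # speed of light, m/s
eV = 1.602176634e-19      # joules per electron-volt

delta_E = 10.2 * eV       # E2 - E1 for hydrogen n = 2 -> n = 1 (approximate)
nu = delta_E / h          # Bohr's condition: E2 - E1 = h * nu
wavelength = c / nu

print(nu)                 # ~2.47e15 Hz
print(wavelength)         # ~1.22e-7 m, the ~122 nm Lyman-alpha line
```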

In the now standard quantum theory, formulated by Werner Heisenberg, Max Born, Pascual Jordan, Erwin Schrödinger, Paul Dirac, and others, three foundational assumptions were made: the principle of superposition, the axiom of measurement, and the projection postulate. Since decoherence challenges some of these ideas, we review the standard definitions.

The Principle of Superposition
The fundamental equation of motion in quantum mechanics is Schrödinger's famous wave equation that describes the evolution in time of his wave function ψ,

iℏ ∂ψ/∂t = Hψ.

For a single particle in idealized complete isolation, and for a Hamiltonian H that does not involve magnetic fields, the Schrödinger equation generates a unitary time evolution that is time-reversible (the principle of microscopic reversibility).

Max Born interpreted the square of the absolute value of Schrödinger's wave function as providing the probability of finding a quantum system in a certain state ψn.

The quantum (discrete) nature of physical systems results from there generally being a large number of solutions ψn (called eigenfunctions) of the Schrödinger equation in its time independent form, with energy eigenvalues En.

Hψn = Enψn,

The discrete energy eigenvalues En limit interactions (for example, with photons) to the energy differences En - Em, as assumed by Bohr. Eigenfunctions ψn are orthogonal to one another,

< ψn | ψm > = δnm,

where δnm is the Kronecker delta, equal to 1 when n = m, and 0 otherwise. For a state ψ = Σn cn ψn expanded in these eigenfunctions, the diagonal terms cn cn* = |cn|2 of the matrix of coefficient products cn cm* are the Born rule probabilities Pn, and they must sum to 1 to be meaningful as probabilities,

Σ Pn = Σ |cn|2 = 1.

The off-diagonal terms of that matrix, cn cm* with n ≠ m, are interpretable as interference terms. When the matrix elements of some quantum mechanical operator O are calculated, the off-diagonal elements < ψn | O | ψm > are interpretable as transition amplitudes - their absolute squares give the likelihood that the interaction described by O will induce a transition from state ψn to ψm.

The Schrödinger equation is a linear equation. It has no quadratic or higher power terms, and this introduces a profound - and for many scientists and philosophers a disturbing - feature of quantum mechanics, one that is impossible in classical physics, namely the principle of superposition of quantum states. If ψa and ψb are both solutions of the equation, then an arbitrary linear combination of these, ψ = caψa + cbψb, with complex coefficients ca and cb, is also a solution.
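
Linearity can be checked numerically. In this sketch (an arbitrary Hamiltonian, random states, and illustrative numbers, with ℏ = 1), evolving the superposition caψa + cbψb gives exactly the same state as superposing the separately evolved ψa and ψb.

```python
import numpy as np

rng = np.random.default_rng(2)

# An arbitrary Hermitian Hamiltonian and its unitary propagator U(t), hbar = 1
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2
E, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * E * 1.7)) @ V.conj().T     # t = 1.7 (illustrative)

# Two solutions and an arbitrary complex linear combination of them
psi_a = rng.normal(size=4) + 1j * rng.normal(size=4)
psi_b = rng.normal(size=4) + 1j * rng.normal(size=4)
c_a, c_b = 0.3 + 0.1j, -0.7 + 0.2j

# Linearity: evolving the superposition equals superposing the evolved solutions
lhs = U @ (c_a * psi_a + c_b * psi_b)
rhs = c_a * (U @ psi_a) + c_b * (U @ psi_b)
print(np.allclose(lhs, rhs))                            # True
```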

Together with Born's probabilistic interpretation of the wave function, the principle of superposition accounts for the major mysteries of quantum theory, some of which we hope to resolve, or at least reduce, with an objective (observer-independent) explanation of information creation during quantum processes (which can often be interpreted as measurements).

The Axiom of Measurement

The axiom of measurement depends on the idea of "observables," physical quantities that can be measured in experiments. A physical observable is represented as a Hermitian operator A that is self-adjoint (equal to its Hermitian conjugate, A† = A). The diagonal elements
< ψn | A | ψn > of the operator's matrix are interpreted as giving the expectation value An (when we make a measurement). The off-diagonal n, m elements describe the uniquely quantum property of interference between wave functions and provide a measure of the probabilities for transitions between states n and m.

It is these intrinsic quantum probabilities that provide the ultimate source of indeterminism, and consequently of irreducible irreversibility, as we shall see. The axiom of measurement is then that a large number of measurements of the observable A, known to have eigenvalues An, will result in the number of measurements with value An being proportional to the probability of finding the system in eigenstate ψn with eigenvalue An.
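
A small sketch of the axiom (the observable and the prepared state are randomly chosen illustrations): the Born probabilities are computed from the eigenstates of A, a large number of simulated measurements returns each eigenvalue An with the corresponding frequency, and their average converges to the expectation value < ψ | A | ψ >.

```python
import numpy as np

rng = np.random.default_rng(3)

# A Hermitian "observable" A and a normalized prepared state (both illustrative)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A = (M + M.conj().T) / 2
a_n, psi_n = np.linalg.eigh(A)           # eigenvalues A_n, eigenstates psi_n[:, n]

state = rng.normal(size=3) + 1j * rng.normal(size=3)
state /= np.linalg.norm(state)

# Born probabilities of obtaining each eigenvalue A_n
P = np.abs(psi_n.conj().T @ state)**2

# Many simulated measurements return A_n with frequency P_n,
# and their average converges to the expectation value <state|A|state>.
samples = rng.choice(a_n, size=200_000, p=P)
print(samples.mean())
print((state.conj() @ A @ state).real)
```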

The Projection Postulate

The third novel idea of quantum theory is often considered the most radical. It has certainly produced some of the most radical ideas ever to appear in physics, in attempts to deny it (as the decoherence program appears to do, as do also Everett relative-state interpretations, many worlds theories, and Bohm-de Broglie pilot waves). The projection postulate is actually very simple, and arguably intuitive as well. It says that when a measurement is made, the system of interest will be found in one of the possible eigenstates of the measured observable.

We have several possible alternatives for eigenvalues. Measurement simply makes one of these actual, and it does so, said Max Born, in proportion to the absolute square of the corresponding probability amplitude (the expansion coefficient cn of the wave function). In this way, ontological chance enters physics, and it is partly this fact of quantum randomness that bothered Albert Einstein ("God does not play dice") and Schrödinger (whose equation of motion is deterministic).

When Einstein derived the expressions for the probabilities of emission and absorption of photons in 1917, he lamented that the theory seemed to indicate that the direction of an emitted photon was a matter of pure chance (Zufall), and that the time of emission was also statistical and random, just as Rutherford had found for the time of decay of a radioactive nucleus. Einstein called it a "weakness in the theory."

What Decoherence Gets Right

Allowing the environment to interact with a quantum system, for example by the scattering of low-energy thermal photons or high-energy cosmic rays, or by collisions with air molecules, surely will suppress quantum interference in an otherwise isolated experiment. But this is because large numbers of uncorrelated (incoherent) quantum events will "average out" and mask the quantum phenomena. It does not mean that wave functions are not collapsing. They are, at every particle interaction.
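
Here is a sketch of that "averaging out" (the random relative phase stands in for the uncorrelated environmental events; this is my illustration, not a model of any specific experiment): each individual run still carries the interference term, but averaging over runs with uncorrelated phases washes it out.

```python
import numpy as np

rng = np.random.default_rng(4)

def detection_intensity(phase):
    """Relative intensity at one screen point when the two slit amplitudes
    carry a relative phase (e.g. imprinted by a stray environmental photon)."""
    psi1 = 1.0 / np.sqrt(2)
    psi2 = np.exp(1j * phase) / np.sqrt(2)
    return np.abs(psi1 + psi2) ** 2

coherent = detection_intensity(0.0)                    # single coherent run
random_phases = rng.uniform(0.0, 2.0 * np.pi, size=50_000)
averaged = np.mean([detection_intensity(p) for p in random_phases])

print(coherent)   # 2.0 -- the interference (cross) term is present in each run
print(averaged)   # ~1.0 -- uncorrelated phases average the cross term away
```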

Decoherence advocates describe the environmental interaction as "monitoring" of the system by continuous "measurements."

Decoherence theorists are correct that every collision between particles entangles their wave functions, at least for the short time before decoherence suppresses any coherent interference effects of that entanglement.

But in what sense is a collision a "measurement"? At best, it is a "pre-measurement."
It changes the information present in the wave functions before the collision. But the new information may not be recorded anywhere (other than being implicit in the state of the system).

All interactions change the state of a system of interest, but not all leave the "pointer state" of some measuring apparatus with new information about the state of the system.

So environmental monitoring, in the form of continuous collisions by other particles, is changing the specific information content of the system, the environment, and a measuring apparatus (if there is one). But if there is no recording of new information (negative entropy created locally), the system and the environment may be in thermodynamic equilibrium.

Equilibrium does not mean that decoherence monitoring of every particle is not continuing.
It is. There is no such thing as a "closed system." Environmental interaction is always present.

If a gas of particles is not already in equilibrium, they may be approaching thermal equilibrium. This happens when any non-equilibrium initial conditions (Zeh calls these a "conspiracy") are being "forgotten" by erasure of path information during collisions. Information about initial conditions is implicit in the paths of all the particles. This means that, in principle, the paths could be reversed to return to the initial, lower entropy, conditions (Loschmidt paradox).

Erasure of path information could be caused by quantum particle-particle scattering (our standard view) or by decoherence "monitoring." How are these two related?

The Two Steps Needed in a Measurement that Creates New Information
More than the assumed collapse of the wave function (von Neumann's Process 1, Pauli's measurement of the first kind) is needed. Indelibly recorded information, available for "observations" by a scientist, must also satisfy the second requirement for the creation of new information in the universe.

Everything created since the origin of the universe over ten billion years ago has involved just two fundamental physical processes that combine to form the core of all creative processes. These two steps occur whenever even a single bit of new information is created and survives in the universe.

  • Step 1: A quantum process - the "collapse of the wave function."

    The formation of even a single bit of information that did not previously exist requires the equivalent of a "measurement." This "measurement" does not involve a "measurer," an experimenter or observer. It happens when the probabilistic wave function that describes the possible outcomes of a measurement "collapses" and an eigenstate of a matter or energy particle is actually changed.

    If the probability amplitude wave function did not collapse, unitary evolution would simply preserve the initial information.

  • Step 2: A thermodynamic process - local reduction, but cosmic increase, in the entropy.

    The second law of thermodynamics requires that the overall cosmic entropy always increases. When new information is created locally in step 1, some energy (with positive entropy greater than the negative entropy of the new information) must be transferred away from the location of the new bits, or they will be destroyed when local thermodynamic equilibrium is restored. This can only happen in a locality where flows of matter and energy with low entropy are passing through, keeping it far from equilibrium.

This two-step core creative process underlies the formation of microscopic objects like atoms and molecules, as well as macroscopic objects like galaxies, stars, and planets.

With the emergence of teleonomic (purposive) information in self-replicating systems, the same core process underlies all biological creation. But now some random changes in information structures are rejected by natural selection, while others reproduce successfully.

Finally, with the emergence of self-aware organisms and the creation of extra-biological information stored in the environment, the same information-generating core process underlies communication, consciousness, free will, and creativity.

The two physical processes in the creative process, quantum physics and thermodynamics, are somewhat daunting subjects for philosophers, and even for many scientists, including decoherence advocates.

Quantum Level Interactions Do Not Create Lasting Information

The overwhelming majority of collisions of microscopic particles like electrons, photons, atoms, molecules, etc., do not result in observable information about the collisions. The lack of observations and observers does not mean that there have been no "collapses" of wave functions. The idea that the time evolution of the deterministic Schrödinger equation continues forever in a unitary transformation that leaves the wave function of the whole universe undecided and in principle reversible at any time is an absurd and unjustified extrapolation from the behavior of the ideal case of a single perfectly isolated particle.

The principle of microscopic reversibility applies only to such an isolated particle, something unrealizable in nature, as the decoherence advocates know with their addition of environmental "monitoring." Experimental physicists can isolate systems from the environment enough to "see" the quantum interference (but again, only in the statistical results of large numbers of identical experiments).

The Emergence of the Classical World
In the standard quantum view, the emergence of macroscopic objects with classical behavior arises statistically for two reasons involving large numbers:
  1. The law of large numbers (from probability and statistics) - illustrated in the sketch after this list.

    • When a large number of material particles is aggregated, properties emerge that are not seen in individual microscopic particles. These properties include ponderable mass, solidity, classical laws of motion, gravity orbits, etc.
    • When a large number of quanta of energy (photons) are aggregated, properties emerge that are not seen in individual light quanta. These properties include continuous radiation fields with wavelike interference.

  2. The law of large quantum numbers (Bohr Correspondence Principle).
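
The sketch promised above illustrates the first point with pure statistics (nothing specifically quantum in the code; the distribution is an arbitrary assumption): when N independent microscopic contributions are aggregated, the relative fluctuation of the total falls off as 1/√N, which is why macroscopic aggregates look "adequately determined."

```python
import numpy as np

rng = np.random.default_rng(5)

# Aggregate N independent microscopic contributions (an assumed toy distribution)
for N in (10, 1_000, 100_000, 10_000_000):
    x = rng.normal(loc=1.0, scale=1.0, size=N)   # each contribution very uncertain
    total = x.sum()
    relative_spread = x.std() * np.sqrt(N) / total
    print(N, relative_spread)
# The relative indeterminacy of the aggregate shrinks as 1/sqrt(N); a macroscopic
# object built from ~10^23 particles is, in this sense, "adequately determined."
```
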
Decoherence as "Interpreted" by Standard Quantum Mechanics
Can we explain the following in terms of standard quantum mechanics?
  1. the decoherence of quantum interference effects by the environment

  2. the measurement problem, viz., the absence of macroscopic superpositions of states

  3. the emergence of "classical" adequately determined macroscopic objects

  4. the logical compatibility and consistency of two dynamical laws - the unitary transformation and the "collapse" of the wave function

  5. the entanglement of "distant" particles and the appearance of "nonlocal" effects such as those in the Einstein-Podolsky-Rosen experiment

Let's consider these point by point.

  1. The standard explanation for the decoherence of quantum interference effects by the environment is that when a quantum system interacts with the very large number of quantum systems in a macroscopic object, the averaging over independent phases cancels out (decoheres) coherent interference effects.

  2. In order to study interference effects, a quantum system is isolated from the environment as much as possible. Even then, note that microscopic interference is never "seen" directly by an observer. It is inferred from probabilistic theories that explain the statistical results of many identical experiments. Individual particles are never "seen" as superpositions of particles in different states. When a particle is seen, it is always the whole particle and nothing but the particle. The absence of macroscopic superpositions of states, such as the infamous linear superposition of live and dead Schrödinger Cats, is therefore no surprise.

  3. The standard quantum-mechanical explanation for the emergence of "classical" adequately determined macroscopic objects is that they result from a combination of a) Bohr's correspondence principle in the case of large quantum numbers, together with b) the familiar law of large numbers in probability theory, and c) the averaging over the phases described in point 1. Heisenberg indeterminacy relations still apply, but the individual particles' indeterminacies average out, and the remaining macroscopic indeterminacy is practically unmeasurable.

  4. Perhaps the two dynamical laws would be inconsistent if applied to the same thing at exactly the same time. But the "collapse" of the wave function (von Neumann's Process 1, Pauli's measurement of the first kind) and the unitary transformation that describes the deterministic evolution of the probability amplitude wave function (von Neumann's Process 2) are used in a temporal sequence.

    When you hear or read that electrons are both waves and particles, think "either-or" -
    first a wave of possibilities, then an actual particle.
    The first process describes what happens when quantum systems interact, in a collision or a measurement, when they become indeterministically entangled. The second then describes their deterministic evolution (while isolated) along their mean free paths to the next collision or interaction. One dynamical law applies to the particle picture, the other to the wave picture.

  5. The paradoxical appearance of nonlocal "influences" of one particle on an entangled distant particle, at velocities greater than light speed, is a consequence of a poor understanding of both the wave and particle aspects of quantum systems. The confusion usually begins with a statement such as "consider a particle A here and a distant particle B there." When entangled in a two-particle probability amplitude wave function, the two identical particles are "neither here nor there," just as the single particle in a two-slit experiment does not "go through" the slits.

    It is the single-particle probability amplitude wave that must "go through" both slits if it is to interfere. For a two-particle probability amplitude wave that starts its deterministic time evolution when the two identical particles are produced, it is only the probability of finding the particles that evolves according to the unitary transformation of the Schrödinger wave equation. It says nothing about where the particles "are."

    Now if and when a particle is measured somewhere, we can then label it particle A. Conservation of energy and momentum tell us immediately that the other identical particle is now symmetrically located on the other side of the central source of particles. If the particles are electrons (as in David Bohm's version of EPR), conservation of spin tells us that the now distant particle B must have its spin opposite to that of particle A if they were produced with a total spin of zero (see the sketch below).

    Nothing is sent from particle A to B. The deduced properties are the consequence of conservation laws that are true for much deeper reasons than the puzzles of nonlocal entanglement. The mysterious instantaneous appearance of values for these properties is exactly the same mystery that bothered Einstein about a single-particle wave function having values all over a photographic screen at one instant, then having values only at the position of the located particle in the next instant, apparently violating special relativity.

    [Animation of a two-particle wave function collapsing]

Compare the collapse of the two-particle probability amplitude above to the single-particle collapse here.
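
The sketch referred to in point 5 above (the singlet amplitudes are standard; the sampling setup is my illustration): the joint Born probabilities of the two-electron singlet state give perfect anti-correlation between the spins found at A and B, as spin conservation requires, while the statistics seen at B alone remain 50/50, so nothing is signaled from A to B.

```python
import numpy as np

rng = np.random.default_rng(6)

# Two-electron singlet (total spin zero) in the basis
# |up,up>, |up,down>, |down,up>, |down,down>: the standard singlet amplitudes.
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
P = np.abs(singlet)**2                     # joint Born probabilities [0, .5, .5, 0]

samples = rng.choice(4, size=10_000, p=P)  # 1 = (A up, B down), 2 = (A down, B up)

# Perfect anti-correlation from spin conservation: given A "up", B is "down".
n_a_up = np.count_nonzero(np.isin(samples, [0, 1]))       # A measured "up"
n_a_up_b_down = np.count_nonzero(samples == 1)            # ...and B "down"
print(n_a_up_b_down / n_a_up)                             # 1.0

# No signaling: B's own statistics stay 50/50 whatever happens at A.
print(np.count_nonzero(np.isin(samples, [0, 2])) / len(samples))   # ~0.5
```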

To summarize: Decoherence by interactions with the environment can be explained perfectly well by multiple "collapses" of the probability amplitude wave function during interactions with environmental particles. Microscopic interference is never "seen" directly by an observer, so we should not expect ever to "see" macroscopic superpositions of live and dead cats. The "transition from quantum to classical" systems is the consequence of laws of large numbers. The quantum dynamical laws necessarily include two phases, one needed to describe the continuous deterministic motion of probability amplitude waves and the other the discontinuous indeterministic motion of physical particles. The mysteries of nonlocality and entanglement are no different from those of standard quantum mechanics as seen in the two-slit experiment. It is just that we now have two identical particles and their wave functions are nonseparable.

For Teachers
The Role of Decoherence in Quantum Mechanics, Stanford Encyclopedia of Philosophy