Ludwig Boltzmann was criticized for his 1872 attempt to prove his H-theorem (that entropy always increases) by a dynamical analysis of molecular collisions. Josef Loschmidt and others pointed out that if the molecular velocities were to be reversed at an instant, Boltzmann's work would show that the entropy should decrease. This was the reversibility objection.
Entropy as Lost Information about Molecular Positions
Entropy increase can be easily understood as the loss of information as a system moves from an initially ordered state to a final disordered state. Ludwig Boltzmann was the first to describe entropy as "missing information."
Dr. Shannon's work roots back, as von Neumann has pointed out, to Boltzmann's observation, in some of his work on statistical physics (1894), that entropy is related to "missing information," inasmuch as it is related to the number of alternatives which remain possible to a physical system after all the macroscopically observable information concerning it has been recorded. L. Szilard (Zsch. f. Phys. Vol. 53, 1925) extended this idea to a general discussion of information in physics, and von Neumann (Math. Foundation of Quantum Mechanics, Berlin, 1932, Chap. V) treated information in quantum mechanics and particle physics.
(The Mathematical Theory of Communication, Claude Shannon and Warren Weaver, p. 3n)
Although the physical dimensions of thermodynamic entropy (joules per kelvin) are not the same as those of (dimensionless) mathematical information, apart from units they share the same famous formula.

S = − ∑ pᵢ ln pᵢ
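The dimensionless form of this formula is easy to check numerically. A minimal sketch (the function name is just illustrative): for a uniform distribution over the 64 squares of the room below, the entropy is exactly 6 bits, and when the location is known with certainty it is zero.

```python
import math

def shannon_entropy_bits(probs):
    """Dimensionless information entropy H = sum(-p * log2 p), in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 64 grid squares: maximum uncertainty
uniform = [1 / 64] * 64
print(shannon_entropy_bits(uniform))   # 6.0 bits

# All molecules known to be in one square: zero uncertainty
certain = [1.0] + [0.0] * 63
print(shannon_entropy_bits(certain))   # 0.0 bits
```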
To see this very simply, let's consider the well-known example of a bottle of perfume in the corner of a room. We can represent the room as a grid of 64 squares. Suppose the air is filled with molecules moving randomly at room temperature (blue circles). In the lower left corner a small number of (red) perfume molecules will be released when we open the bottle (when you start the demonstration animation below).
What is the quantity of information we have about the perfume molecules? At the start we know their location: the lower left square, 1/64th of the container. The quantity of information is determined by the minimum number of yes/no questions it takes to locate them. The best questions are those that split the remaining locations evenly (a binary tree).
For example:
Are they in the upper half of the container? No.
Are they in the left half of the container? Yes.
Are they in the upper half of the lower left quadrant? No.
Are they in the left half of the lower left quadrant? Yes.
Are they in the upper half of the lower left octant? No.
Are they in the left half of the lower left octant? Yes.
Answers to these six optimized questions give us six bits of information for each molecule, locating it to 1/64th of the container. This is the amount of information that will be lost for each molecule if it is allowed to escape and diffuse fully into the room. The corresponding thermodynamic entropy increase is Boltzmann's constant k multiplied by ln 2 for each bit, since S = k ln W and each bit doubles the number of possible locations W.
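The six questions above are just a binary search over the 8×8 grid. A minimal sketch (coordinates and the `locate` helper are illustrative, with (0, 0) the lower-left square): halving the remaining region with each answer locates any molecule in exactly six questions.

```python
def locate(x, y, size=8):
    """Find a molecule's square in a size x size grid by repeated
    halving; each yes/no answer supplies exactly one bit."""
    answers = []
    x_lo, x_hi = 0, size          # current horizontal span [x_lo, x_hi)
    y_lo, y_hi = 0, size          # current vertical span [y_lo, y_hi)
    while x_hi - x_lo > 1 or y_hi - y_lo > 1:
        if y_hi - y_lo > 1:       # "Is it in the upper half?"
            mid = (y_lo + y_hi) // 2
            upper = y >= mid
            answers.append(upper)
            y_lo, y_hi = (mid, y_hi) if upper else (y_lo, mid)
        if x_hi - x_lo > 1:       # "Is it in the left half?"
            mid = (x_lo + x_hi) // 2
            left = x < mid
            answers.append(left)
            x_lo, x_hi = (x_lo, mid) if left else (mid, x_hi)
    return answers

# Molecule in the lower-left square: No, Yes, No, Yes, No, Yes
print(locate(0, 0))       # [False, True, False, True, False, True]
print(len(locate(0, 0)))  # 6 questions = 6 bits
```

The answer sequence for the lower-left square reproduces the No/Yes pattern of the six questions in the text.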
If the room had no air, the perfume would rapidly reach an equilibrium state, since the molecular velocity at room temperature is about 400 meters/second. Collisions with air molecules prevent the perfume from dissipating quickly. This lets us see the approach to equilibrium. When the perfume has diffused to one-sixteenth of the room, the entropy will have risen two bits for each molecule; to one-quarter of the room, four bits; and so on.
Click here to start a computer visualization of the equilibration process in a new window.
Entropy as Evolution to the Most Probable Macrostate
In 1877, Boltzmann simply set classical dynamics aside and instead assumed that all phase-space cells are equally probable. Classical dynamics could not prove that the path of the system in phase space would pass through every cell, let alone spend equal time in each. Boltzmann described a system he called an "Ergode," which corresponds to what J. Willard Gibbs later called the microcanonical ensemble. Equal a priori probabilities for all the phase-space cells came to be called the ergodic hypothesis.
Paul and Tatiana Ehrenfest made the ergodic hypothesis the central question in statistical mechanics. Mathematicians took up the problem of ergodicity in continuous mathematics, which has questionable relevance for problems in discrete particle physics.
In modern quantum statistical mechanics, the same ergodic hypothesis (equiprobability of phase space cells) shows up in an assumption about transition probabilities between phase space cells. The transition probability for any microstate A to jump to microstate B is assumed to be the same as the reverse quantum jump from B to A.
The matrix element for the A → B transition is the complex conjugate of that for the reverse transition B → A, so the two squared magnitudes, and hence the two transition rates, are equal. The rate formula is known as Fermi's Golden Rule, although it was first derived by Paul Dirac.
We can see how any system with equal transition probabilities to and from any other state will quickly establish equilibrium populations. If 1000 systems are in state A and none in B, the early transitions will overwhelmingly be from A to B. An equal number of transitions back from B to A is not likely until the populations of A and B are about the same.
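This equilibration can be seen in a minimal two-state Monte Carlo sketch (the jump probability of 0.1 per cycle is an illustrative assumption): with equal transition probabilities in both directions, 1000 systems that all start in A quickly settle near a 500/500 split.

```python
import random

random.seed(0)
p = 0.1                  # same jump probability A -> B and B -> A
in_a, in_b = 1000, 0     # all 1000 systems start in microstate A
for cycle in range(100):
    a_to_b = sum(random.random() < p for _ in range(in_a))
    b_to_a = sum(random.random() < p for _ in range(in_b))
    in_a += b_to_a - a_to_b
    in_b += a_to_b - b_to_a
print(in_a, in_b)        # fluctuates around the 500 / 500 equilibrium
```

Early on, almost all jumps go from A to B simply because far more systems sit in A; the net flow vanishes only when the populations are about equal.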
That is the basic idea behind the statistical formulation of Boltzmann's H-theorem. When all phase space cells are equally populated, the number of ways this can be achieved (the number of microstates) is at its maximum. Although cell populations will fluctuate away from this equilibrium condition, it corresponds to the maximum entropy.
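The counting argument can be sketched directly. The number of microstates for a given set of occupation numbers is the multinomial W = N! / (n₁! n₂! …), and a small example (8 systems over 4 cells; the helper name is illustrative) shows the even split wins:

```python
from math import factorial

def microstates(occupations):
    """W = N! / (n1! n2! ...): the number of ways to realize a given
    set of cell occupation numbers with distinguishable systems."""
    w = factorial(sum(occupations))
    for n in occupations:
        w //= factorial(n)
    return w

# 8 systems over 4 phase-space cells: the even split has the most
# microstates, and hence (S = k ln W) the maximum entropy
print(microstates([2, 2, 2, 2]))   # 2520
print(microstates([5, 1, 1, 1]))   # 336
print(microstates([8, 0, 0, 0]))   # 1
```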
[Animation: an initial distribution of 500 systems in the upper left corner evolves rapidly to the normal distribution function for occupation numbers. Controls: number of systems, number of cycles, transition probability.]