E. T. Jaynes
Edwin Thompson Jaynes extended statistical mechanics to connect it to probability theory, Claude Shannon's information theory, and Bayesian statistical inference. He championed the work of J. Willard Gibbs, contrasting it with the earlier work of Ludwig Boltzmann. His 1957 "principle of maximum entropy" says that the probability distribution that best represents the current state of knowledge is the one with the largest entropy. In his 1965 paper "Gibbs vs Boltzmann Entropies," Jaynes examined the difference between the Boltzmann and Gibbs formulations of entropy. They differ, he says, because of different treatments of "interparticle forces." From the abstract:
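The maximum entropy principle can be made concrete with a small numerical example. The following is a minimal sketch, not Jaynes's own calculation; the energy levels and target mean are invented for illustration. Maximizing the Shannon entropy of a discrete distribution subject to a fixed mean energy yields the exponential (canonical) form, with the Lagrange multiplier found here by bisection.

```python
import numpy as np

# Maximum entropy over discrete states with energies E, subject to a
# fixed mean energy <E>. The solution has the Gibbs/canonical form
# p_i = exp(-lam * E_i) / Z, with lam chosen to satisfy the constraint.
# Energies and target mean are arbitrary illustrative numbers.

E = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # energy levels (arbitrary units)
target_mean = 1.2                         # required expectation value

def mean_energy(lam):
    """Mean energy of the canonical distribution with multiplier lam."""
    w = np.exp(-lam * E)
    p = w / w.sum()
    return p @ E

# Bisection on the Lagrange multiplier: mean_energy is monotonically
# decreasing in lam, so this bracket contains the root.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)

w = np.exp(-lam * E)
p = w / w.sum()
print("multiplier lambda =", lam)
print("maxent distribution:", p.round(4))
print("entropy S = -sum p ln p =", -(p * np.log(p)).sum())
```

Among all distributions with the same mean energy, any other choice has strictly lower entropy; in Jaynes's terms, it would pretend to information we do not possess.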
The status of the Gibbs and Boltzmann expressions for entropy has been a matter of some confusion in the literature. We show that: (1) the Gibbs H-function yields the correct entropy as defined in phenomenological thermodynamics; (2) the Boltzmann H yields an "entropy" that is in error by a nonnegligible amount whenever interparticle forces affect thermodynamic properties; (3) Boltzmann's other interpretation of entropy, S = k log W, is consistent with the Gibbs H, and derivable from it; (4) the Boltzmann H theorem does not constitute a demonstration of the second law for dilute gases; (5) the dynamical invariance of the Gibbs H gives a simple proof of the second law for arbitrary interparticle forces; (6) the second law is a special case of a general requirement for any macroscopic process to be experimentally reproducible. Finally, the "anthropomorphic" nature of entropy, on both the statistical and phenomenological levels, is stressed.

Jaynes explains that the Gibbs entropy is a conserved quantity, for the same reason that the Liouville theorem conserves the hypervolume in phase space of a cloud of particles as it traverses its trajectory. The Boltzmann entropy increases. We can show that this is a consequence of quantal interactions during particle collisions, which deny the claim of microscopic reversibility and erase the path information in the gas particles that would be needed to support Loschmidt's objection to Boltzmann's H-theorem.
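In modern notation, the two H-functions being compared can be written as follows. This is a sketch of the standard definitions, not a transcription of Jaynes's own equations; the inequality is the usual subadditivity of entropy.

```latex
% Gibbs H, built on the full N-particle distribution w_N:
H_G = \int w_N(x_1,\dots,x_N)\,\ln w_N(x_1,\dots,x_N)\, d\tau_N

% Boltzmann H, built on the single-particle marginal w_1 of w_N:
H_B = N \int w_1(x_1)\,\ln w_1(x_1)\, d\tau_1

% With S = -kH, subadditivity of the entropy gives the basic inequality;
% equality holds only when the particles are uncorrelated, i.e., for an
% ideal gas, which is why the Boltzmann "entropy" errs whenever
% interparticle forces matter:
S_B = -k H_B \;\ge\; -k H_G = S_G
```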
In the writer's 1962 Brandeis lectures on statistical mechanics, the Gibbs and Boltzmann expressions for entropy were compared briefly, and it was stated that the Gibbs formula gives the correct entropy, as defined in phenomenological thermodynamics, while the Boltzmann H expression is correct only in the case of an ideal gas. However, there is a school of thought which holds that the Boltzmann expression is directly related to the entropy, and the Gibbs' one simply erroneous. This belief can be traced back to the famous Ehrenfest review article, which severely criticized Gibbs' methods. While it takes very little thought to see that objections to the Gibbs H are immediately refuted by the fact that the Gibbs canonical ensemble does yield correct thermodynamic predictions, discussion with a number of physicists has disclosed a more subtle, but more widespread, misconception. The basic inequality of the Gibbs and Boltzmann H functions, to be derived in Sec. II, was accepted as mathematically correct; but it was thought that, in consequence of the "laws of large numbers," the difference between them would be practically negligible in the limit of large systems. Now it is true that there are many different entropy expressions that go into substantially the same thing in this limit; several examples were given by Gibbs. However, the Boltzmann expression is not one of them; as we prove in Sec. III, the difference is a direct measure of the effect of interparticle forces on the potential energy and pressure, and increases proportionally to the size of the system. Failure to recognize the fundamental role of the Gibbs H function is closely related to a much deeper confusion about entropy, probability, and irreversibility in general.

As to the connections between entropy and information, in particular "subjective human ignorance," Jaynes says:

For example, the Boltzmann H theorem is almost universally equated to a demonstration of the second law of thermodynamics for dilute gases, while ever since the Ehrenfest criticisms, it has been claimed repeatedly that the Gibbs H cannot be related to the entropy because it is constant in time. Closer inspection reveals that the situation is very different. Merely to exhibit a mathematical quantity which tends to increase is not relevant to the second law unless one demonstrates that this quantity is related to the entropy as measured experimentally. But neither the Gibbs nor the Boltzmann H is so related for any distribution other than the equilibrium (i.e., canonical) one. Consequently, although Boltzmann's H theorem does show the tendency of a gas to go into a Maxwellian velocity distribution, this is not the same thing as the second law, which is a statement of experimental fact about the direction in which the observed macroscopic quantities (P, V, T) change.

Past attempts to demonstrate the second law for systems other than dilute gases have generally tried to retain the basic idea of the Boltzmann H theorem. Since the Gibbs H is dynamically constant, one has resorted to some kind of coarse-graining operation, resulting in a new quantity Ħ, which tends to decrease. Such attempts cannot achieve their purpose, because (a) mathematically, the decrease in Ħ is due only to the artificial coarse-graining operation and it cannot, therefore, have any physical significance; (b) as in the Boltzmann H theorem, the quantity whose increase is demonstrated is not the same thing as the entropy.
For the fine-grained and coarse-grained probability distributions lead to just the same predictions for the observed macroscopic quantities, which alone determine the experimental entropy; the difference between H and Ħ is characteristic, not of the macroscopic state, but of the particular way in which we choose to coarse-grain. Any really satisfactory demonstration of the second law must therefore be based on a different approach than coarse-graining.

Actually, a demonstration of the second law, in the rather specialized situation visualized in the aforementioned attempts, is much simpler than any H theorem. Once we accept the well-established proposition that the Gibbs canonical ensemble does yield the correct equilibrium thermodynamics, then there is logically no room for any assumption about which quantity represents entropy; it is a question of mathematically demonstrable fact. But as soon as we have understood the relation between Gibbs' H and the experimental entropy, Eq. (17) below, it is immediately obvious that the constancy of Gibbs' H, far from creating difficulties, is precisely the dynamical property we need for the proof.

It is interesting that, although this field has long been regarded as one of the most puzzling and controversial parts of physics, the difficulties have not been mathematical. Each of the above assertions is proved below or in the Brandeis lectures, using only a few lines of elementary mathematics, all of which was given by Gibbs. It is the enormous conceptual difficulty of this field which has retarded progress for so long. Readers not familiar with recent developments may, I hope, be pleasantly surprised to see how clear and basically simple these problems have now become, in several respects. However, as we will see, there are still many complications and unsolved problems.

Inspection of several statistical mechanics textbooks showed that, while most state the formal relations correctly, their full implications are never noted. Indeed, while all textbooks give extensive discussions of Boltzmann's H, some recent ones fail to mention even the existence of the Gibbs H. I was unable to find any explicit mathematical demonstration of their difference. It appeared, therefore, that the following note might be pedagogically useful.
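Jaynes's complaint, that the decrease of Ħ is an artifact of the graining, can be illustrated numerically. The following sketch is an invented toy demonstration, not a calculation from the paper: it pushes a cloud of phase-space points through Arnold's cat map, an area-preserving map standing in for Liouville flow. The fine-grained distribution is merely rearranged, so the Gibbs entropy cannot change, while the histogrammed (coarse-grained) entropy climbs toward a ceiling set entirely by the bin size we happen to choose.

```python
import numpy as np

rng = np.random.default_rng(0)

# A cloud of points concentrated in a small patch of the unit torus,
# standing in for a low-entropy initial distribution.
pts = rng.uniform(0.0, 0.05, size=(100_000, 2))

def cat_map(xy):
    """Arnold's cat map: an area-preserving (measure-preserving) map
    on the unit torus, a discrete stand-in for Liouville flow."""
    x, y = xy[:, 0], xy[:, 1]
    return np.stack([(2 * x + y) % 1.0, (x + y) % 1.0], axis=1)

def coarse_entropy(xy, bins=20):
    """Shannon entropy of the coarse-grained (histogrammed) distribution."""
    h, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=bins,
                             range=[[0, 1], [0, 1]])
    p = h.flatten() / h.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

# The fine-grained (Gibbs) entropy is invariant under the map, which only
# relabels phase-space points. The coarse-grained entropy is not:
for step in range(8):
    print(f"step {step}: coarse-grained S = {coarse_entropy(pts):.3f}"
          f"  (max = ln 400 = {np.log(400):.3f})")
    pts = cat_map(pts)
```

With 20 x 20 bins the coarse entropy saturates near ln 400; with finer bins it would climb longer and saturate higher, which is exactly the sense in which the increase characterizes the graining rather than the macroscopic state.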
The phase volume W0 therefore describes the full range of possible initial microstates, and not some arbitrary subset of them; this is the basic justification for using the canonical distribution to describe partial information. On the "subjective" side, we can therefore say that W0 measures our degree of ignorance as to the true unknown microstate, when the only information we have consists of the macroscopic thermodynamic parameters; a remark first made by Boltzmann.

According to Jaynes (and Gibbs), information is conserved when macroscopic order disappears because it simply changes into microscopic (thus invisible) order, as the path information of all the gas particles is preserved. As Boltzmann's mentor Joseph Loschmidt had argued in the early 1870s, if the velocities of all the particles could be reversed at an instant, the future evolution of the gas would move in the direction of decreasing entropy. All the original order would reappear. This is consistent with the idea of Pierre-Simon Laplace's super-intelligent demon and completely deterministic laws of nature. It also follows from the Liouville theorem that the hypervolume of a cloud of points in phase space is a constant as the system evolves.

Classical mechanics and physical determinism were shown to be only an approximation for large numbers of particles shortly after Gibbs's death by Albert Einstein and the later "founders" of quantum mechanics. When quantum effects are included in the collisions of gas particles, Boltzmann's idea of "molecular disorder" is seen to be correct, and path information is destroyed. Nevertheless, Gibbs's idea of the conservation of information is still widely held today by mathematical physicists. And most texts on statistical mechanics still claim that microscopic collisions between particles are reversible. Some explicitly claim that quantum mechanics changes nothing, but that is because they limit themselves to the unitary (conservative and deterministic) evolution of the Schrödinger equation and ignore the collapse of the wave function.

For example, Richard Tolman (p. 8) claimed that the "principle of dynamical reversibility" holds also in quantum mechanics in appropriate form, indicating that quantum theory supplies no new kind of element for understanding the actual irreversibility in the macroscopic behavior of physical systems. This is because both classical and quantum statistical mechanics describe ensembles of systems. Such ensembles are described by "mixed states," disregarding the interference terms in the density matrix of the "pure states" density operator. This is the basis for decoherence theories. The origin of irreversibility depends on the ontological chance involved in von Neumann's Process 1, Dirac's projection postulate, the "collapse of the wave function," denied by so many interpretations of quantum mechanics and ignored in statistical mechanics texts. In her 2008 book (p. 678), Carolyne M. Van Vliet says that the theory of non-equilibrium statistical mechanics is incomplete without some kind of randomization at the microscopic level.
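Loschmidt's reversal is easy to exhibit for a small classical system. In the sketch below (an invented toy model; the harmonic trap and soft repulsion are chosen only for convenience), a cluster of particles is integrated forward with the time-reversible velocity Verlet scheme, all velocities are then flipped, and the same integration is run again: the trajectory retraces itself and the initial configuration reappears, up to floating-point rounding.

```python
import numpy as np

rng = np.random.default_rng(1)

# A small cluster of particles in a harmonic trap with soft pairwise
# repulsion: an invented toy system, used only because velocity Verlet
# integrates it in a time-reversible way.
n, dt, steps = 16, 1e-3, 5000
pos = rng.normal(0.0, 1.0, (n, 2))
vel = rng.normal(0.0, 1.0, (n, 2))

def forces(p):
    """Harmonic trap plus softened repulsion between all pairs."""
    f = -p.copy()                      # trap: F = -x
    d = p[:, None, :] - p[None, :, :]  # pairwise displacement vectors
    r2 = (d ** 2).sum(-1) + 0.1        # softened to avoid singularities
    f += (d / r2[:, :, None] ** 1.5).sum(axis=1)
    return f

def verlet(p, v, nsteps):
    """Velocity Verlet: symplectic and exactly time-reversible."""
    f = forces(p)
    for _ in range(nsteps):
        v = v + 0.5 * dt * f
        p = p + dt * v
        f = forces(p)
        v = v + 0.5 * dt * f
    return p, v

p0 = pos.copy()
p1, v1 = verlet(pos, vel, steps)   # run forward
p2, _ = verlet(p1, -v1, steps)     # Loschmidt: reverse all velocities
print("max deviation from initial positions:", np.abs(p2 - p0).max())
```

In exact arithmetic the reversal is exact; the printed deviation is pure rounding error. This is the fully deterministic classical picture that, on the author's account, quantal interactions during real collisions destroy.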
Microscopic physics is irreversible as a consequence of ontological indeterminacy. Jaynes likely did not accept the collapse of the quantum mechanical wave function. He was strongly influenced by Eugene Wigner, who was an early denier of the projection postulate and a supporter of the unitary evolution of the universal wave function. Jaynes says:
I have profited from discussions of these problems, over many years, with Professor E. P. Wigner, from whom I first heard the remark, "Entropy is an anthropomorphic concept."

Jaynes' view (and Gibbs') is philosophical determinism, according to which information about the universe at one time gives us the information at all times (Laplace's demon). Boltzmann, like Maxwell before him (and Exner and Schrödinger after him, at least initially), knew that determinism cannot be proven by any experimental results, which are necessarily only statistical and never certain. Jaynes is correct that there is a strong relationship between the physics of statistical mechanics and the mathematics of information theory.
References

Doyle, Robert O. 2014. "The Origin of Irreversibility."
Jaynes, E. T. 1965. "Gibbs vs Boltzmann Entropies." American Journal of Physics 33 (5): 391-398.
Jaynes, E. T. 1992. "The Gibbs Paradox." In Maximum Entropy and Bayesian Methods, 1-21. Springer Netherlands.
Ter Haar, D. 1995. Elements of Statistical Mechanics, Third Edition. Oxford: Butterworth-Heinemann.
Tolman, Richard C. 2010. The Principles of Statistical Mechanics. New York: Dover Publications.
Van Vliet, Carolyne M. 2008. Equilibrium and Non-Equilibrium Statistical Mechanics. Singapore; Hackensack, NJ: World Scientific Publishing Company.