Dirk ter Haar
(1919-2002)
Dirk ter Haar was a Dutch physicist who studied physics in Scotland (St. Andrews) and England (Oxford).
During the last years of World War II he attended
Hendrik Kramers' famous lectures on statistical mechanics, which led to the great school of Dutch statistical physicists (F. J. Belinfante, Max Dresden, Nico van Kampen, Abraham Pais).
After the war he studied at
Niels Bohr's Institute in Copenhagen, then returned to Leiden to receive his Ph.D. under Hendrik Kramers.
His academic career was spent largely at the University of Oxford. One of his students was the historian of statistical physics Stephen Brush.
He wrote a number of important books on statistical mechanics in the 1950s and one on the Old Quantum Theory in 1967. Perhaps his most famous book was the 1954
Elements of Statistical Mechanics, which ter Haar said was largely a new version of the Kramers lectures of 1944-45.
ter Haar considered the implications of quantum mechanics for statistical mechanics. Many statistical physicists argued that quantum mechanics requires no changes to the conclusions of classical thermodynamics and statistical mechanics. These thinkers tended to be
determinists who were uncomfortable with
Werner Heisenberg's claim that quantum mechanics had eliminated
causality in physics.
ter Haar also challenged the applicability of
Claude Shannon's information theory and
Norbert Wiener's cybernetics to statistical mechanics. But he accepted
Leo Szilard's analysis of Maxwell's demon for a single particle, showing how a measurement of a single particle involves k ln 2 of entropy, or one bit of information, and
Leon Brillouin's analysis, in which Brillouin coined the term "negentropy."
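As a rough numerical illustration of Szilard's single-particle result, the short Python sketch below evaluates k ln 2 and the corresponding energy per bit, kT ln 2, at an assumed room temperature of 300 K (the temperature and the script itself are illustrative additions, not part of ter Haar's text).

from math import log

k_B = 1.380649e-23            # Boltzmann constant, J/K
delta_S = k_B * log(2)        # Szilard's entropy per bit of information, ~9.6e-24 J/K

T = 300.0                     # assumed room temperature, K
E_per_bit = k_B * T * log(2)  # corresponding energy scale per bit, ~2.9e-21 J

print(f"k ln 2  = {delta_S:.3e} J/K")
print(f"kT ln 2 = {E_per_bit:.3e} J at T = {T:.0f} K")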
The relationship between entropy and information arises because Boltzmann's formula for entropy is identical in form to Shannon's formula, minus sign included, apart from the factor of Boltzmann's constant k:
S = −k ∑ pᵢ ln pᵢ.
If all the pᵢ are equal, this reduces to S = k ln W.
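A quick numerical check of this reduction, as a minimal Python sketch (the six-state example distribution is an arbitrary illustrative choice):

from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    # S = -k * sum of p_i ln p_i, skipping zero-probability states
    return -k_B * sum(p * log(p) for p in probs if p > 0)

W = 6
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform))   # equals k ln W when all probabilities are equal
print(k_B * log(W))             # same value, ~2.47e-23 J/K

# A sharply peaked ("information-bearing") distribution gives a lower entropy.
print(gibbs_entropy([0.75, 0.05, 0.05, 0.05, 0.05, 0.05]))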
Information is neither matter nor energy, but where an information structure is present, entropy is low and Gibbs free energy is high.
ter Haar wrote:
The relationship between entropy and lack of information has led many authors, notably Shannon, to introduce “entropy” as a measure for the information transmitted by cables and so on, and in this way entropy has figured largely in recent discussions in information theory.
It must be stressed here that the entropy introduced in information theory is not a thermodynamical quantity and that the use of the same term is rather misleading. It was probably introduced because of a rather loose use of the term “information.”
In this connection we may briefly discuss Maxwell’s demon. Maxwell introduced in 1871 his famous demon, “a being whose faculties are so sharpened that he can follow every molecule in its course, and would be able to do what is at present impossible to us. . . . Let us suppose that a vessel is divided into two portions A and B by a division in which there is a small hole, and that a being who can see the individual molecules opens and closes this hole, so as to allow only the swifter molecules to pass from A to B, and only the slower ones to pass from B to A. He will, thus, without expenditure of work raise the temperature of B and lower that of A, in contradiction to the second law of thermodynamics.”
Maxwell’s demon has been widely discussed and various authors have set out to show that various attempts to circumvent the second law by using the demon are bound to fail. Although their discussions differ in some respects they have a few points in common. The first point is the observation that one should take the demon to be part of the total system and then one must consider the total entropy of the original system and the demon. The second point which was most clearly developed for the first time by Szilard is that the demon, in order to be able to operate the trapdoor through which the molecules pass, must receive information. Its own entropy increases therefore and it is now the question whether the increase of the demon’s entropy is smaller or larger than the decrease of the entropy of the gas. Both Szilard and Brillouin consider possible arrangements and show that in those cases the net change of entropy is positive. Szilard analyzes the problem very thoroughly and shows that one can describe a generalized Maxwell’s demon as follows. By some means an operation on a system is determined by the result of a measurement on the system which immediately precedes the operation. In Maxwell’s original scheme the operation was the opening of the trapdoor and the measurement was the determination of the velocity of an approaching molecule. The result of the operation will be a decrease of entropy, but the preceding measurement will be accompanied by an increase in entropy, and once again one must consider the balance.
Wiener takes a simpler point of view. He considers the situation, where the demon acts, as a metastable state and writes: “In the long run, the Maxwell demon is itself subject to a random motion corresponding to the temperature of its environment and it receives a large number of small impressions until it falls into ‘a certain vertigo’ and is incapable of clear perceptions. In fact, it ceases to act as a Maxwell demon.”
This point of view is probably too simplified and we prefer that of Szilard’s and refer the reader to his paper for a more extensive discussion.
Elements of Statistical Mechanics, pp. 161-162
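Szilard's entropy bookkeeping for the generalized demon can be illustrated with a minimal Python sketch (it assumes the standard idealization that the measurement adds at least k ln 2 of entropy; this is an illustration, not a calculation from ter Haar's text):

from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Per cycle: the operation (sorting one molecule) can lower the entropy
# of the gas by k ln 2, but the measurement that triggers it raises the
# entropy of the demon and its apparatus by at least k ln 2, so the net
# change of the total system is never negative.
dS_gas = -k_B * log(2)
for dS_measurement in (k_B * log(2), 3.0 * k_B * log(2)):  # ideal and lossy measurements
    dS_total = dS_gas + dS_measurement
    print(f"measurement cost {dS_measurement:.2e} J/K -> net change {dS_total:.2e} J/K")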
We can note that in the most widely used modern textbook on statistical physics, Fred Reif says,
The quantity ∑ᵣ Pᵣ ln Pᵣ can be used as a measure of nonrandomness, or information, available about systems in the ensemble.
This function plays a key role as a measure of information in problems of communication and general “information theory.”*
* See, for example, L. Brillouin, “Science and Information Theory,” 2d ed., Academic Press, New York, 1962; or J. R. Pierce, “Symbols, Signals, and Noise,” Harper, New York, 1961. Statistical mechanics is considered from the point of view of information theory by E. T. Jaynes in Phys. Rev., vol. 106, p. 620 (1957).
Fundamentals of Statistical and Thermal Physics, p. 231