Leon Brillouin

In an important 1949 article entitled "Life, Thermodynamics, and Cybernetics," Brillouin was inspired by Norbert Wiener's new book Cybernetics and its connection of the new information theory with entropy and intelligence:
One of the most interesting parts in Wiener's Cybernetics is the discussion on "Time series, information, and communication," in which he specifies that a certain "amount of information is the negative of the quantity usually defined as entropy in similar situations."

This is a very remarkable point of view, and it opens the way for some important generalizations of the notion of entropy. Wiener introduces a precise mathematical definition of this new negative entropy for a certain number of problems of communication, and discusses the question of time prediction: when we possess a certain number of data about the behavior of a system in the past, how much can we predict of the behavior of that system in the future?

In addition to these brilliant considerations, Wiener definitely indicates the need for an extension of the notion of entropy. "Information represents negative entropy"; but if we adopt this point of view, how can we avoid its extension to all types of intelligence? We certainly must be prepared to discuss the extension of entropy to scientific knowledge, technical know-how, and all forms of intelligent thinking. Some examples may illustrate this new problem.

Take an issue of the New York Times, the book on Cybernetics, and an equal weight of scrap paper. Do they have the same entropy? According to the usual physical definition, the answer is "yes." But for an intelligent reader, the amount of information contained in the three bunches of paper is very different. If "information means negative entropy," as suggested by Wiener, how are we going to measure this new contribution to entropy? Wiener suggests some practical and numerical definitions that may apply to the simplest possible problem of this kind. This represents an entirely new field for investigation and a most revolutionary idea.

In his 1956 book Science and Information Theory, Leon Brillouin coined the term "negentropy" for the negative entropy (a characteristic of free or available energy, as opposed to heat energy in equilibrium). He then connected it to information in what he called the "negentropy principle of information."

Brillouin described his principle as a generalization of Carnot's principle, that in the normal evolution of any system, the change in the entropy is greater than or equal to zero.

ΔS ≥ 0      (1)

Any increase in information ΔI must be compensated by an at least equal increase in entropy, so the more general form of equation 1 is:

Δ(S - I) ≥ 0      (2)

New information can only be obtained at the expense of the negentropy of some other system. The principal source of negentropy for terrestrial life is the sun, which acquired its low-entropy state from the expanding universe, followed by the collapse of material particles under the force of gravity.
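The bookkeeping behind equation 2 can be illustrated numerically. A minimal sketch, assuming the standard conversion of one bit of information into thermodynamic entropy via k ln 2 (the example measurement numbers are hypothetical):

```python
import math

# Boltzmann's constant (J/K): converts information in bits to
# thermodynamic entropy via S = k * ln(2) * I_bits.
K_B = 1.380649e-23

def entropy_cost_of_bits(i_bits):
    """Minimum entropy increase (J/K) to pay for i_bits of information."""
    return K_B * math.log(2) * i_bits

# Acquiring one bit of information must be paid for by at least
# k ln 2 of entropy produced elsewhere, so that Delta(S - I) >= 0.
one_bit = entropy_cost_of_bits(1)
print(f"1 bit costs at least {one_bit:.3e} J/K of entropy")

# Toy measurement: 10 bits gained, with (assumed) twice that much
# entropy actually produced in the measuring apparatus.
delta_I = entropy_cost_of_bits(10)   # information gained, in entropy units
delta_S = 2 * delta_I                # entropy produced (assumed)
assert delta_S - delta_I >= 0        # the generalized Carnot inequality holds
```

The inequality only bounds the minimum price; real measuring devices, as Brillouin stressed, always pay more.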

Brillouin summarizes his ideas:

Acquisition of information about a physical system corresponds to a lower state of entropy for this system. Low entropy implies an unstable situation that will sooner or later follow its normal evolution toward stability and high entropy.

The second principle does not tell us anything about the time required, and hence we do not know how long the system will remember the information. But, if classical thermodynamics fails to answer this very important question, we can obtain the answer from a discussion of the molecular or atomic model, with the help of kinetic theory: the rate of attenuation of all sorts of waves, the rate of diffusion, the speed of chemical reactions, etc., can be computed from suitable models, and may vary from small fractions of a second to years or centuries.

These delays are used in all practical applications: it does not take very long for a system of pulses (representing dots and dashes, for instance) to be attenuated and forgotten, when sent along an electric cable, but this short time interval is long enough for transmission even over a long distance, and makes telecommunications possible.

A system capable of retaining information for some time can be used as a memory device in a computing machine. The examples discussed in the preceding section are not only interesting from a theoretical point of view, but they also show how to attack a practical problem. Let us consider, for instance, the problems of diffusion and spin distribution... The information stored in this system corresponds to a decrease in entropy. Our discussion shows how this situation is progressively destroyed by diffusion and collisions that increase the entropy and erase the information.

Entropy is usually described as measuring the amount of disorder in a physical system. A more precise statement is that entropy measures the lack of information about the actual structure of the system. This lack of information introduces the possibility of a great variety of microscopically distinct structures, which we are, in practice, unable to distinguish from one another. Since any one of these different microstructures can actually be realized at any given time, the lack of information corresponds to actual disorder in the hidden degrees of freedom.

This picture is clearly illustrated in the case of the ideal gas. When we specify the total number n of atoms, their mass m, their degeneracy factor g, and the total energy E..., we do not state the positions and velocities of each individual atom...Since we do not specify the positions and velocities of the atoms, we are unable to distinguish between two different samples of the gas, when the difference consists only in different positions and velocities for the atoms. Hence we can describe the situation as one of disordered atomic motion.
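The reading of entropy as missing information about microstates can be made concrete with Boltzmann's S = k ln W: the more microstates W compatible with the macroscopic data, the less we know about which one is actually realized. A minimal sketch (the microstate counts are illustrative, not those of a real gas):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(num_microstates):
    """S = k ln W: entropy from the count of compatible microstates."""
    return K_B * math.log(num_microstates)

def missing_information_bits(num_microstates):
    """Bits needed to pin down which microstate is actually realized."""
    return math.log2(num_microstates)

# Doubling the number of compatible microstates adds exactly one bit
# of missing information, and k ln 2 of entropy.
for w in (2, 4, 8):
    print(w, missing_information_bits(w), boltzmann_entropy(w))
```

The proportionality between the two columns is the formal content of "entropy measures the lack of information about the actual structure of the system."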

The origin of our modern ideas about entropy and information can be found in an old paper by Szilard, who did the pioneer work but was not well understood at the time. The connection between entropy and information was rediscovered by Shannon, but he defined entropy with a sign just opposite to that of the standard thermodynamical definition. Hence what Shannon calls entropy of information actually represents negentropy. This can be seen clearly in two examples (pages 27 and 61 of Shannon's book) where Shannon proves that in some irreversible processes (an irreversible transducer or a filter) his entropy of information is decreased. To obtain agreement with our conventions, reverse the sign and read negentropy.

The connection between entropy and information has been clearly discussed in some recent papers by Rothstein, in complete agreement with the point of view presented in this chapter.

On Measurement Errors and Determinism

Brillouin emphasizes that experimental errors are inevitable and that it is unscientific to think of infinite accuracy in any measurement. Max Born, Ludwig Boltzmann, and even Isaac Newton knew this to be the case.

Brillouin says that this makes strict determinism impossible in scientific predictions. Laplace's demon cannot acquire the infinite information needed to predict the future perfectly, just as Maxwell's demon cannot acquire the information needed to violate the second law, without destroying an equivalent amount of negentropy.

The natural evolution of any closed system involves a loss of information.

Mechanical laws are supposed to be reversible in time [This is said also of the unitary evolution of the Schrödinger equation in quantum mechanics], but this is true only if errors and experimental uncertainties are ignored.

The theory of information provides us with a means to define the amount of information obtained from a certain experiment, and to measure it in a precise way. We only need to know the field of uncertainty before and after the observation. The logarithm of the ratio of these two uncertainties yields the amount of information. If the final uncertainty is very small (very accurate measurement) the information obtained is very large.

The mathematician dreams of measurements of infinite accuracy, defining for instance the position of a point without any possible error. This would mean an experiment yielding an infinite amount of information and this is physically impossible. One of the most important results of the theory is known as the "negentropy principle of information." It states that any information obtained from an experiment must be paid for in negentropy.

A very large amount of information shall cost a very high price, in negentropy. An infinite amount of information is unattainable. An infinitely short distance cannot be measured, and a physical continuum in space and time is impossible to define physically.
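Brillouin's measure of the information yielded by a measurement is the logarithm of the ratio of the uncertainty before and after the observation. A minimal sketch in bits (the uncertainty intervals are hypothetical examples):

```python
import math

def information_gained(uncertainty_before, uncertainty_after):
    """Brillouin's measure: I = log2(U_before / U_after), in bits."""
    if uncertainty_after <= 0:
        # Zero final uncertainty would mean infinite information,
        # which the negentropy principle rules out as physically impossible.
        raise ValueError("infinite accuracy is unattainable")
    return math.log2(uncertainty_before / uncertainty_after)

# Measuring a position known to within 1 m down to 1 mm yields ~10 bits.
print(information_gained(1.0, 1e-3))   # log2(1000) ≈ 9.97 bits

# As the final uncertainty shrinks, the information (and its negentropy
# price) grows without bound.
print(information_gained(1.0, 1e-100))  # ≈ 332 bits, and climbing
```

The divergence of the logarithm as the final uncertainty goes to zero is the quantitative form of Brillouin's claim that a physical continuum cannot be defined physically.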

The role of experimental errors has been known for a very long time and was recognized by all scientists; but it was usually considered as a secondary effect, a source of nuisance that could be neglected in most occasions and should be ignored by the theory. The assumption was that errors could be made "as small as might be desired," by careful instrumentation, and played no essential role. This was the point of view of mathematicians discussing the axioms of geometry, and most physicists accepted, implicitly or explicitly, this kind of idealization. Modern physics had to get rid of these unrealistic schemes, and it was indispensable to recognize the fundamental importance of errors, together with the unpleasant fact that they cannot be made "as small as desired" and must be included in the theory.

The first instance was found in connection with statistical thermodynamics, but it was usually toned down and led to many (in our opinion often meaningless) discussions such as: how is it possible to obtain irreversible thermodynamics from strictly reversible mechanical laws? We shall come back to this problem when discussing the exact meaning of determinism and show that it corresponds to a metaphysical creed, not to a physical law.

With Heisenberg's uncertainty principle, the fundamental role of experimental errors became a basic feature of physics. An additional law was stated in Chapters 12 and 16 called the "negentropy principle of information." It states that an observation yields a certain amount of information ΔI, and that this information can be quantitatively measured and compared with the entropy increase ΔS during the experimental measurement. The net result is (in entropy units)

ΔS ≥ ΔI      or      ΔI + ΔN ≤ 0

where

ΔN = -ΔS ≤ 0

is the change in the neg(ative) entropy.

The Problem of Determinism

The laws of classical mechanics represent a mathematical idealization and should not be assumed to correspond to the real laws of nature. In many problems (astronomy, for instance) they yield wonderful results that agree with observation within experimental errors. In other fields they had to be amended (relativity, quantum mechanics). The classical viewpoint was to ignore the actual role and importance of experimental errors. Errors were assumed to be accidental; hence, it was always imagined that they could be made as small as one wished and finally ignored. This oversimplified picture led to the assumption of complete determinism in classical mechanics. We now have to realize that experimental errors are inevitable, a discovery that makes strict determinism impossible. Errors are an essential part of the world's picture and must be included in the theory.

Causality must be replaced by statistical probabilities; a scientist may or may not believe in determinism. It is a matter of faith, and belongs to metaphysics. Physical discussions are unable to prove or to disprove it. This general viewpoint may be called the "matter of fact" position.

M. Born states the situation very clearly. He quotes Einstein as saying that before quantum mechanics, it was assumed that "everything was to be reduced to objects situated in space-time, and to strict relations between these objects ... Nothing appeared to refer to our empirical knowledge about these objects... This is what was meant by a physical description of a real external world." This position appears as untenable in modern physics. We have no way to prove the existence of such a real external world, and it is very dangerous to speak of something we cannot observe. If we restrain our thinking to observable facts, we can only speak of possible relations between a certain experiment and another one, but we should never discuss what happens while we are not making any observation; we must candidly admit that we do not know (no more than we know what happens on the other side of the moon). The position defined in this way is taken by M. Born and agrees with the philosophy of science stated by the Vienna school.

Is such a viewpoint accepted by all physicists? The answer is far from clear. Pure mathematicians have great difficulty in agreeing with this inclusion of errors within the theory, and many theoretical physicists are still mathematicians at heart. The uncertainty relations of Bohr and Heisenberg are based upon the kind of thinking we tried to define. But when one looks at the further expansion of quantum theories, one is amazed at the many fancy visualizations describing physics in terms of unobservable entities. The language of physicists is loaded with a jargon understandable only to specialists; special names have been coined for terms in a series of approximations, as if each isolated term had a meaning (exchange terms, pair creation, virtual creation, and absorption of particles, etc.). Actually, only the final sum matters. Wise men know where and how to use these figures of language, and they are aware of their complete lack of reality. They realize that the jargon represents no more than an artificial way of describing complicated equations; but many physicists may be misled by such methods, which are really dangerous. In brief, quantum theory pays lip service to the sound principle of matter-of-fact descriptions, but soon forgets about it and uses a very careless language.

Besides mathematicians and quantum theoreticians, many scientists feel very reluctant to face the situation described above and to abandon old-fashioned ideas. They still believe in a real physical world following its own unperturbed evolution, whether we observe it or not. In order to reconcile this view with recent physical discoveries, they have to invent the existence of a number of "hidden variables" that we are unable to observe at present. In our opinion these hidden variables may do more harm than good. If we cannot observe them, let us admit that they have no reality and may exist only in the imagination of their authors. This is not meant as sarcasm. Imagination is absolutely needed in scientific research, and many important discoveries were, at the beginning, pure works of imagination; they became important only later when experimental proof was obtained and checked with results predicted by pure imagination. Finally, the new experimental discoveries became the scientific basis for the part that had been verified by experiment.

Borel and the gram of matter on Sirius
In his 1964 book, Scientific Uncertainty, and Information, Brillouin cited Emile Borel (Introduction géométrique à quelques théories physiques, 1914, p.94) as explaining how an external disturbance could randomize the motions of molecules in a terrestrial gas.
C. It is impossible to study the properties of a single (mathematical) trajectory. The physicist knows only bundles of trajectories, corresponding to slightly different initial conditions.
Note that it is Brillouin, not Borel, who suggests Sirius
Borel, for instance, computed that a displacement of 1 cm, on a mass of 1 gram, located somewhere in a not too distant star (say, Sirius) would change the gravitational field on the earth by a fraction 10⁻¹⁰⁰. The present author went further and proved that any information obtained from an experiment must be paid for by a corresponding increase of entropy in the measuring device: infinite accuracy would cost an infinite amount of entropy increase and require infinite energy! This is absolutely unthinkable.

D. Let us simplify the problem, and assume that the laws of mechanics are rigorous, while experimental errors appear only in the determination of initial conditions. In the bundle of trajectories defined by these conditions, some may be "nondegenerate" while others may "degenerate." The bundle may soon explode, be divided into a variety of smaller bundles forging ahead in different directions. This is the case for a model corresponding to the kinetic theory of gases. Borel computes that errors of 10⁻¹⁰⁰ on initial conditions will enable one to predict molecular collisions for a split second and no more. It is not only "very difficult," but actually impossible to predict exactly the future behavior of such a model. The present considerations lead directly to Boltzmann's statistical mechanics and the so-called "ergodic" theorem.
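Borel's point is one of exponential error amplification: each molecular collision roughly multiplies the uncertainty in a molecule's trajectory, so even an initial error of 10⁻¹⁰⁰ destroys predictability after a few dozen collisions. A toy sketch, where the per-collision amplification factor and collision rate are illustrative assumptions, not Borel's own figures:

```python
import math

def collisions_until_error_of_order_one(initial_error, amplification):
    """Collisions until a geometrically amplified error reaches ~1.

    Solves initial_error * amplification**n = 1 for n.
    """
    return math.log(1.0 / initial_error) / math.log(amplification)

initial_error = 1e-100  # Borel's gravitational perturbation from Sirius
amplification = 100.0   # assumed error-growth factor per collision

n = collisions_until_error_of_order_one(initial_error, amplification)
print(f"error reaches order 1 after ~{n:.0f} collisions")  # ~50 collisions

# At roughly 10^9 collisions per second per molecule (a typical order of
# magnitude for a gas), 50 collisions take ~5e-8 s: prediction fails
# in a split second, as Borel computed.
collision_rate = 1e9
print(f"prediction horizon ~ {n / collision_rate:.1e} seconds")
```

Changing the assumed amplification factor shifts the count only logarithmically, which is why the conclusion is robust: no realistic reduction of the initial error rescues long-term prediction.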
