Leon Brillouin

In an important 1949 article entitled "Life, Thermodynamics, and Cybernetics," Brillouin was inspired by Norbert Wiener's new book Cybernetics and its connection of the new information theory with entropy and intelligence:
One of the most interesting parts in Wiener's Cybernetics is the discussion on "Time series, information, and communication," in which he specifies that a certain "amount of information is the negative of the quantity usually defined as entropy in similar situations."

This is a very remarkable point of view, and it opens the way for some important generalizations of the notion of entropy. Wiener introduces a precise mathematical definition of this new negative entropy for a certain number of problems of communication, and discusses the question of time prediction: when we possess a certain number of data about the behavior of a system in the past, how much can we predict of the behavior of that system in the future?

"Intelligence" was what Claude Shannon's early papers said was being transmitted, viz. knowledge, or "know-how."

In addition to these brilliant considerations, Wiener definitely indicates the need for an extension of the notion of entropy. "Information represents negative entropy"; but if we adopt this point of view, how can we avoid its extension to all types of intelligence? We certainly must be prepared to discuss the extension of entropy to scientific knowledge, technical know-how, and all forms of intelligent thinking. Some examples may illustrate this new problem.

Compare Anthony Cashmore's view that a human being is no more than a "bag of chemicals," or Libb Thims' Hmolpedia idea that we are just "human molecules."

Take an issue of the New York Times, the book on Cybernetics, and an equal weight of scrap paper. Do they have the same entropy? According to the usual physical definition, the answer is "yes." But for an intelligent reader, the amount of information contained in the three bunches of paper is very different. If "information means negative entropy," as suggested by Wiener, how are we going to measure this new contribution to entropy? Wiener suggests some practical and numerical definitions that may apply to the simplest possible problem of this kind. This represents an entirely new field for investigation and a most revolutionary idea.

In his 1956 book Science and Information Theory, Leon Brillouin coined the term "negentropy" for negative entropy (a characteristic of free or available energy, as opposed to heat energy in equilibrium). He then connected it to information in what he called the "negentropy principle of information."

Brillouin described his principle as a generalization of Carnot's principle that, in the normal evolution of any closed system, the change in entropy is greater than or equal to zero:

ΔS ≥ 0      (1)

Any increase in information ΔI must be compensated by an at least equal increase in entropy, so the more general form of equation 1 is:

Δ(S - I) ≥ 0      (2)

New information can only be obtained at the expense of the negentropy of some other system. The principal source of negentropy for terrestrial life is the sun, which acquired its low entropy state from the expanding universe followed by the collapse of material particles under the force of gravity.
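
A minimal numerical sketch of this bookkeeping, assuming only the conversion of one bit of information into entropy units (k ln 2) that Brillouin himself uses, may make equation 2 concrete: gaining a single bit must be paid for by an entropy increase of at least k ln 2 somewhere else, or roughly kT ln 2 of dissipated energy at temperature T. The temperature below is an illustrative assumption.

# Illustrative arithmetic only: the minimum entropy price of one bit of
# information under the negentropy principle, Δ(S - I) ≥ 0, and the
# corresponding minimum energy dissipated at an assumed room temperature.
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                     # assumed temperature, K

delta_I = k_B * math.log(2)   # one bit of information in entropy units, J/K
min_delta_S = delta_I         # entropy elsewhere must rise by at least this much
min_energy = T * min_delta_S  # minimum heat dissipated, J

print(f"one bit in entropy units: {delta_I:.3e} J/K")          # about 9.6e-24 J/K
print(f"minimum energy cost at {T:.0f} K: {min_energy:.3e} J")  # about 2.9e-21 J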

Brillouin summarizes his ideas:

Acquisition of information about a physical system corresponds to a lower state of entropy for this system. Low entropy implies an unstable situation that will sooner or later follow its normal evolution toward stability and high entropy.

The second principle does not tell us anything about the time required, and hence we do not know how long the system will remember the information. But, if classical thermodynamics fails to answer this very important question, we can obtain the answer from a discussion of the molecular or atomic model, with the help of kinetic theory: the rate of attenuation of all sorts of waves, the rate of diffusion, the speed of chemical reactions, etc., can be computed from suitable models, and may vary from small fractions of a second to years or centuries.

These delays are used in all practical applications: it does not take very long for a system of pulses (representing dots and dashes, for instance) to be attenuated and forgotten, when sent along an electric cable, but this short time interval is long enough for transmission even over a long distance, and makes telecommunications possible.

A system capable of retaining information for some time can be used as a memory device in a computing machine. The examples discussed in the preceding section are not only interesting from a theoretical point of view, but they also show how to attack a practical problem. Let us consider, for instance, the problems of diffusion and spin distribution... The information stored in this system corresponds to a decrease in entropy. Our discussion shows how this situation is progressively destroyed by diffusion and collisions that increase the entropy and erase the information.

Entropy is usually described as measuring the amount of disorder in a physical system. A more precise statement is that entropy measures the lack of information about the actual structure of the system. This lack of information introduces the possibility of a great variety of microscopically distinct structures, which we are, in practice, unable to distinguish from one another. Since any one of these different microstructures can actually be realized at any given time, the lack of information corresponds to actual disorder in the hidden degrees of freedom.

This picture is clearly illustrated in the case of the ideal gas. When we specify the total number n of atoms, their mass m, their degeneracy factor g, and the total energy E..., we do not state the positions and velocities of each individual atom... Since we do not specify the positions and velocities of the atoms, we are unable to distinguish between two different samples of the gas, when the difference consists only in different positions and velocities for the atoms. Hence we can describe the situation as one of disordered atomic motion.

The origin of our modern ideas about entropy and information can be found in an old paper by Szilard [5], who did the pioneer work but was not well understood at the time. The connection between entropy and information was rediscovered by Shannon [6], but he defined entropy with a sign just opposite to that of the standard thermodynamical definition. Hence what Shannon calls entropy of information actually represents negentropy. This can be seen clearly in two examples (pages 27 and 61 of Shannon's book) where Shannon proves that in some irreversible processes (an irreversible transducer or a filter) his entropy of information is decreased. To obtain agreement with our conventions, reverse the sign and read negentropy.

The connection between entropy and information has been clearly discussed in some recent papers by Rothstein [7] in complete agreement with the point of view presented in this chapter.

Of course Brillouin should not be comparing information to classical thermodynamics, which has no concept of multiple possibilities with different probabilities, nor of the "logarithm of probabilities" that became entropy in the statistical mechanics of Boltzmann and Gibbs.
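
For readers who want the sign conventions spelled out, here is a short sketch (not from Brillouin's text; the example distribution is invented) comparing Shannon's information entropy H = -Σ p log₂ p with the Gibbs statistical-mechanical entropy S = -k Σ p ln p. The two expressions have the same form and differ only by the constant factor k ln 2; a decrease in H when a distribution is narrowed is what Brillouin counts as information gained, i.e., negentropy.

# Sketch comparing Shannon entropy (bits) with Gibbs entropy (J/K) for the
# same probability distribution; the distribution itself is an assumed example.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_H(p):
    # Shannon entropy in bits: H = -sum of p_i * log2(p_i)
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_S(p):
    # statistical-mechanical entropy in J/K: S = -k_B * sum of p_i * ln(p_i)
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]            # four microstates, assumed probabilities
print(shannon_H(p))                       # 1.75 bits
print(gibbs_S(p) / (k_B * math.log(2)))   # the same 1.75, recovered from the Gibbs form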

On Measurement Errors and Determinism

Brillouin emphasizes that experimental errors are inevitable and that it is unscientific to think of infinite accuracy in any measurement. Max Born, Ludwig Boltzmann, and even Isaac Newton knew this to be the case.

Brillouin says that this makes strict determinism impossible in scientific predictions. Laplace's demon cannot acquire the infinite information needed to predict the future perfectly, just as Maxwell's demon cannot acquire the information needed to violate the second law without destroying an equivalent amount of negentropy.

The natural evolution of any closed system involves a loss of information.

Mechanical laws are supposed to be reversible in time [This is said also of the unitary evolution of the Schrödinger equation in quantum mechanics], but this is true only if errors and experimental uncertainties are ignored.

The theory of information provides us with a possibility...to define the amount of information obtained from a certain experiment, and to measure it in a precise way. We only need to know the field of uncertainty - before and after the observation. The logarithm of the ratio of these two uncertainties yields the amount of information. If the final uncertainty is very small (very accurate measurement) the information obtained is very large.

The mathematician dreams of measurements of infinite accuracy, defining for instance the position of a point without any possible error. This would mean an experiment yielding an infinite amount of information and this is physically impossible. One of the most important results of the theory is known as the "negentropy principle of information." It states that any information obtained from an experiment must be paid for in negentropy.

A very large amount of information shall cost a very high price, in negentropy. An infinite amount of information is unattainable. An infinitely short distance cannot be measured, and a physical continuum in space and time is impossible to define physically.
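
A small worked example of this log-ratio definition, with invented numbers for the "field of uncertainty" before and after a measurement, shows both how the information is computed and why infinite accuracy would demand infinite information.

# Hedged illustration of information = log(uncertainty before / uncertainty after).
# The measurement ranges below are invented for the example.
import math

uncertainty_before = 1.0     # position known only to within 1 m
uncertainty_after = 1.0e-3   # measured to within 1 mm

info_bits = math.log2(uncertainty_before / uncertainty_after)
print(f"information gained: about {info_bits:.2f} bits")   # about 10 bits

# As uncertainty_after approaches 0 the logarithm diverges: a measurement of
# infinite accuracy would yield infinite information, which the negentropy
# principle says would have to be paid for with infinite negentropy.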

The role of experimental errors has been known for a very long time and was recognized by all scientists; but it was usually considered as a secondary effect, a source of nuisance that could be neglected in most occasions and should be ignored by the theory. The assumption was that errors could be made "as small as might be desired," by careful instrumentation, and played no essential role. This was the point of view of mathematicians discussing the axioms of geometry, and most physicists accepted, implicitly or explicitly, this kind of idealization. Modern physics had to get rid of these unrealistic schemes, and it was indispensable to recognize the fundamental importance of errors, together with the unpleasant fact that they cannot be made "as small as desired" and must be included in the theory.

The first instance was found in connection with statistical thermodynamics, but it was usually toned down and led to many (in our opinion often meaningless) discussions such as: how is it possible to obtain irreversible thermodynamics from strictly reversible mechanical laws? We shall come back to this problem when discussing the exact meaning of determinism and show that it corresponds to a metaphysical creed, not to a physical law.

With Heisenberg's uncertainty principle, the fundamental role of experimental errors became a basic feature of physics. An additional law was stated in Chapters 12 and 16 called the "negentropy principle of information." It states that an observation yields a certain amount of information ΔI, and that this information can be quantitatively measured and compared with the entropy increase ΔS during the experimental measurement. The net result is (in entropy units)

ΔS ≥ ΔI      or      ΔI + ΔN ≤ 0

with

ΔN = -ΔS ≤ 0

neg(ative) entropy.

The Problem of Determinism

The laws of classical mechanics represent a mathematical idealization and should not be assumed to correspond to the real laws of nature. In many problems (astronomy, for instance) they yield wonderful results that agree with observation within experimental errors. In other fields they had to be amended (relativity, quantum mechanics). The classical viewpoint was to ignore the actual role and importance of experimental errors. Errors were assumed to be accidental; hence, it was always imagined that they could be made as small as one wished and finally ignored. This oversimplified picture led to the assumption of complete determinism in classical mechanics. We now have to realize that experimental errors are inevitable, a discovery that makes strict determinism impossible. Errors are an essential part of the world's picture and must be included in the theory.

Causality must be replaced by statistical probabilities; a scientist may or may not believe in determinism. It is a matter of faith, and belongs to metaphysics. Physical discussions are unable to prove or to disprove it. This general viewpoint may be called the "matter of fact" position.

M. Born states very clearly the situation. He quotes Einstein as saying that before quantum mechanics, it was assumed that "everything was to be reduced to objects situated in space-time, and to strict relations between these objects ... Nothing appeared to refer to our empirical knowledge about these objects... This is what was meant by a physical description of a real external world." This position appears as untenable in modern physics. We have no way to prove the existence of such a real external world, and it is very dangerous to speak of something we cannot observe. If we restrain our thinking to observable facts, we can only speak of possible relations between a certain experiment and another one, but we should never discuss what happens while we are not making any observation; we must candidly admit that we do not know (no more than we know what happens on the other side of the moon). The position defined in this way is taken by M. Born and agrees with the philosophy of science stated by the Vienna school.

Is such a viewpoint accepted by all physicists? The answer is far from clear. Pure mathematicians have great difficulty in agreeing with this inclusion of errors within the theory, and many theoretical physicists are still mathematicians at heart. The uncertainty relations of Bohr and Heisenberg are based upon the kind of thinking we tried to define. But when one looks at the further expansion of quantum theories, one is amazed at the many fancy visualizations describing physics in terms of unobservable entities. The language of physicists is loaded with a jargon understandable only to specialists; special names have been coined for terms in a series of approximations, as if each isolated term had a meaning (exchange terms, pair creation, virtual creation, and absorption of particles, etc.). Actually, only the final sum matters. Wise men know where and how to use these figures of language, and they are aware of their complete lack of reality. They realize that the jargon represents no more than an artificial way of describing complicated equations; but many physicists may be misled by such methods, which are really dangerous. In brief, quantum theory pays lip service to the sound principle of matter-of-fact descriptions, but soon forgets about it and uses a very careless language.

Besides mathematicians and quantum theoreticians, many scientists feel very reluctant to face the situation described above and to abandon old-fashioned ideas. They still believe in a real physical world following its own unperturbed evolution, whether we observe it or not. In order to reconcile this view with recent physical discoveries, they have to invent the existence of a number of "hidden variables" that we are unable to observe at present. In our opinion these hidden variables may do more harm than good. If we cannot observe them, let us admit that they have no reality and may exist only in the imagination of their authors. This is not meant to be a sarcasm. Imagination is absolutely needed in scientific research, and many important discoveries were, at the beginning, pure works of imagination; they became important only later when experimental proof was obtained and checked with results predicted by pure imagination. Finally, the new experimental discoveries became the scientific basis for the part that had been verified by experiment.

Borel and the gram of matter on Sirius
In his 1964 book, Scientific Uncertainty, and Information, Brillouin cited Émile Borel (Introduction géométrique à quelques théories physiques, 1914, p. 94) as explaining how an external disturbance could randomize the motions of molecules in a terrestrial gas.
C. It is impossible to study the properties of a single (mathematical) trajectory. The physicist knows only bundles of trajectories, corresponding to slightly different initial conditions.
Note that it is Brillouin, not Borel, who suggests Sirius.
Borel, for instance, computed that a displacement of 1 cm, on a mass of 1 gram, located somewhere in a not too distant star (say, Sirius) would change the gravitational field on the earth by a fraction 10⁻¹⁰⁰. The present author went further and proved that any information obtained from an experiment must be paid for by a corresponding increase of entropy in the measuring device: infinite accuracy would cost an infinite amount of entropy increase and require infinite energy! This is absolutely unthinkable.

D. Let us simplify the problem, and assume that the laws of mechanics are rigorous, while experimental errors appear only in the determination of initial conditions. In the bundle of trajectories defined by these conditions, some may be "nondegenerate" while others may "degenerate." The bundle may soon explode, be divided into a variety of smaller bundles forging ahead in different directions. This is the case for a model corresponding to the kinetic theory of gases. Borel computes that errors of 10⁻¹⁰⁰ on initial conditions will enable one to predict molecular collisions for a split second and no more. It is not only "very difficult," but actually impossible to predict exactly the future behavior of such a model. The present considerations lead directly to Boltzmann's statistical mechanics and the so-called "ergodic" theorem.
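
A rough order-of-magnitude sketch of Borel's amplification argument follows; the gas parameters (mean free path, molecular size, collision rate) are assumed textbook values for air, not Borel's own figures. Each collision multiplies a small error in a molecule's direction by roughly the ratio of the mean free path to the molecular diameter, so even an initial error of 10⁻¹⁰⁰ grows to order one within a few dozen collisions, that is, in a tiny fraction of a second.

# Order-of-magnitude sketch of error amplification in a gas (assumed values only).
import math

mean_free_path = 1e-7    # m, assumed typical value for air
molecular_size = 3e-10   # m, assumed molecular diameter
collision_rate = 5e9     # collisions per second per molecule, assumed

amplification = mean_free_path / molecular_size   # error growth factor per collision
initial_error = 1e-100                             # Borel's figure for the initial error

n_collisions = math.log(1.0 / initial_error) / math.log(amplification)
time_scale = n_collisions / collision_rate

print(f"error grows ~{amplification:.0f}x per collision")
print(f"prediction lost after ~{n_collisions:.0f} collisions, ~{time_scale:.1e} s")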
