Entropy, Information, and Probability

For over sixty years, since I first read Arthur Stanley Eddington's The Nature of the Physical World, I have been struggling, as the "information philosopher," to understand and to find simpler ways to explain the concept of entropy.

Even more challenging has been to find the best way to teach the mathematical and philosophical connections between entropy and information. A great deal of the literature complains that these concepts are too difficult to understand and may have nothing to do with one another.

Finally, there is the important concept of probability, with its implication of possibilities, one or more of which may become an actuality.

Determinist philosophers (perhaps a majority) and scientists (a strong minority) say that we use probability and statistics only because our finite minds are incapable of understanding reality. Their idea of the universe is that it contains infinite information, which only an infinite mind could comprehend. Our observable universe, by contrast, contains a very large but finite amount of information.

We believe that Rudolf Clausius, who first defined and named entropy, gave it the symbol S in honor of Sadi Carnot, whose study of heat-engine efficiency showed that some fraction of the available energy is always wasted or dissipated; only a part can be converted to mechanical work. Entropy is a measure of that lost energy.

A very strong connection between entropy and probability is obvious, because Ludwig Boltzmann's formula for entropy, S = k log W, where W stands for Wahrscheinlichkeit, the German word for probability, is mathematically identical to Claude Shannon's expression for information I, apart from the constant k, which gives entropy its physical dimensions.

Boltzmann entropy: S = -k ∑ pᵢ ln pᵢ.        Shannon information: I = - ∑ pᵢ ln pᵢ.

Boltzmann entropy and Shannon entropy have different dimensions (S is measured in joules per kelvin, while I is dimensionless, measured in "bits" when the logarithm is taken to base 2), but they share the "mathematical isomorphism" of a sum over logarithms of probabilities, which is also the basis of Boltzmann's and Gibbs's statistical mechanics.
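
To make the isomorphism concrete, here is a minimal Python sketch; the function names, constants chosen, and example probabilities are my own illustration, not from the text. The same sum over -pᵢ ln pᵢ gives Shannon information in bits when divided by ln 2, and thermodynamic entropy in joules per kelvin when multiplied by Boltzmann's constant k.

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def neg_sum_p_ln_p(probs):
    """The shared mathematical core of both formulas: -sum(p * ln p)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def shannon_information_bits(probs):
    """Shannon information I, dimensionless, in bits (logarithm base 2)."""
    return neg_sum_p_ln_p(probs) / math.log(2)

def boltzmann_entropy(probs):
    """Statistical-mechanical entropy S, in joules per kelvin."""
    return K_BOLTZMANN * neg_sum_p_ln_p(probs)

probs = [0.5, 0.25, 0.25]               # an arbitrary example distribution
print(shannon_information_bits(probs))  # 1.5 bits
print(boltzmann_entropy(probs))         # about 1.4e-23 J/K
```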

Boltzmann's entropy is material, Shannon's is mathematical: indeed, purely immaterial information.

But they have deeply important connections which information philosophy must sort out and explain.

First, both the Boltzmann and the Shannon expressions contain probabilities and statistics. Many philosophers and scientists deny any ontological indeterminism, such as the chance in quantum mechanics discovered by Albert Einstein in 1916, though they may accept an epistemological uncertainty, as proposed by Werner Heisenberg in 1927.

Today many thinkers propose chaos and complexity theories (both of which are completely deterministic) as adequate explanations, while denying ontological chance. Ontological chance is the basis for creating any information structure. It explains the variation in species needed for Darwinian evolution. It underlies human freedom and the creation of new ideas.

In statistical mechanics, the summation ∑ is over all the possible distributions of gas particles in a container. If the number of distributions is W, and all distributions are equally probable, the pᵢ are all equal to 1/W and entropy is maximal: S = -k ∑ (1/W) ln (1/W) = -k ln (1/W) = k ln W.
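
As a quick sanity check, the following sketch (the variable names and the value of W are arbitrary choices of mine, not from the text) verifies numerically that the uniform distribution pᵢ = 1/W reduces the sum to k ln W.

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K
W = 1_000_000      # number of equally probable distributions (arbitrary)

# Summing -k * (1/W) ln(1/W) over all W distributions...
S_from_sum = -k * sum((1 / W) * math.log(1 / W) for _ in range(W))
# ...gives the same value as Boltzmann's S = k ln W.
S_direct = k * math.log(W)

print(S_from_sum)  # about 1.9e-22 J/K
print(S_direct)    # the same value, to floating-point precision
```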

In the communication of information, W is the number of possible messages. If all messages are equally probable, the pᵢ are all equal to 1/W and I = ln W, or log₂ W bits. Receiving one of N equally probable messages thus communicates log₂ N bits of information.

On the other hand, if there is only one possible message, its probability is unity, and the information content is -1 · ln 1 = 0.
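
The same point in a short sketch (the function name and the message counts are illustrative assumptions of mine): one of W equally probable messages carries log₂ W bits, and a single inevitable message carries none.

```python
import math

def information_bits(num_messages: int) -> float:
    """Bits per received message when all num_messages are equally probable.

    Equivalent to -sum(p * log2 p) with p = 1/num_messages for each message.
    """
    return math.log2(num_messages)

print(information_bits(8))  # 3.0 bits: one of 8 equally likely messages
print(information_bits(1))  # 0.0 bits: only one possible message, no news
```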

If there is only one possible message, no new information is communicated. This is the case in a deterministic universe, where past events completely cause present events. The information in a deterministic universe is a constant of nature. Religions that include an omniscient god often believe all that information is in God's mind.

Note that if there are no alternative possible messages, Shannon (following his Bell Labs colleague Ralph Hartley) says there can be no new information. We conclude that the creation of new information structures in the universe is possible only because the universe is in fact indeterministic and our futures are open and free.

Thermodynamic entropy involves matter and energy, while Shannon entropy is entirely mathematical: on one level it is purely immaterial information, though it cannot exist without "negative" thermodynamic entropy.

It is true that information is neither matter nor energy, which are conserved constants of nature (the first law of thermodynamics). But information needs matter to be embodied in an "information structure." And it needs ("free") energy to be communicated over Shannon's information channels.

Boltzmann entropy is intrinsically related to "negative entropy." Without pockets of negative entropy in the universe (and out-of-equilibrium free-energy flows), there would be no "information structures" anywhere.

Pockets of negative entropy are involved in the creation of everything interesting in the universe. It is a cosmic creation process without a creator.

Visualizing Information

There is a mistaken idea in statistical physics that any particular distribution or arrangement of material particles has exactly the same information content as any other distribution. This is an anachronism from nineteenth-century deterministic statistical physics.

[Figure: hemoglobin | diffusing | completely mixed gas]

If we measure the positions in phase space of all the atoms in a hemoglobin protein, we get a certain number of bits of data (the x, y, z positions and the vx, vy, vz velocities). If the chemical bonds are all broken, allowing the atoms to diffuse, or the atoms are completely randomized into an equilibrium gas with maximum entropy, we get different values, but the same amount of data. Does this mean that any particular distribution has exactly the same information?

This has led many statistical physicists to claim that the information in a gas is the same wherever the particles are: macroscopic information is not lost, it just becomes microscopic information that could be completely recovered if the motions of every particle were reversed. Such reversibility would allow all the gas particles to go back inside the bottle.

But the information content of the hemoglobin is much higher and its disorder (entropy) near zero. A human being is not just a "bag of chemicals," despite the claim of plant biologist Anthony Cashmore. Each atom in hemoglobin is not merely somewhere in a volume limited by the uncertainty principle (of order ℏ³); it is in a specific quantum cooperative relationship with its neighbors that supports its biological function. These precise positional relationships make a passive linear polypeptide chain into a properly folded, biologically active protein. Breaking all these quantum chemical bonds destroys life.

Where an information structure is present, the entropy is low and Gibbs free energy is high.

When gas particles can go anywhere in a container, the number of possible distributions is enormous and entropy is maximal. When atoms are bound to others in the hemoglobin structure, the number of possible distributions is essentially 1, and the logarithm of 1 is 0!
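
A toy calculation (my own illustration; the numbers are arbitrary) makes the contrast explicit: a particle free to occupy any of an enormous number of arrangements has entropy k ln W, while an atom locked into essentially one arrangement contributes k ln 1 = 0.

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(W: float) -> float:
    """S = k ln W for W equally probable arrangements."""
    return k * math.log(W)

print(entropy(1e20))  # a free gas particle: large (per-particle) entropy
print(entropy(1))     # an atom bound into one arrangement: k ln 1 = 0.0
```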

Even more important, the parts of every living thing are communicating information - signaling - to other parts, near and far, as well as to other living things. Information communication allows each living thing to maintain itself in a state of homeostasis, balancing all matter and energy entering and leaving, maintaining all vital functions. Statistical physics and chemical thermodynamics know nothing of this biological information.
