

Philosophers

Mortimer Adler
Rogers Albritton
Alexander of Aphrodisias
Samuel Alexander
William Alston
Anaximander
G.E.M.Anscombe
Anselm
Louise Antony
Thomas Aquinas
Aristotle
David Armstrong
Harald Atmanspacher
Robert Audi
Augustine
J.L.Austin
A.J.Ayer
Alexander Bain
Mark Balaguer
Jeffrey Barrett
William Barrett
William Belsham
Henri Bergson
George Berkeley
Isaiah Berlin
Richard J. Bernstein
Bernard Berofsky
Robert Bishop
Max Black
Susanne Bobzien
Emil du Bois-Reymond
Hilary Bok
Laurence BonJour
George Boole
Émile Boutroux
Daniel Boyd
F.H.Bradley
C.D.Broad
Michael Burke
Jeremy Butterfield
Lawrence Cahoone
C.A.Campbell
Joseph Keim Campbell
Rudolf Carnap
Carneades
Nancy Cartwright
Gregg Caruso
Ernst Cassirer
David Chalmers
Roderick Chisholm
Chrysippus
Cicero
Tom Clark
Randolph Clarke
Samuel Clarke
Anthony Collins
Antonella Corradini
Diodorus Cronus
Jonathan Dancy
Donald Davidson
Mario De Caro
Democritus
Daniel Dennett
Jacques Derrida
René Descartes
Richard Double
Fred Dretske
John Dupré
John Earman
Laura Waddell Ekstrom
Epictetus
Epicurus
Austin Farrer
Herbert Feigl
Arthur Fine
John Martin Fischer
Frederic Fitch
Owen Flanagan
Luciano Floridi
Philippa Foot
Alfred Fouillée
Harry Frankfurt
Richard L. Franklin
Bas van Fraassen
Michael Frede
Gottlob Frege
Peter Geach
Edmund Gettier
Carl Ginet
Alvin Goldman
Gorgias
Nicholas St. John Green
H.Paul Grice
Ian Hacking
Ishtiyaque Haji
Stuart Hampshire
W.F.R.Hardie
Sam Harris
William Hasker
R.M.Hare
Georg W.F. Hegel
Martin Heidegger
Heraclitus
R.E.Hobart
Thomas Hobbes
David Hodgson
Shadworth Hodgson
Baron d'Holbach
Ted Honderich
Pamela Huby
David Hume
Ferenc Huoranszki
Frank Jackson
William James
Lord Kames
Robert Kane
Immanuel Kant
Tomis Kapitan
Walter Kaufmann
Jaegwon Kim
William King
Hilary Kornblith
Christine Korsgaard
Saul Kripke
Thomas Kuhn
Andrea Lavazza
Christoph Lehner
Keith Lehrer
Gottfried Leibniz
Jules Lequyer
Leucippus
Michael Levin
Joseph Levine
George Henry Lewes
C.I.Lewis
David Lewis
Peter Lipton
C. Lloyd Morgan
John Locke
Michael Lockwood
Arthur O. Lovejoy
E. Jonathan Lowe
John R. Lucas
Lucretius
Alasdair MacIntyre
Ruth Barcan Marcus
Tim Maudlin
James Martineau
Nicholas Maxwell
Storrs McCall
Hugh McCann
Colin McGinn
Michael McKenna
Brian McLaughlin
John McTaggart
Paul E. Meehl
Uwe Meixner
Alfred Mele
Trenton Merricks
John Stuart Mill
Dickinson Miller
G.E.Moore
Thomas Nagel
Otto Neurath
Friedrich Nietzsche
John Norton
P.H.Nowell-Smith
Robert Nozick
William of Ockham
Timothy O'Connor
Parmenides
David F. Pears
Charles Sanders Peirce
Derk Pereboom
Steven Pinker
U.T.Place
Plato
Karl Popper
Porphyry
Huw Price
H.A.Prichard
Protagoras
Hilary Putnam
Willard van Orman Quine
Frank Ramsey
Ayn Rand
Michael Rea
Thomas Reid
Charles Renouvier
Nicholas Rescher
C.W.Rietdijk
Richard Rorty
Josiah Royce
Bertrand Russell
Paul Russell
Gilbert Ryle
Jean-Paul Sartre
Kenneth Sayre
T.M.Scanlon
Moritz Schlick
John Duns Scotus
Arthur Schopenhauer
John Searle
Wilfrid Sellars
David Shiang
Alan Sidelle
Ted Sider
Henry Sidgwick
Walter Sinnott-Armstrong
Peter Slezak
J.J.C.Smart
Saul Smilansky
Michael Smith
Baruch Spinoza
L. Susan Stebbing
Isabelle Stengers
George F. Stout
Galen Strawson
Peter Strawson
Eleonore Stump
Francisco Suárez
Richard Taylor
Kevin Timpe
Mark Twain
Peter Unger
Peter van Inwagen
Manuel Vargas
John Venn
Kadri Vihvelin
Voltaire
G.H. von Wright
David Foster Wallace
R. Jay Wallace
W.G.Ward
Ted Warfield
Roy Weatherford
C.F. von Weizsäcker
William Whewell
Alfred North Whitehead
David Widerker
David Wiggins
Bernard Williams
Timothy Williamson
Ludwig Wittgenstein
Susan Wolf

Scientists

David Albert
Michael Arbib
Walter Baade
Bernard Baars
Jeffrey Bada
Leslie Ballentine
Marcello Barbieri
Gregory Bateson
Horace Barlow
John S. Bell
Mara Beller
Charles Bennett
Ludwig von Bertalanffy
Susan Blackmore
Margaret Boden
David Bohm
Niels Bohr
Ludwig Boltzmann
Emile Borel
Max Born
Satyendra Nath Bose
Walther Bothe
Jean Bricmont
Hans Briegel
Leon Brillouin
Stephen Brush
Henry Thomas Buckle
S. H. Burbury
Melvin Calvin
Donald Campbell
Sadi Carnot
Anthony Cashmore
Eric Chaisson
Gregory Chaitin
Jean-Pierre Changeux
Rudolf Clausius
Arthur Holly Compton
John Conway
Jerry Coyne
John Cramer
Francis Crick
E. P. Culverwell
Antonio Damasio
Olivier Darrigol
Charles Darwin
Richard Dawkins
Terrence Deacon
Lüder Deecke
Richard Dedekind
Louis de Broglie
Stanislas Dehaene
Max Delbrück
Abraham de Moivre
Bernard d'Espagnat
Paul Dirac
Hans Driesch
John Eccles
Arthur Stanley Eddington
Gerald Edelman
Paul Ehrenfest
Manfred Eigen
Albert Einstein
George F. R. Ellis
Hugh Everett, III
Franz Exner
Richard Feynman
R. A. Fisher
David Foster
Joseph Fourier
Philipp Frank
Steven Frautschi
Edward Fredkin
Augustin-Jean Fresnel
Benjamin Gal-Or
Howard Gardner
Lila Gatlin
Michael Gazzaniga
Nicholas Georgescu-Roegen
GianCarlo Ghirardi
J. Willard Gibbs
James J. Gibson
Nicolas Gisin
Paul Glimcher
Thomas Gold
A. O. Gomes
Brian Goodwin
Joshua Greene
Dirk ter Haar
Jacques Hadamard
Mark Hadley
Patrick Haggard
J. B. S. Haldane
Stuart Hameroff
Augustin Hamon
Sam Harris
Ralph Hartley
Hyman Hartman
Jeff Hawkins
John-Dylan Haynes
Donald Hebb
Martin Heisenberg
Werner Heisenberg
John Herschel
Basil Hiley
Art Hobson
Jesper Hoffmeyer
Don Howard
John H. Jackson
William Stanley Jevons
Roman Jakobson
E. T. Jaynes
Pascual Jordan
Eric Kandel
Ruth E. Kastner
Stuart Kauffman
Martin J. Klein
William R. Klemm
Christof Koch
Simon Kochen
Hans Kornhuber
Stephen Kosslyn
Daniel Koshland
Ladislav Kováč
Leopold Kronecker
Rolf Landauer
Alfred Landé
Pierre-Simon Laplace
Karl Lashley
David Layzer
Joseph LeDoux
Gerald Lettvin
Gilbert Lewis
Benjamin Libet
David Lindley
Seth Lloyd
Werner Loewenstein
Hendrik Lorentz
Josef Loschmidt
Alfred Lotka
Ernst Mach
Donald MacKay
Henry Margenau
Owen Maroney
David Marr
Humberto Maturana
James Clerk Maxwell
Ernst Mayr
John McCarthy
Warren McCulloch
N. David Mermin
George Miller
Stanley Miller
Ulrich Mohrhoff
Jacques Monod
Vernon Mountcastle
Emmy Noether
Donald Norman
Travis Norsen
Alexander Oparin
Abraham Pais
Howard Pattee
Wolfgang Pauli
Massimo Pauri
Wilder Penfield
Roger Penrose
Steven Pinker
Colin Pittendrigh
Walter Pitts
Max Planck
Susan Pockett
Henri Poincaré
Daniel Pollen
Ilya Prigogine
Hans Primas
Zenon Pylyshyn
Henry Quastler
Adolphe Quételet
Pasco Rakic
Nicolas Rashevsky
Lord Rayleigh
Frederick Reif
Jürgen Renn
Giacomo Rizzolatti
A.A. Roback
Emil Roduner
Juan Roederer
Jerome Rothstein
David Ruelle
David Rumelhart
Robert Sapolsky
Tilman Sauer
Ferdinand de Saussure
Jürgen Schmidhuber
Erwin Schrödinger
Aaron Schurger
Sebastian Seung
Thomas Sebeok
Franco Selleri
Claude Shannon
Charles Sherrington
Abner Shimony
Herbert Simon
Dean Keith Simonton
Edmund Sinnott
B. F. Skinner
Lee Smolin
Ray Solomonoff
Roger Sperry
John Stachel
Henry Stapp
Tom Stonier
Antoine Suarez
Leo Szilard
Max Tegmark
Teilhard de Chardin
Libb Thims
William Thomson (Kelvin)
Richard Tolman
Giulio Tononi
Peter Tse
Alan Turing
C. S. Unnikrishnan
Francisco Varela
Vlatko Vedral
Vladimir Vernadsky
Mikhail Volkenstein
Heinz von Foerster
Richard von Mises
John von Neumann
Jakob von Uexküll
C. H. Waddington
John B. Watson
Daniel Wegner
Steven Weinberg
Paul A. Weiss
Hermann Weyl
John Wheeler
Jeffrey Wicken
Wilhelm Wien
Norbert Wiener
Eugene Wigner
E. O. Wilson
Günther Witzany
Stephen Wolfram
H. Dieter Zeh
Semir Zeki
Ernst Zermelo
Wojciech Zurek
Konrad Zuse
Fritz Zwicky

Presentations

Biosemiotics
Free Will
Mental Causation
James Symposium
 
The Information Philosopher

Watch Our 50 I-Phi Lectures

See Bob's talk on the Brain in Olivier Wright's PSI project on Free Will

Read Our I-Phi Books

See Bob's iTV-Studio Design

Bob's Desktop Video Group
has been "helping communities communicate" for forty years

What is Information? Follow the blue hyperlink or look in the Introduction menu above. How is information created?

Answering this most profound question in philosophy and in science gives us plausible answers to some of the "great questions" of all time about the nature of reality.

The answer tells us how information structures initially formed and how today they are being continuously created.

Information philosophy has shown that novelty in the universe ("something new under the sun") requires a temporal process that depends first on the existence of new possibilities and then second on the selection or choice of one actual outcome.

This irreversible temporal process decreases the physical entropy locally, requiring a compensating increase in global entropy to satisfy the second law of thermodynamics.

There is a deep relationship between immaterial information and material entropy that John Wheeler calls "It from Bit."

These two-step or two-stage temporal processes explain not only the cosmic creation process, but also at least five other great problems in science and in philosophy.

They include the two-step process of biological evolution (chance variations or mutations in the genetic code, followed by natural selection of those with greater reproductive success) and the two-stage model of human free will (first, random alternative possibilities; then an adequately determined practical or moral choice that makes one of them actual).

Claude Shannon's theory of the communication of information also involves these two steps or stages (see the Shannon principle). The amount of information communicated depends on the number of possible messages. With eight possible messages, Shannon says one actual message communicates three bits of information (2³ = 8).
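Shannon's count can be sketched in a couple of lines of Python (my illustration, not Shannon's own notation): the information in one actual message selected from equally likely possibilities is the base-2 logarithm of the number of possible messages.

```python
import math

def bits_of_information(possible_messages: int) -> float:
    """Shannon information of one actual message chosen from
    equally likely possible messages: log2 of their number."""
    return math.log2(possible_messages)

print(bits_of_information(8))   # 8 possible messages -> 3.0 bits
print(bits_of_information(2))   # a coin flip -> 1.0 bit
```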

The "weird" phenomenon of entanglement does not communicate information faster than light, as is often mistakenly thought. It causes two widely separated quantum particles, each with two random possible states, to coordinate their collapse into two actual, perfectly correlated states.

Our model of the mind as an experience recorder and reproducer stores information in random possible synapses of neural networks and recalls actual memories from these wired Hebbian assemblies.

These six examples (and possibly many more) of solving problems with information creation are why I call myself the information philosopher and encourage others to become information philosophers.

You may need years of graduate and post-graduate education beyond my introductory material to help solve these problems. But as I go "beyond language and logic" (using both of course) in these few thousand web pages and in my books, I hope to be around for a few more years to answer your questions about information philosophy. You can email me at bobdoyle@informationphilosopher.com.

Note that just as language philosophy is not the philosophy of language, information philosophy is not the philosophy of information. Luciano Floridi is a philosopher of information. He studies the ethical use of information technology, including the spread of misinformation and disinformation.

I use immaterial information and material information structures to explain problems in philosophy and physics. My training is in astrophysics and quantum physics, though all my life I've been reading the works of philosophers in the left navigation column. My description in the 1954 Classical High School yearbook noted "of science and logic he chatters."

I understand the quantum wave function Ψ as pure information. It is the mathematical solution to Erwin Schrödinger's wave equation. Understanding it as pure information can help to clarify what Richard Feynman called the one and only mystery in quantum mechanics.

Understanding the quantum wave function Ψ as pure information will help us to develop an information interpretation of quantum mechanics.

The square of the wave function, Ψ², gives us the exact probabilities of the different possibilities, which are objectively real, if immaterial. These multiple possibilities for a property of a quantum object when it is observed and measured bothered Albert Einstein, who wanted physical properties to exist independently of observations (they don't) and to be determined by physical conditions in the immediate locality of the object, not "influenced" by objects he thought were widely separated (they are). He called this "spooky action at a distance."

As Feynman put the one mystery of the two-slit experiment,

"The question now is, how does it really work? What machinery is actually producing this thing? Nobody knows any machinery."

As I see it, the one mystery is how quantum waves of abstract and immaterial information going through the two slits can cause the motions and create the properties of concrete and material particles landing on the distant screen. Understanding this mystery will better explain both Feynman's two-slit experiment and Einstein's entanglement.

An information explanation of the cosmic creation process shows how the expansion of the universe opened up new possibilities for different possible futures. My cosmological work is based on suggestions made by Arthur Stanley Eddington in the 1930's and by my Harvard colleague David Layzer in the 1970's.

Multiple possible messages correspond to multiple possible futures. If there is only one possibility, there is only one possible future. Some scientists (e.g., Seth Lloyd) think that the total information in the universe is a conserved quantity, just like the conservation of energy and matter. The "block universe" of special relativity and four-dimensional space-time is interpreted by some as the one possible future that is "already out there."

This flawed idea of a fixed amount of information in the universe supports the idea of Laplace's demon, a super-intelligent being who knows the positions and velocities of all the particles in the universe, one who could use Newton's laws of classical mechanics to know all the past and future of the universe. Such a universe is known as deterministic, pre-determined by the information at the start of the universe, or pre-ordained by an agent who created the universe.

This conservation of total information since time zero also supports the much older idea of an omniscient and omnipotent God with foreknowledge of the future, which threatens the idea of human free will. Logically speaking, a god cannot be both omniscient and omnipotent.

In our study of how Albert Einstein invented most of quantum mechanics a decade before Werner Heisenberg, we showed that Einstein saw the existence of ontological chance whenever electromagnetic radiation interacts with atoms and molecules. This means that many future events, like Aristotle's famous "sea battle," are irreducibly contingent. Future events cannot be known until that future time when they either do or do not occur. The statement "the sea-battle will occur tomorrow" is neither true nor false, challenging Aristotle's bivalent logic and the "excluded middle." A contingent future means an omniscient being cannot exist.

The indeterminism of quantum mechanics invalidates the idea of physical determinism as well as the idea of an omniscient being. Our work on free will limits indeterminism to the first "free" stage, where it helps to generate alternative possibilities (new thoughts), and our model requires an adequate determinism in the second "will" stage, to ensure that our actions are caused by our motives, desires, and feelings. First "free", then "will."

The great scientist and philosopher of biology Ernst Mayr described evolution as a "two-step process" involving chance.

A Two-Step Process
In his 1988 Toward a New Philosophy of Biology, Mayr wrote...
Evolutionary change in every generation is a two-step process: the production of genetically unique new individuals and the selection of the progenitors of the next generation. The important role of chance at the first step, the production of variability, is universally acknowledged, but the second step, natural selection, is on the whole viewed rather deterministically: Selection is a non-chance process.

A Two-Stage Model
The great philosopher of mind William James described free will as what I have called a "two-stage" process. He wrote...

And I can easily show...that as a matter of fact the new conceptions, emotions, and active tendencies which evolve are originally produced in the shape of random images, fancies, accidental out-births of spontaneous variation in the functional activity of the excessively instable human brain, which the outer environment simply confirms or refutes, adopts or rejects, preserves or destroys, - selects, in short, just as it selects morphological and social variations due to molecular accidents of an analogous sort.

In my 2011 book Free Will, I report over two dozen other philosophers and scientists who independently invented this two-stage model of free will, both before and after my own independent idea as a graduate student at Harvard in the 1970's.

A "Two-Bit" Information Explanation for Quantum Entanglement
We examine what information is being created by entanglement and whether and how it is communicated, whether it is meaningful, and if the information is valuable. We also hope to debunk various extravagant claims about quantum weirdness.

Today's standard entanglement studies are based on an experimental apparatus suggested in 1952 by David Bohm to explore the possibility of "hidden variables" that could explain Albert Einstein's lifelong concern about "spooky actions at a distance" and the so-called Einstein-Podolsky-Rosen paradox of 1935.

Bohm's "hidden variable" proposal described a

"molecule of total spin zero consisting of two atoms, each of spin one-half. The two atoms are then separated by a method that does not influence the total spin. After they have separated enough so that they cease to interact, any desired component of the spin of the first particle (A) is measured. Then, because the total spin is still zero, it can immediately be concluded that the same component of the spin of the other particle (B) is opposite to that of A."

A possible experimental apparatus to realize Bohm's proposal emits entangled particles in opposite directions. The particles could be Bohm's atoms, or simply electrons, heading toward Stern-Gerlach devices that measure electron spins as up or down. An experiment with material particles was never realized. Today experiments are done with photons and polarizers measuring the photon spin direction. But the quantum mechanics is the same.

The spin of an atom can be up or down. An S-G device measurement provides a single bit of information, 1 or 0. We traditionally give the experimenters measuring particles A and B the names Alice and Bob. When Alice observes an up particle, she gets the bit 1. Instantly, Bob observes a down particle and gets the bit 0. These are well-established experimental facts.

Once the two particles have separated to a great distance, the naive theory is that one bit of information about Alice's up particle must have been "communicated" to Bob's particle so it can quickly adjust its spin to down. Or more simply, that a "hidden variable" travels at faster than light speed to "act" on particle B, causing it to have spin down. These theories are flawed, leading to claims that entanglement "connects" everything instantly in a "holistic" universe.

My Ph.D. thesis at Harvard in the 1960's solved the Schrödinger equation for the wave function Ψ12 of a two-atom hydrogen molecule, exactly what David Bohm proposed in the 1950's to explain "hidden variables." This provided me with a great insight into entanglement.

Before a measurement, quantum objects like hydrogen atoms, electrons, or photons have two possible spin states. A measurement makes one of the two possible states an actual state. The founders of quantum mechanics described this process in terms that remain controversial a century later. Werner Heisenberg said the new state is indeterministic, with possibilities, one of which comes randomly into "existence." Einstein hoped for a return to classical deterministic physics, in which objects have a "real" existence before we measure them.

The quantum mechanical wave function describing a single-particle spin is a linear combination (also called a superposition) of an up state [↑] and a down state [↓].

Ψ = (1/√2) [↑] ± (1/√2) [↓]

The (1/√2) coefficients are squared to give us the probability 1/2 of the particle being found in either the up or down state. Half the time we find [↑], the other half [↓].

A measurement is said to "select" one of the states, or to "project" a particle into one state. The wave-function Ψ is said to "collapse" into one of the states. It is another two-step or two-stage process, first possibilities, then selection of one to be actual.
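This two-step collapse can be simulated classically (a sketch only; a pseudorandom generator stands in for genuine quantum randomness): square the (1/√2) amplitudes to get the probabilities, then select one actual outcome.

```python
import random

AMPLITUDE = 1 / 2 ** 0.5    # the (1/√2) coefficient of each possible state
P_UP = AMPLITUDE ** 2       # squared amplitude gives probability ~1/2

def measure() -> str:
    """Select one actual state from the two possibilities."""
    return "up" if random.random() < P_UP else "down"

trials = 100_000
ups = sum(measure() == "up" for _ in range(trials))
print(ups / trials)         # close to 0.5: half the time up, half down
```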

The apparatus entangles two quantum particles, so we need a two-particle quantum wave function Ψ12, and it's a bit more complicated.

Ψ12 = (1/√2) [ ↑ (1) ↓ (2) - ↓ (1) ↑ (2) ]     

We can simplify the notation

| ψ12 > = (1/√2) | ↑↓ > - (1/√2) | ↓↑ >

Instead of one particle found randomly in an up or down state, we find the two particles half the time in an up-down state and the other half in a down-up state.

Note that each individual particle is still randomly in an up or down state, but the joint condition of opposite spin states is not random at all! The states are certain to be perfectly correlated. Quantum mechanics exactly predicts the outcomes of entanglement experiments. But is this enough of an explanation? Can we find a causal description of what's going on?
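The joint statistics can be sketched in a few lines (an illustrative simulation of the predicted outcomes, not a hidden-variable model): each joint outcome is random, yet the two results are always opposite.

```python
import random

def measure_entangled_pair() -> tuple:
    """One joint measurement of the two-particle state Ψ12: up-down and
    down-up each occur with probability 1/2; the two spins are always
    opposite, conserving total spin zero."""
    return ("up", "down") if random.random() < 0.5 else ("down", "up")

for _ in range(1000):
    alice, bob = measure_entangled_pair()
    assert alice != bob   # perfectly anti-correlated...
# ...even though each observer's own sequence of results looks random
```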

The apparatus entangling the two particles sits in the center between Alice and Bob's measuring devices. It is also in the past light cone of their measurements so is the most likely candidate for a contributing cause. We call it the causal center (CC).

Alice ← Causal Center → Bob

Our model of a "common cause" explaining entanglement is a causal chain of events that begins with the local causes in the entangling apparatus that establishes the initial symmetry of the entangled particles. The particles are put into a spherically symmetric two-particle quantum state Ψ12 that Erwin Schrödinger said cannot be represented as a product of two independent single-particle states Ψ1 and Ψ2.

The causal center sends the particles off in the spherically symmetric state with total spin zero to Alice and Bob. This total spin zero is conserved as a constant of the motion.

The final causal interactions in our "causal chain" of events are the two measurements by Alice and Bob, which create two bits of digital information. As long as the local measurements are made at the same pre-agreed angle, their planar symmetry will maintain total spin zero: the particles will have correlated opposite spin states, up-down or down-up, while the individual spin states remain randomly up or down. Should the measurements at A and B differ by an angle θ, the perfect correlations are reduced by cos²θ, as predicted by quantum mechanics.
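The cos²θ dependence is easy to tabulate (a sketch of the quantum-mechanical prediction, not a derivation):

```python
import math

def correlation_probability(theta: float) -> float:
    """Probability that Alice's and Bob's results remain opposite when
    their measurement angles differ by theta (in radians): cos²θ."""
    return math.cos(theta) ** 2

print(correlation_probability(0.0))          # same angle -> 1.0, perfect correlation
print(correlation_probability(math.pi / 3))  # 60 degrees -> ~0.25
print(correlation_probability(math.pi / 2))  # orthogonal -> ~0.0
```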

When Alice measures her quantum particle, let's say up, the two-particle wave function Ψ12 collapses and instantly her distant partner Bob's particle is projected into the correlated down state, conserving total spin. This is not because any actionable information is communicated from Alice to Bob. There is no "spooky action at a distance."

The two bits of information that "came into existence" were not present in the entangled particles as they traveled to Alice and Bob. They were created by the two measurements.

Does entanglement have something to do with the one mystery of quantum mechanics, how the abstract wave function can predict the locations or properties of concrete material particles without any causal power, without any known machinery (as Richard Feynman said), to move them? Yes, but there's more to it. Here there's a conservation principle at work.

No information at all is "communicated" faster than light between Alice and Bob as naive theories about entanglement suggest. Nor is information traveling faster than light from the causal center to Alice and Bob, carried by the two entangled particles. If the particles are photons, they travel at the speed of light; if electrons or atoms, then below light speed.

The entangling apparatus at the causal center sends the entangled particles off to Alice and Bob, each particle capable of producing one of the two bits of information. Alice and Bob share this information or knowledge "at a distance." But because each of their sequences of successive bits is quantum random, there is no "communication" of meaningful information going between Alice and Bob.

The popular science-fiction idea that entangled particles connect everything in the universe, that telepathic communication of meaningful messages is being exchanged between different galaxies, even galaxies billions of light years away, is simply nonsense.

Without a common cause coming from a causal center between Alice in Andromeda and Bob (me!) in the Milky Way, no "entangling" cosmic connection is possible!

The sad truth about quantum entanglement is that there is nothing at all going from Alice to Bob or from Bob to Alice. Even sadder, what is going from the common cause in the center to both Alice and Bob is a string of perfectly random bits, carrying exactly zero information!

Despite these random bit strings individually containing no information, their perfect correlation turns out to be extremely valuable for encrypting and decrypting secure communications between Alice and Bob later over ordinary communication channels!

Quantum Cryptography and Quantum Key Distribution.
Alice's measurement sequences appear to her to be completely random, with approximately equal numbers of 1's and 0's, approaching equality for longer bit sequences, like this.

00010011011110101100011011000001

And Bob's sequence looks to him to be equally random, with 1's and 0's approaching 50/50.

11101100100001010011100100111110

Should Alice send her bit sequence to Bob (over ordinary channels) for comparison, he finds, when he lines the bit strings up with one another, that they are perfectly anti-correlated. Where Alice measured a 1, Bob measures a 0, and vice versa. This is explained by the condition of conserved total spin coming from the causal center, which is why I call it a common cause.

00010011011110101100011011000001
11101100100001010011100100111110

Although they contain no information, these random but perfectly correlated bit sequences are perfect for use as a one-time pad or "key" for encrypting coded messages. And the sequences have not been "communicated" or "distributed" over an ordinary communication channel. They have been created independently and locally at Alice and Bob in a secure way that is invulnerable to eavesdroppers, solving the problem of quantum key distribution (QKD).
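The one-time-pad use of these keys can be sketched directly with the bit strings above (the 32-bit plaintext below is my own arbitrary example). Since the keys are anti-correlated, Bob simply flips every bit of his key to reconstruct Alice's.

```python
alice_key = "00010011011110101100011011000001"   # Alice's local measurements
bob_key   = "11101100100001010011100100111110"   # Bob's local measurements

def xor_bits(a: str, b: str) -> str:
    """Bitwise XOR of two equal-length bit strings."""
    return "".join("1" if x != y else "0" for x, y in zip(a, b))

# Perfect anti-correlation: the two keys XOR to all 1s
assert xor_bits(alice_key, bob_key) == "1" * 32

message    = "01001000010010010010000100100001"   # arbitrary 32-bit plaintext
ciphertext = xor_bits(message, alice_key)          # Alice encrypts with her key

# Bob inverts his key to recover Alice's key, then decrypts
alice_key_at_bob = xor_bits(bob_key, "1" * 32)
assert xor_bits(ciphertext, alice_key_at_bob) == message
```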

As long as Alice and Bob measure at the same angle, the spherically symmetric wave function with total spin conserved gives them the symmetric result. If their angles differ by θ, their results vary as cos²θ, as experiments testing John Bell's inequalities have confirmed.

Beyond Logical Positivism and Language Philosophy to Information Philosophy

Answering deep philosophical questions with words and concepts alone has sadly been a failure. We need to get behind the words to the underlying information structures (material, linguistic, and mental) and to the information communication processes between structures, especially in living things.

Although analytic language philosophy is still widely taught, it has made little progress for decades. Language philosophers solve (or dis-solve!) problems in philosophy by an analysis of language, using verbal arguments with words and concepts. They discover (rediscover) the same ancient problems, forever republishing old concepts with new names and acronyms.

An information philosopher studies the origin and evolution of information structures, the foundations for all our ideas, solving the problem of knowledge.

Information philosophy is a dualist philosophy, both materialist and idealist. It is a correspondence theory, explaining how immaterial ideas represent material objects, especially in the brain and in the mind.

In a deterministic or "block" universe, information is constant. Logical and mathematical philosophers follow Gottfried Leibniz and Pierre-Simon Laplace, who said a super-intelligent being who knew the information at one instant would know all the past and future. They deny the obvious fact that new information can be and has been created.

An information structure is an object whose elementary particle components have been connected and arranged in an interesting way, as opposed to being dispersed at random throughout space like the molecules in a gas. Information philosophy explains who or what is doing the arranging.

A gas of microscopic material particles in equilibrium is in constant motion, the motion we call heat. But its macroscopic properties, like pressure, temperature, and volume, its total matter and energy, are unchanging. It is said to be in equilibrium and have maximum possible entropy, or disorder. It contains minimal, possibly zero, internal information, apart from information about the structures of the atoms and molecules.

When the second law of thermodynamics was discovered in the nineteenth century, physicists predicted that increasing entropy would destroy all information, and the universe would end in a "heat death." That is not happening.

Many philosophers, philosophers of science, and scientists themselves, still see deterministic "laws of nature" as models for their work. The great success of Newtonian mechanics inspires them to develop mechanical, material, and energetic explanations for biological and mental processes.

In recent decades some have gone beyond classical mechanical laws to explain evolution in terms of the laws of thermodynamics and the increase of complexity. Ilya Prigogine argued that non-equilibrium thermodynamics can bring "order out of chaos." But it takes more than non-equilibrium physics. It takes the expansion of the universe.

Information is neither matter nor energy, although it needs matter to be embodied and energy to be communicated. Why should it become the preferred basis for all philosophy?

As most everyone knows, matter and energy are conserved. This means there is just the same total amount of matter and energy today as there was at the origin of the universe.

But then what accounts for all the change that we see, the new things under the sun?
It is information, which is not conserved and has been increasing since the beginning of time, despite the second law of thermodynamics, with its increasing entropy, which in a closed universe destroys both order and information. But our universe is open and expanding! And new information is continuously being created!

The Cosmic Creation Process
Many philosophers and scientists mistakenly think the universe must have begun with a vast amount of information, so that the information structures we have now are left over after the increasing entropy of the second law has destroyed much of the primordial information.

But the physics of the early universe, famously the first three minutes according to Steven Weinberg, shows us a state near the maximum possible entropy for the earliest moments.

How can the universe have begun in equilibrium, near maximal disorder and minimal information, yet today be in the high state of information and order we see around us? This I have called the fundamental question of information philosophy.

The answer to this fundamental question was given to me by my Harvard colleague and mentor David Layzer in the 1970's. In short, it is the expansion of the universe, which continually increases the space available to the limited number of particles, giving them more room and more possibilities to arrange themselves into interesting information structures. This is the basis of a cosmic creation process for all interesting things.

Cosmic creation is only possible because the expansion of space increases faster than the gas particles can get back to equilibrium, making room for the growth of entropy and disorder, but also order and information.

The entropy of the early universe was maximal for its time - the universe was in equilibrium - but that maximum was tiny compared to the actual entropy today. And today's actual entropy is in turn smaller than today's maximum possible entropy, making room for lots more information structures (negative entropy).
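The entropy-gap idea above can be illustrated with a toy numerical sketch (a simplified illustration of our own, not Layzer's actual cosmological equations; the particle number, expansion rate, and relaxation rate are arbitrary assumptions): the maximum possible entropy grows with the expanding volume, while the actual entropy relaxes toward that moving maximum only at a finite rate.

```python
import math

# Toy model of the "growth of order" (assumed parameters, not real cosmology):
# S_max grows with the expanding volume, while the actual entropy S closes
# only a fixed fraction of the gap each step. The widening gap S_max - S
# is negative entropy - room for information structures.

N = 1000          # number of particles, fixed (matter is conserved)
V = 1.0           # volume in arbitrary units
S = N * math.log(V)        # start in equilibrium: S equals S_max

expansion_rate = 0.05      # fractional volume growth per step (assumed)
relaxation_rate = 0.02     # fraction of the gap closed per step (assumed)

gaps = []
for step in range(200):
    V *= (1 + expansion_rate)        # space expands
    S_max = N * math.log(V)          # equilibrium entropy for the new volume
    S += relaxation_rate * (S_max - S)   # entropy chases the moving maximum
    gaps.append(S_max - S)

# The gap grows every step: expansion keeps outrunning equilibration.
assert gaps[-1] > gaps[0] > 0
```

Because the volume grows exponentially while only a fixed fraction of the gap is closed each step, the gap between possible and actual entropy widens steadily, just as the paragraph above describes.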

I have now found that this powerful insight was first seen by Arthur Stanley Eddington in his 1934 book New Pathways in Science, where he said (p.68) "The expansion of the universe creates new possibilities of distribution faster than the atoms can work through them."

As pointed out to me by Layzer in 1975, Eddington's arrow of time (for him, the direction of entropy increase) points not only to increasing disorder (positive entropy) but also to increasing information (negative entropy).

At the earliest times, purely physical forces (electromagnetic, nuclear, and gravitational) changed the arrangement of the most fundamental particles of matter and energy, quarks, electrons, gluons, and photons, into information structures like atoms and molecules, and much later into planets, stars and galaxies.

Billions of years later, living things became active information structures. Living things control the flow of matter and energy through themselves and do their own arranging of their matter and energy constituents!

New immaterial information is forever emerging. We human beings are creating new ideas!

Purely physical objects like planets, stars, and galaxies are passive information structures, entirely controlled by fundamental physical forces - the strong and weak nuclear forces, electromagnetism, and gravitation. These objects do not control themselves.

Living things, you and I, are active dynamic growing information structures, forms through which matter and energy continuously flow. And the communication of biological information controls those flows!

Before life as we know it, some information structures blindly replicated their information. Some replications contained mistakes, most of them fatal, but very rarely a mistake was an improvement, bringing better reproductive success. In life today, those random errors produce some of the variations acted on by natural selection, which adapts living things to their environments.

Even the smallest living things develop behaviors, sensing information about and reacting to their environment, sending signals between their parts (cells, organelles) and to other living things nearby. These behaviors can be interpreted as intentions, goals, and agency, introducing purpose into the universe.

The goals and purposes of living things are not the "final goal" or purpose of Aristotle's Metaphysics that he called "telos" (τέλος). Teleology is the idea that there is a cosmic purpose that preceded the creation of the universe and which points toward an end goal. Teleology underlies many theologies, in which a creator God embodies the telos, just as a sculptor previsualizes the statue within a block of marble. Perhaps the best-known modern example is Teilhard de Chardin, whose end goal, which he called the "Omega Point," is Jesus Christ. In many religions, the creator predestines or predetermines all the events in the universe, a theological idea that fit well with the mechanical and deterministic laws of Nature discovered by Isaac Newton in the seventeenth century.

The biologist Colin Pittendrigh coined the term teleonomy to distinguish the purpose we see in all living things from a hypothetical teleological purpose from before the origin of the universe. Jacques Monod and Ernst Mayr also stressed the important distinction between teleonomy and teleology.

Information is the modern spirit, the ghost in the machine, the mind in the body. It is the soul. When we die, it is information that perishes, unless the future preserves it. The matter remains.

Information philosophers think that if we don't remember the past, we don't deserve to be remembered by the future. This is especially true for the custodians of knowledge.

In the natural sciences the most important references are usually the most recent. In the humanities and social sciences the opposite is often true. The earliest references were invented ideas that became traditional beliefs, now deeply held without further justification.

This website is not based on the work of a single thinker. It includes the work of over five hundred philosophers and scientists, critically analyzed over six decades by this information philosopher, with extensive quotations from the original thinkers and PDFs of major parts of their work (sometimes in the original language).

Information philosophy can explain the fundamental metaphysical connection between materialism and idealism. It replaces the determinism and metaphysical necessity of eliminative materialism and reductionist naturalism with metaphysical possibilities.

Unactualized possibilities exist in minds as immaterial ideas. They are the alternative actions and choices that are the basis for our two-stage model of free will.

The existence (perhaps metaphysical) of alternative possibilities explains how both new ideas and new species arise by chance, as a consequence of quantum indeterminism.

Neurobiologists question the usefulness of quantum indeterminism in the brain and mind. But it is the sometimes random firing of particular neurons and their subsequent wiring together that records an individual's experiences - experiences distinctly different in ways that contribute to every unique "self," to what it's like to be me.

Faced with a new experience, the experience recorder and reproducer (ERR) causes some neurons to "play back" those encoded past experiences that are similar in some way to the current experience. The "playback" is complete with the emotions that were attached to the original experiences. Memory of and learning from diverse past experiences provides the context that adds "meaning" to the current experience. The number of past experiences recalled may be very large.

William James described this as a "blooming, buzzing confusion." He called for us to focus attention on the alternative possibilities in his "stream of consciousness." These possibilities are the past experiences of the audience members whose hands are raised in Bernard Baars's "Theater of Consciousness," that give them something relevant to add to the conversation.

Some information enthusiasts claim that information is the fundamental stuff of the universe. It is not. The universe is fundamentally composed of discrete particles of matter and energy. Information describes the arrangement of the matter. Where the arrangement is totally random, there is no information. The organized information in living things has a purpose, to survive and to increase.

Information is the form in all discriminable concrete objects as well as the content in non-existent, merely possible, thoughts and other abstract entities. Information is the disembodied, de-materialized essence of anything.

Perhaps the most amazing thing about information philosophy is its discovery that abstract and immaterial information can exert an influence over concrete matter, explaining how mind can move body and how our thoughts can control our actions - a power deeply related to the mystery of how the quantum wave function (randomly) controls the probabilities of locating quantum particles.

It is immaterial information in the collapse of the two-particle wave function Ψ12 that ensures perfectly correlated measurements no matter how far entangled particles are separated.

But the random generation of alternative possibilities for thoughts and actions does not mean that our deliberations themselves are random, provided that the deliberative choice of one possible action is adequately determined.

For example, compatibilist philosophers argue against free-will libertarians that if there is a random element involved in the generation of possible actions, agents would have no control over such actions and cannot be held morally responsible. But on the contrary, if an agent chooses to flip a coin to decide on an action, that choice can still be a deliberative act, and the agent can accept full responsibility for choosing either outcome of the coin flip.

Information philosophy goes beyond a priori logic and its puzzles, beyond analytic language and its paradoxes, beyond philosophical claims of necessary truths, to a contingent physical world that is best represented as made of dynamic, interacting information structures.

The creation of new information structures exposes the error of determinism. In a deterministic universe there is no increase of information. All the past, present, and future information is present to the eyes of a super-intelligence, as Pierre-Simon Laplace argued.

Isaac Newton's classical mechanical laws of motion are not only deterministic, they are reversible in time. It is believed by many that if time could be reversed, the entire universe would proceed back in time to its earliest state, like a motion picture played backwards.

Information philosophy has discovered the origin of irreversibility in the early work on quantum mechanics by Albert Einstein. Quantum indeterminism and irreversibility in turn contribute to the origin of information structures, which we have found in the work of Arthur Stanley Eddington and David Layzer. Thirdly, quantum indeterminism and the creation of information structures are the bases for our two-stage model of free will, which we trace back to the thought of William James.

Information is said by some to be a conserved quantity, just like matter and energy. This is not the case. Determinism is a false belief, originating either in the tragic idea that an omniscient and omnipotent God completely controls every event, or in the Newtonian idea that unbreakable laws of nature do so, so that there can be no human freedom in a completely determined world.

Indeed, belief in determinism is the modern residue of the traditional belief in an overarching explanation - a determinative reason - for everything.

Knowledge can be defined as information in minds - a partial isomorphism of the information structures in the external world. Information philosophy is a correspondence theory.

Sadly, there is no isomorphism, no information in common, between words and objects. As the great Swiss linguist and structuralist philosopher Ferdinand de Saussure pointed out, the connection between most signifiers (words and other symbols) and the things signified (objects and concepts) is arbitrary, a connection established only by cultural convention. This arbitrariness accounts for much of the failure of analytic language philosophy in the past century.

Although language is an excellent tool for human communications, it is arbitrary, ambiguous, and ill-suited to represent the world directly. Human languages cannot "picture" reality, despite the hopes of the logical positivists and the early Ludwig Wittgenstein.

Information is the lingua franca of the universe.

The extraordinarily sophisticated connections between words and objects are made in human minds, mediated by the brain's experience recorder and reproducer (ERR). Words stimulate neurons to start firing and to play back those experiences that include relevant objects.

Neurons that were wired together in our earliest experiences fire together at later times, contextualizing our new experiences and giving them meaning. And by replaying emotional reactions to similar earlier experiences, this makes them "subjective experiences," giving us the feeling of "what it's like to be me" and solving the "hard problem" of consciousness.

Beyond words, a dynamic information model of an information structure in the world is presented immediately to the mind as a simulation of reality experienced for itself.

Without words and related experiences previously recorded in our mental experience recorders, we could not comprehend words. They would be mere noise, with no meaning.


By comparison, a diagram, a photograph, an animation, or a moving picture can be seen and mostly understood by human beings, independent of their native tongue.

The basic elements of information philosophy are dynamic models of information structures. They go far beyond logic and language as a representation of the fundamental, metaphysical, nature of reality.

Visual and interactive models "write" directly into our mental experience recorders.

Computer animated models can incorporate all the laws of nature, from the differential equations of quantum physics to the myriad information processes of biology.

Computer simulations not only embody our most accurate knowledge of the physical world, they are among the best teaching tools ever devised. We can transfer knowledge non-verbally to coming generations across most of the world's population via the Internet and nearly ubiquitous smartphones.

Consider the dense information in Drew Berry's real-time animations of molecular biology. These are the kinds of dynamic models of information structures that we believe can best explain the fundamental nature of reality - "beyond logic and language."

If you think about it, everything you know is pure abstract information. Everything you are is an information structure, a combination of matter and energy that embodies, communicates, and most important, processes your information. Everything that you value contains information.

And while the atoms, molecules, and cells of your body are important, many only last a few minutes and most are completely replaced in just a few years. But your immaterial information, from your original DNA to your latest experiences, will be with you for your lifetime.

You are a creator of new information, part of the cosmic creation process. Your free will depends on your unique ability to create freely generated thoughts, multiple ideas in your mind as alternative possibilities for your willed decisions and responsible actions.

Anyone with a serious interest in philosophy should understand how information is created and destroyed, because information is much more fundamental than the logic and language tools philosophers use today. Information philosophy goes "beyond logic and language."

Information is the sine qua non of meaning. This I-Phi website aims to provide a deep understanding of information that should be in every philosopher's toolbox.

We will show why information should actually be the preferred basis for the critical analysis of current problems in a wide range of disciplines - from information creation in cosmology to information in quantum physics, from information in biology (especially evolution) to psychology, where it offers a solution to the classic mind-body problem and the problem of consciousness. And of course in philosophy, where failed language analysis can be replaced or augmented by immaterial information analysis as a basis for justified knowledge, objective values, human free will, and a surprisingly large number of problems in metaphysics.

Above all, information philosophy hopes to replace beliefs with knowledge. Instead of the primitive idea of an other-worldly creator, we propose a comprehensive explanation of the creation of this world that has evolved into the human creativity that invents such ideas.
The "miracle of creation" is happening now, in the universe and in you and by you.

A Personal Note
As this information philosopher approaches his end of life - of physical and material life, of mental and ideal life, and especially of creative life - it is unlikely that I will get to finish five more books on information philosophy, some of whose draft material is found on this website.

So I added a brief summary of my forthcoming fifth book on information philosophy, Mind: The Experience Recorder and Reproducer, to the 61st version of my I-Phi home page. (You can see the past twenty years of versions of this page preserved in the SkyBuilders TimeLines persistent archive, as suggested to me and my son Derek by Tim Berners-Lee in the late 1990s. You can also see the version of any of the site's 2500 pages at a particular date and time by pressing the @ key and selecting a time.)

This page began with the fundamental question of information philosophy - how is information created? We answered that information structures were created shortly after the universe's origin by a two-step process: first, the opening of new possible cells in phase space caused by the universe's expansion; second, the adequately determined four physical forces (strong and weak nuclear, electromagnetic, and gravitational) forming atoms, stars, and galaxies. This happened despite the second law of thermodynamics, and despite the universe being initially in a state of chaos and disorder, at the maximum entropy for that time.

We now know that the entropy of the early universe was much smaller than the actual entropy today. More importantly, today's maximum possible entropy is much larger still, making room for the large amount of negative entropy (information) in today's universe.

Our Mind book proposes a new model for how the human mind has come to know the cosmic creative process. Our model stands in opposition to the dominant mind model for the past eighty years in psychology and cognitive science, namely the computational theory of mind.

We will argue that wo/man is not a machine and the brain is not a (digital) computer. The metaphor that mind can be viewed as immaterial software running in the brain's material (biological) hardware is attractive but flawed.

We will show how and where information is stored in the brain and how it is recalled, but without address buses connecting to digital data storage and without central or parallel processors "operating" on the data. We will show how the stored information is not represented as viewable syntactical structures. There are no "neural computations" on "neural representations." Instead, past experiences are reproduced or re-presented, with all the "wired-together" neurons of an experience firing again. We call our mind model the Experience Recorder and Reproducer.

See the entries under the Mind menu for further details...

I want to point out that many ideas I now think of as my own had their origins in my reading the works of all the hundreds of philosophers and scientists in the left navigation column. I've been most fortunate to have had the time to read them all. My work on cosmic creation started with ideas of David Layzer and Arthur Stanley Eddington. Work on my two-stage model of free will began with Eddington and with William James. My theory that the universe geometry started flat and has always been flat depended on the critical observations of Walter Baade. I saw the origin of irreversibility in the statistical mechanics of Ludwig Boltzmann's "molecular disorder" and the quantum randomness in Albert Einstein's directions of photon emission. And the critical idea that information in the mind is stored in neurons that have been wired together by our past experiences depended on the insights of neurophysiologist Donald Hebb. I owe so much to them all.

But what is information? How is it created? Why is it a better tool for examining philosophical problems than traditional logic or linguistic analysis? And what are some examples of classic problems in philosophy, in physics, and in metaphysics with information philosophy solutions?

What problems has information philosophy solved?
Why has philosophy made so little progress? Perhaps it's because philosophers prefer problems, while scientists seek solutions? Must a philosophical problem once solved become science and leave philosophy? Bertrand Russell thought so.

Russell said:

"as soon as definite knowledge concerning any subject becomes possible, this subject ceases to be called philosophy, and becomes a separate science...while those only to which, at present, no definite answer can be given, remain to form the residue which is called philosophy."
This information philosopher thinks not.

In order for problems to remain philosophical, interested philosophers should consider our proposed information-based solutions as part of the philosophical dialogue.

Among the proposed solutions to classic philosophical problems are:

Information analysis also makes significant progress on a number of the classic problems in metaphysics, many of these virtually unchanged since they were identified as puzzles and paradoxes over two millennia ago, such as The Statue and Lump of Clay, The Ship of Theseus, Dion and Theon, or Tibbles, the Cat, The Growing Problem, The Debtor's Paradox, The Problem of the Many, and The Sorites Problem.

Among the metaphysical problems with suggested information philosophy solutions are:

It also turns out that the methodology of information philosophy can be productively applied to some outstanding problems in physics. Philosophers of science might take an interest in the proposed information-based solutions to these problems in the "foundations" of physics.

What is information?

A common definition of information is the act of informing - the communication of knowledge from a sender to a receiver that informs (literally shapes) the receiver. Often used as a synonym for knowledge, information traditionally implies that the sender and receiver are human beings, but many animals clearly communicate. Information theory studies the communication of information.

Information philosophy extends that study to the communication of information content between material objects, including how it is changed by energetic interactions with the rest of the universe.

We call a material object with information content an information structure. While information is communicated between inanimate objects, they do not process information, which we will show is the defining characteristic of living beings and their artifacts.

The sender of information need not be a person, an animal, or even a living thing. It might be a purely material object, a rainbow, for example, sending color information to your eye.

The receiver, too, might be merely physical, a molecule of water in that rainbow that receives too few photons and cools to join the formation of a crystal snowflake, increasing its information content.

Information theory, the mathematical theory of the communication of information, says little about meaning in a message, which is roughly the use to which the information received is put. Information philosophy extends the information flows in human communications systems and digital computers to the natural information carried in the energy and material flows between all the information structures in the observable universe.

A message that is certain to tell you something you already know contains no new information. It does not increase your knowledge, or reduce the uncertainty in what you know, as information theorists put it.

If everything that happens was certain to happen, as determinist philosophers claim, no new information would ever enter the universe. Information would be a universal constant. There would be "nothing new under the sun." Every past and future event could in principle be known by a god-like super-intelligence with access to that fixed totality of information (Laplace's Demon).
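The zero-information content of a certain message can be made precise with Shannon's standard formula (this is textbook information theory, not specific to this site): a message of probability p carries log2(1/p) bits, so a message that was certain to arrive (p = 1) carries no information at all.

```python
import math

def self_information(p: float) -> float:
    """Bits of new information in receiving a message of probability p."""
    return math.log2(1 / p)

print(self_information(1.0))    # a certain message: 0.0 bits, nothing new
print(self_information(0.5))    # a fair coin flip: 1.0 bit
print(self_information(1 / 8))  # a 1-in-8 event: 3.0 bits
```

The rarer the message, the more it tells you; a message you already knew was coming reduces no uncertainty, exactly as the paragraph above says.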

Physics tells us that the total amount of mass and energy in the universe is a constant. The conservation of mass and energy is a fundamental law of nature. Some mathematical physicists, including some leading figures, erroneously think that information should also be a conserved quantity, a constant of nature.

But information is neither matter nor energy, though it needs matter to be embodied and available energy to be communicated. Information can be created and destroyed. The material universe creates it. The biological world creates it and utilizes it. Above all, human minds create, process, and preserve abstract information, the sum of human knowledge that distinguishes humanity from all other biological species and that provides the extraordinary power humans have over our planet, for better or for worse.

Information is the modern spirit, the ghost in the machine, the mind in the body. It is the soul, and when we die, it is our information that perishes. The matter remains.

We propose information as an objective value, the ultimate sine qua non.

Information philosophy claims that man is not a machine and the brain is not a computer. Living things process information in ways far more complex, if not faster, than the most powerful information processing machines. What biological systems and computing systems have in common is the processing of information, as we must explain.

Whereas machines are assembled, living things assemble themselves. Both are information structures - patterns through which matter and energy flow, thanks to flows of negative entropy (available energy) coming from the Sun and the expanding universe. And both can create new information, build new structures, and maintain their integrity against the destructive influence of the second law of thermodynamics with its increasing positive entropy, or disorder.

Biological evolution began when the first molecule replicated itself, that is, duplicated the information it contained. But duplication is mere copying. Biological reproduction is a much more sophisticated process in which the germ or seed information of a new living thing is encoded in a data or information structure (a genetic code) that can be communicated to processing systems that produce another instance of the given genus and species.

Ontologically random imperfections in the processing systems, along with the deliberate introduction of random noise (for example, in sexual recombination), produce the variations that are selected by evolution based on their reproductive success. Errors are not restricted to the genetic code; they occur throughout the development of each individual, up to the present.

Cultural evolution is the creation and communication of new information that adds to the sum of human knowledge. The creation and evolution of information processing systems in the universe has culminated in minds that can understand and reflect on what we call the cosmic creation process.

How is information created?

Ex nihilo, nihil fit, said the ancients: nothing comes from nothing. But information is no (material) thing. Information is physical, but it is not material. Information is a property of matter. It is the form that matter can take. We can thus create something (immaterial) from nothing! But we shall find that it takes a special kind of energy (free or available energy, with negative entropy) to do so, because it involves the rearrangement of matter.

Energy transfer to or from an object increases or decreases the heat in the object. Entropy transfer does not change the heat content; it represents only a different organization or distribution of the matter in the body. Increasing entropy represents a loss of organization or order - or, more precisely, of information. Maximum entropy is maximum disorder and minimal information.
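The relation between entropy, disorder, and information sketched above can be made concrete with Shannon's entropy formula (a standard illustration in information-theoretic terms, not a claim about thermodynamic entropy per se): a uniform distribution over possible arrangements has maximum entropy and no structure, while a concentrated, ordered distribution has low entropy.

```python
import math

def shannon_entropy(probs) -> float:
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [1 / 8] * 8            # total disorder over 8 possible arrangements
ordered = [0.93] + [0.01] * 7    # almost all weight on one arrangement

print(shannon_entropy(uniform))  # 3.0 bits: the maximum for 8 states
print(shannon_entropy(ordered))  # well under 1 bit: low entropy, high order
```

The difference between the maximum possible entropy and the actual entropy of an arrangement is a measure of its negative entropy, the quantity this page identifies with information.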

As you read this sentence, new information is (we hope) being encoded/embodied in your mind/brain. Permanent changes in the synapses between your neurons store the new information. New synapses are made possible by free energy and material flows in your metabolic system, a tiny part of the negative entropy flows that are coursing throughout the universe. Information philosophy will show you how these tiny mental flows allow you to comprehend and control at least part of the cosmic information flows in the universe.

Cosmologists know that information is being created because the universe began some thirteen billion years ago in a state of minimal information. The "Big Bang" started with the most elementary particles and radiation. How matter formed into information structures, first elementary particles, then atoms, then the galaxies, stars, and planets, is the beginning of a story that ends with human minds emerging to understand our place in the universe.

The relation between matter and information is straightforward. The embodied information is the organization or arrangement of the matter plus the laws of nature that describe the motions of matter in terms of the fundamental forces that act between all material particles.

The relation between information and energy is more complex, and has led to confusion about how to apply mathematical information theory to the physical and biological sciences. Material systems in an equilibrium state are maximally disordered, have maximum entropy, no negative entropy, and no information other than the bulk parameters of the system.

In the case of the universe, the initial parameters were very few, the amount of radiant energy (the temperature) and the number of elementary particles (quarks, gluons, electrons, and photons) per unit volume, and the total volume (infinite?). These parameters, and their changes (as a function of time, as the temperature falls) are all the information needed to describe a statistically uniform, isotropic universe and its evolution.

Information philosophy will explain the process of information creation in three fundamental realms - the purely material, the biological, and the mental.

The first information creation was a kind of "order out of chaos," when matter in the early universe opened up spaces allowing gravitational attraction to condense otherwise randomly distributed matter into highly organized galaxies, stars, and planets. It was the expansion - the increasing space between material objects - that drove the universe away from thermodynamic equilibrium (maximum entropy and disorder) and in some places created negative entropy, a quantitative measure of orderly arrangements that is the basis for all information.

Purely material objects react to one another following laws of nature, but they do not in an important sense create or process the information that they contain. It was the expansion, moving faster than the re-equilibration time, and the four natural forces, especially gravitation, that were responsible for the new structures.

A qualitatively different kind of information creation was when the first molecule on earth to replicate itself went on to duplicate its information exponentially. Here the prototype of life was the cause for the creation of the new information structure. Accidental errors in the duplication provided variations in replicative success. Most important, besides creating their information structures, biological systems are also information processors. Living things use information to guide their actions.

With the appearance of life, agency and purpose appeared in the universe, although some philosophers hold that life gives us only the "appearance of purpose."

The third process of information creation, and the most important to philosophy, is human creativity. Almost every philosopher since philosophy began has considered the mind as something distinct from the body. Information philosophy can now explain that distinction. The mind can be considered the immaterial information in the brain. The brain, part of the material body, is a biological information processor. The stuff of mind is the information being processed and the new information being created. As some philosophers have speculated,
mind is the software in the brain hardware.

Most material objects are passive information structures.

Living things are information structures that actively process information. They communicate it between their parts to build, maintain, and repair their (material) information structure, through which matter and energy flow under the control of the information structure itself.

Resisting the second law of thermodynamics locally, living things increase entropy globally much faster than non-living things. But most important, living things increase their information content as they develop. Humans learn from their experiences, storing knowledge in an experience recorder and reproducer (ERR).

Mental things (ideas) are pure abstractions from the material world, but they have control (downward causation) over the material and biological worlds. This enables agent causality. Human minds create information structures, but their unique creation is the collection of abstract ideas that constitute the sum of human knowledge. It is these ideas that give humanity extraordinary control over the material and biological worlds.

It may come as a surprise to many thinkers to learn that the physics involved in the creation of all three types of information - material, biological, and mental - includes the same two-step sequence of quantum physics and thermodynamics at the core of the cosmic creation process.

The most important information created in a mind is a recording of an individual's experiences (sensations). Recordings are played back (automatically and perhaps mostly unconsciously) as a guide to evaluate future actions (volitions) in similar situations. The particular past experiences reproduced are those stored in the brain located near elements of the current experience (association of ideas).
Just as neurons that fire together wire together, neurons that have been wired together will later fire together.
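The playback-by-association described above can be caricatured in a few lines of code. This is only an illustrative toy, not the ERR model itself: the feature sets, the overlap measure, and all the names below are assumptions introduced for the sketch.

```python
# Toy sketch (illustrative, not the actual ERR model): experiences are
# recorded with their contextual features, and playback reproduces past
# experiences that share features with the current situation.

def record(memory, features, outcome):
    """Store an experience as a (set of context features, outcome) pair."""
    memory.append((frozenset(features), outcome))

def reproduce(memory, current_features, threshold=1):
    """Play back outcomes of past experiences sharing features with the present."""
    current = set(current_features)
    return [outcome for feats, outcome in memory
            if len(feats & current) >= threshold]

memory = []
record(memory, {"hot", "stove"}, "pain")
record(memory, {"sweet", "fruit"}, "pleasure")

# A new situation containing "stove" evokes the stored "pain" experience.
print(reproduce(memory, {"stove", "kitchen"}))  # ['pain']
```

The shared-feature lookup plays the role of "neurons wired together firing together": whatever was stored alongside an element of the current experience is reproduced with it.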

Sensations are recorded as the mental effects of physical causes.
Sensations are stored as retrievable information in the mind of an individual self. Recordings include not only the five afferent senses but also the internal emotions - feelings of pleasure, pain, hopes, and fears - that accompany an experience. They constitute "what it's like" for a particular being to have an experience.

Volitions are the mental causes of physical effects.
Volitions begin with 1) the reproduction of past experiences that are similar to the current experience. These become thoughts about possible actions and the (partly random) generation of other alternative possibilities for action. They continue with 2) the evaluation of those freely generated thoughts followed by a willful selection (sometimes habitual) of one of those actions.

Volitions are followed by 3) new sensations coming back to the mind indicating that the self has caused the action to happen (or not). This feedback is recorded as further retrievable information, reinforcing the knowledge stored in the mind that the individual self can cause this kind of action (or sometimes not).
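The generate-then-evaluate sequence in steps 1) and 2) can be sketched as a small loop. This is a toy illustration under stated assumptions, not a model of the brain: the option names, the random-number generator standing in for chance variation, and the scoring function are all invented for the example.

```python
# Toy sketch of a two-stage volition: partly random generation of
# alternatives, followed by an adequately determined evaluation and
# selection. All names and the value function are illustrative assumptions.
import random

def generate_alternatives(reproduced, rng, n_novel=2):
    """Stage 1: reproduced past options plus partly random novel alternatives."""
    novel = [f"novel-option-{i}" for i in range(n_novel)]  # stand-ins for chance variation
    rng.shuffle(novel)
    return reproduced + novel

def evaluate_and_select(alternatives, value):
    """Stage 2: a determined evaluation wills one of the freely generated options."""
    return max(alternatives, key=value)

rng = random.Random(42)
past = ["take umbrella", "stay home"]
options = generate_alternatives(past, rng)
choice = evaluate_and_select(options, value=lambda a: 1 if a == "take umbrella" else 0)
print(choice)  # take umbrella
```

The point of the sketch is the division of labor: chance enters only in the generation stage, while the selection stage is determined by the agent's evaluations.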

Many philosophers and most scientists have held that all knowledge is based on experience. Experience is ultimately the product of human sensations, and sensations are just electrical and chemical interactions with human skin and sense organs. But what of knowledge that is claimed to be mind-independent and independent of experience?

Why is information better than logic and language for solving philosophical problems?

Broadly speaking, modern philosophy has been a search for truth, for a priori, analytic, certain, necessary, and provable truth.

But all these concepts are mere ideas, invented by humans, some aspects of which have been discovered to be independent of the minds that invented them, notably formal logic and mathematics. Logic and mathematics are systems of thought, inside which the concept of demonstrable (apodeictic) truth is useful, but with limits set by Kurt Gödel's incompleteness theorems. The truths of logic and mathematics appear to exist "outside of space and time." Gottfried Leibniz called them "true in all possible worlds," meaning their truth is independent of the physical world. We call them a priori because their proofs are independent of experience, although they were initially abstracted from concrete human experiences.

Analyticity is the idea that some statements, propositions in the form of sentences, can be true by the definitions or meanings of the words in the sentences. This is correct, though limited by verbal difficulties such as Russell's paradox and numerous other puzzles and paradoxes. Analytic language philosophers claim to connect the words with objects, material things, and thereby tell us something about the world. Some modal logicians (cf. Saul Kripke) claim that words that are names of things are necessary a posteriori, "true in all possible worlds." But this is nonsense, because we invented all those words and worlds. They are mere ideas.

Perhaps the deepest of all these philosophical ideas is necessity. Information philosophy can now tell us that there is no such thing as absolute necessity. There is of course an adequate determinism in the macroscopic world that explains the appearance of deterministic laws of nature, of cause and effect, for example. This is because macroscopic objects consist of vast numbers of atoms and their individual random quantum events average out. But there is no metaphysical necessity. At the fundamental microscopic level of material reality, there is an irreducible contingency and indeterminacy. Everything that we know, everything we can say, is fundamentally empirical, based on factual evidence, the analysis of experiences that have been recorded in human minds.

So information philosophy is not what we can logically know about the world, nor what we can analytically say about the world, nor what is necessarily the case in the world. There is nothing that is the case that is necessary and perfectly determined by logic, by language, or by the physical laws of nature. Our world and its future are open and contingent, with possibilities that are the source of new information creation in the universe and source of human freedom.

For the most part, philosophers and scientists do not believe in ontological possibilities, despite their invented "possible worlds," which are on inspection merely multiple "actual worlds." They are "actualists." This is because they cannot accept the idea of ontological chance. They hope to show that the appearance of chance is the result of human ignorance, that chance is merely an epistemic phenomenon.

Now chance, like truth, is just another idea, just some more information. But what an idea! In a self-referential virtuous circle, it turns out that without the real possibilities that result from ontological chance, there can be no new information. Information philosophy offers cosmological and biological evidence for the creation of new information in the universe. So it follows that chance is real, fortunately something that we can keep under control. We are biological beings that have evolved, thanks to chance, from primitive single-cell communicating information structures to multi-cellular organisms whose defining aspect is the creation and communication of information.

The theory of communication of information is the foundation of our "information age." To understand how we know things is to understand how knowledge represents the material world of embodied "information structures" in the mental world of immaterial ideas.

All knowledge starts with the recording of experiences. The experiences of thinking, perceiving, knowing, believing, feeling, desiring, deciding, and acting may be bracketed by philosophers as "mental" phenomena, but they are no less real than other "physical" phenomena. They are themselves physical phenomena.
They are just not material things.

Information philosophy defines human knowledge as immaterial information in a mind, or embodied in an external artifact that is an information structure (e.g., a book), part of the sum of all human knowledge. Information in the mind about something in the external world is a proper subset of the information in the external object. It is isomorphic to a small part of the total information in or about the object. The information in living things, artifacts, and especially machines, consists of much more than the material components and their arrangement (positions over time). It also consists of all the information processing (e.g., messaging) that goes on inside the thing as it realizes its entelechy or telos, its internal or external purpose.

All science begins with information gathered from experimental observations, which are themselves mental phenomena. Observations are experiences recorded in minds. So all knowledge of the physical world rests on the mental. All scientific knowledge is information shared among the minds of a community of inquirers. As such, science is a collection of thoughts by thinkers, immaterial and mental, some might say fundamental. Recall Descartes' argument that the experience of thinking is that which for him is the most certain.

Information philosophy is not the philosophy of information (the intersection of computer science, information science, information technology, and philosophy), just as linguistic philosophy - the idea that linguistic analysis can solve (or dis-solve) philosophical problems - is not the philosophy of language. Compare the philosophy of mathematics, philosophy of biology, etc.
The analysis of language, particularly the analysis of philosophical concepts, which dominated philosophy in the twentieth century, has failed to solve the most ancient philosophical problems. At best, it claims to "dis-solve" some of them as conceptual puzzles. The "problem of knowledge" itself, traditionally framed as "justified true belief," is recast by information philosophy as the degree of isomorphism between the information in the physical world and the information in our minds. Information psychology can be defined as the study of this isomorphism.

We shall see how information processes in the natural world use arbitrary symbols (e.g., nucleotide sequences) to refer to something, to communicate messages about it, and to give the symbol meaning in the form of instructions for another process to do something (e.g., create a protein). These examples provide support for both theories of meaning as reference and meaning as use.
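The nucleotide example can be made concrete with a fragment of the standard genetic code, in which arbitrary triplet symbols (codons) both refer to amino acids and serve as instructions for the process that assembles a protein. Only a few real codon assignments are shown; the function names are illustrative.

```python
# Minimal sketch of arbitrary symbols carrying meaning as instructions:
# a codon is an arbitrary symbol whose "meaning" is the amino acid that
# another process (the ribosome) is instructed to add. Only a handful of
# entries from the standard genetic code are included here.

CODON_TABLE = {
    "AUG": "Met",   # methionine; also the start signal
    "UUU": "Phe",   # phenylalanine
    "GGC": "Gly",   # glycine
    "UAA": "STOP",  # stop signal: an instruction, not a referent
}

def translate(mrna):
    """Read an mRNA string three symbols at a time into an amino acid sequence."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino = CODON_TABLE.get(mrna[i:i+3], "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("AUGUUUGGCUAA"))  # ['Met', 'Phe', 'Gly']
```

The same symbol illustrates both theories of meaning at once: "AUG" refers to methionine, and its use by the translation machinery is what gives it that meaning.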

Note that just as language philosophy is not the philosophy of language, so information philosophy is not the philosophy of information. It is rather the use of information as a tool to study philosophical problems, some of which are today yielding tentative solutions. It is time for philosophy to move beyond logical puzzles and language games.

The Fundamental Question of Information Philosophy
Our fundamental philosophical question is cosmological and ultimately metaphysical.

What are the processes that create emergent information structures in the universe?

More simply,

How is information created in spite of the second law of thermodynamics?

Given the second law of thermodynamics, which says that any system will over time approach a thermodynamic equilibrium of maximum disorder or entropy, in which all information is lost, and given the best current model for the origin of the universe, which says everything began in a state of thermodynamic equilibrium some 13.75 billion years ago, how can it be that living beings are creating and communicating vast amounts of new information every day?

Why are we not still in that original state of equilibrium?

Broadly speaking, there are three major phenomena or processes that can reduce the entropy locally, while of course increasing it globally to satisfy the second law of thermodynamics. Two of these do it "blindly," the third does it with a built-in "purpose," or "telos."

  1. Universal Gravitation
  2. Quantum Cooperative Phenomena (e.g., crystallization, the formation of atoms and molecules)
  3. Life
None of these processes can work unless they have a way to get rid of the positive entropy (disorder) and leave behind a pocket of negative entropy (order or information). The positive entropy is either conducted, convected, or radiated away as waste matter and energy, as heat, or as pure radiation. At the quantum level, it is always the result of interactions between matter and radiation (photons). Whenever photons interact with material particles, the outcomes are inherently unpredictable. As Albert Einstein discovered ten years before the founding of quantum mechanics, these interactions involve irreducible ontological chance.

Negative entropy is an abstract thermodynamic concept that describes energy with the ability to do work, to make something happen. This kind of energy is often called free energy or available energy.

In a maximally disordered state (called thermodynamic equilibrium) there can be matter in motion, the motion we call heat. But the average properties - density, pressure, temperature - are the same everywhere. Equilibrium is formless. Departures from equilibrium are when the physical situation shows differences from place to place. These differences are information.

The second law of thermodynamics then simply means that isolated systems will eliminate differences from place to place until all properties are uniformly distributed. Natural processes spontaneously destroy information. Consider the classic case of what happens when we open a perfume bottle.

In the late nineteenth century, Ludwig Boltzmann revolutionized thermodynamics with his kinetic theory of gases, based on the ancient assumption that matter is made up of collections of atoms. He derived a mathematical formula for entropy S as a function of the probabilities of finding a system in all the possible microstates of a system. When the actual macrostate is one with the largest number W of microstates, entropy is at a maximum, and no differences (information) are visible.

Boltzmann could not prove his "H-Theorem" about entropy increase. His contemporaries challenged a "statistical" entropy increase on grounds of microscopic reversibility and macroscopic recurrence (both problems solved by information philosophy). He could not prove the existence of atoms.

In the early twentieth century, just before Boltzmann died, Albert Einstein formulated a statistical mechanics that put Boltzmann's law of increasing entropy on a firmer mathematical basis. Einstein's work predicted the size of the minuscule fluctuations around equilibrium that Boltzmann had expected. Einstein showed that entropy does not, in fact, continually increase: it can decrease randomly in short bursts of locally higher density or organized motion. Though these fluctuations are quickly extinguished, Einstein showed that the occasionally correlated motions of invisible atoms explain the visible "Brownian motion" of tiny particles like pollen grains.

Einstein's calculations led to predictions that were confirmed quickly, proving the existence of discrete atoms that had been hypothesized for centuries. Sadly, Boltzmann may not have known of Einstein's proofs for his work. Later Einstein saw the same fluctuations in radiation, proving his revolutionary hypothesis of light quanta, now called photons. Although this is rarely appreciated, it was Einstein who showed that both matter and energy are discrete, discontinuous particles. His most famous equation shows they are convertible into one another, E = mc². He also showed that the interaction of matter and radiation, of atoms and photons, always involves ontological chance. This bothered Einstein greatly, because he thought his God should not "play dice."

Late in life, Einstein said that if matter and energy cannot be described by the local continuous analytical functions in space and time needed for his field theories, then all his work would be "castles in the air." But the loss of classical deterministic ideas - which have ossified much of philosophy, crippling philosophical progress - is more than offset by the indeterminism of an open future and Einstein's belief in the "free creation of new ideas."

In the middle twentieth century, Claude Shannon derived the mathematical formula for the communication of information. John von Neumann found it to be identical to Boltzmann's formula for entropy, though with a minus sign (negative entropy). Where Boltzmann entropy is the number of possible microstates, Shannon entropy is the number of possible messages that can be communicated.

Shannon found that new information cannot be created unless there are multiple possible messages. This in turn depends on the ontological chance discovered by Einstein. In a deterministic universe, the total information at all times would be a constant. Information would be a conserved quantity, like matter and energy. "Nothing new under the Sun." But it is not constant, though many philosophers, mathematical physicists, and theologians (God's foreknowledge) still think so. Information is being created constantly in our universe. And we are co-creators of the information, including Einstein's "new ideas."

Because "negative" entropy (order or information) is such a positive quantity, we chose in the 1970s to give it a new name - "Ergo," and to call the four phenomena or processes that create negative entropy "ergodic," for reasons that will become clear. But today, the positive name "information" is all that we need to do information philosophy.

Answering the Fundamental Question of Information Philosophy
How exactly has the universe escaped from the total disorder of thermodynamic equilibrium and produced a world full of information?

It begins with the expansion of the universe. If the universe had not expanded, it would have remained in the original state of thermodynamic equilibrium. We would not be here.

To visualize the departure from equilibrium that made us possible, remember that equilibrium is when particles are distributed evenly in all possible locations in space, and with their velocities distributed by a normal law - the Maxwell-Boltzmann velocity distribution. (The combination of position space and velocity or momentum space is called phase space). When we open the perfume bottle, the molecules now have a much larger phase space to distribute into. There are a much larger number of phase space "cells" in which molecules could be located. It of course takes them time to spread out and come to a new equilibrium state (the Boltzmann "relaxation time.")

When the universe expands, say grows to ten times its volume, it is just like the perfume bottle opening. The matter particles must redistribute themselves to get back to equilibrium. But suppose the universe expands much faster than the particles can re-equilibrate (the relaxation time). The universe is then out of equilibrium, and in a flat, ever-expanding universe it will never get back!
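The perfume-bottle picture can be made quantitative with the textbook ideal-gas result for free expansion: when the accessible volume grows by some ratio, the maximum possible entropy grows by N k ln(V₂/V₁). The particle count and volume ratio below are illustrative numbers, not figures from the text.

```python
# Maximum entropy gain when the accessible volume grows (ideal-gas free
# expansion): Delta S = N * k * ln(V2 / V1). Until the particles "relax"
# into the new volume, this is the gap between possible and actual entropy.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_increase(n_particles, volume_ratio):
    """Entropy made available by a volume_ratio expansion of phase space."""
    return n_particles * k_B * math.log(volume_ratio)

# Illustrative: a tenfold volume increase for a mole-sized particle count.
dS = entropy_increase(6.022e23, 10.0)
print(f"{dS:.2f} J/K")  # ~19.14 J/K, i.e. N k ln 10
```

The gap between this new maximum and the actual entropy is exactly the negative entropy (potential information) that the later sections attribute to the expansion.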

In the earliest moments of the universe, material particles were in equilibrium with radiation at extraordinarily high temperatures. When quarks formed neutrons and protons, they were short-lived, blasted back into quarks by photon collisions. As the universe expanded, the temperature cooled, the space per photon increased, and the mean free time between photon collisions increased, giving larger particles a better chance to survive. The expansion red-shifted the photons, decreasing the average energy per photon and eventually reducing the number of high-energy photons that dissociate matter. The mean free path of photons was still very short; they were being scattered by collisions with electrons.

When the temperature declined further, to about 5000 degrees, some 400,000 years after the "Big Bang," the electrons and protons combined to make hydrogen and (with neutrons) helium atoms.

At this time, a major event occurred that we can still see today, the farthest and earliest event visible. When the electrons combined into atoms, the electrons could no longer scatter the photons so easily. The universe became transparent for the photons. Some of those photons are still arriving at the earth today. They are now the red-shifted and cooled down cosmic microwave background radiation. While this radiation is almost perfectly uniform, it shows very small fluctuations that may be caused by random differences in the local density of the original radiation or even by random quantum fluctuations.
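The cooling described here follows a simple rule: radiation temperature falls in inverse proportion to the cosmic scale factor, T = T₀/a. Taking the text's own figures (roughly 5000 degrees at recombination, roughly 3 K today) gives the implied stretch factor; the function below is just this one-line relation.

```python
# The expansion red-shifts photons: temperature scales as T = T0 / a.
# Using the temperatures given in the text (about 5000 degrees when atoms
# formed, about 3 K for today's microwave background), the universe must
# have stretched by roughly 5000/3 since the photons last scattered.

def radiation_temperature(t_initial, scale_factor):
    """Photon temperature after the universe expands by scale_factor."""
    return t_initial / scale_factor

stretch = 5000 / 3  # expansion factor implied by the text's two temperatures
print(round(stretch))  # 1667
print(round(radiation_temperature(5000, stretch), 6))  # 3.0
```

This is the sense in which the microwave background is "red-shifted and cooled down": the photons themselves have not changed, but the expanded space has stretched each one's wavelength by the same factor.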

These fluctuations mean that there were slight differences in the density of the newly formed hydrogen gas clouds. The force of universal gravitation then worked to pull relatively formless matter into spherically symmetric stars and planets. This is the original "order out of chaos," although that phrase is now most associated with work on deterministic chaos theory and complexity theory, as we shall see.

How information creation and negative entropy flows appear to violate the second law of thermodynamics
In our open and rapidly expanding universe, the maximum possible entropy (if the particles were "relaxed" into a uniform distribution among the new phase-space cells) is increasing faster than the actual entropy. The difference between maximum possible entropy and the current entropy is called negative entropy. There is an intimate connection between the physical quantity negative entropy and abstract immaterial information, first established by Leo Szilard in 1929.

Two of our "ergodic" phenomena - gravity and quantum cooperative phenomena - pull matter together that was previously separated. Galaxies, stars, and planets form out of inchoate clouds of dust and gas. Gravity binds the matter together. Subatomic particles combine to form atoms. Atoms combine to form molecules. They are held together by quantum mechanics. In all these cases, a new visible information structure appears.

In order for these structures to stay together, the motion (kinetic) energy of their parts must be radiated away. This is why the stars shine. When atoms join to become molecules, they give off photons. The new structure is now in a (negative) bound energy state. It is the radiation that carries away the positive entropy (disorder) needed to balance the new order (information) in the visible structure.

In the cases of chaotic dissipative structures and life, the ergodic phenomena are more complex, but the result is similar, the emergence of visible information. (More commonly it is simply the maintenance of high-information, low-entropy structures.) These cases appear in far-from-equilibrium situations where there is a flow of matter and energy with negative entropy through the information structure. The flow comes in with low entropy but leaves with high entropy. Matter and energy are conserved in the flow, but information in the structure can increase. Remember, information is not a conserved quantity like matter and energy.

Information is neither matter nor energy, though it uses matter when it is embodied and energy when it is communicated. Information is the immaterial arrangement of the matter and energy.

This vision of life as a visible form through which matter and free energy flow was first seen by Ludwig von Bertalanffy in 1939, though it was made more famous by Erwin Schrödinger's landmark essay What Is Life? in 1944, where he claimed that "life feeds on negative entropy."

Both Bertalanffy and Schrödinger knew that the source of negative entropy was our Sun. Neither knew that the ultimate cosmological source of negative entropy is the expansion of the universe, which allowed ergodic gravitational forces to form the Sun. Note that the positive-entropy radiation leaving the Sun becomes diluted as it expands, creating a difference between its color temperature and its energy-content temperature. This difference is information (negative entropy) that planet Earth uses to generate and maintain biological life.

Note that the 300K (the average earth temperature) photons are dissipated into the dark night sky, on their way to the cosmic microwave background. The Sun-Earth-night sky is a heat engine, with a hot energy source and cold energy sink, that converts the temperature difference not into mechanical energy (work) but into biological energy (life).
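The heat-engine picture can be put in entropy terms: each joule arrives as a few high-temperature photons (low entropy) and leaves as many low-temperature photons (high entropy), so the Earth exports net entropy with every joule that flows through it. The temperatures below follow the text's figures (roughly 5000 K sunlight, roughly 300 K re-radiation); the per-joule formula is the standard Q/T relation.

```python
# Sketch of the Sun-Earth-night-sky "heat engine" in entropy terms:
# entropy carried per joule is 1/T, so the net entropy exported per joule
# flowing through the Earth is 1/T_sink - 1/T_source. Temperatures are
# the approximate figures used in the text.

def entropy_export_per_joule(t_source, t_sink):
    """Net entropy (J/K) carried away per joule absorbed and re-radiated."""
    return 1.0 / t_sink - 1.0 / t_source

s = entropy_export_per_joule(5000.0, 300.0)
print(f"{s:.5f} J/K per joule")  # ~0.00313
```

That exported entropy is what pays, many times over, for the local order (information) that life builds and maintains.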


When new information is created and embodied in a physical structure, two physical processes must occur.

Our first process is what John von Neumann described as
irreversible Process 1.
The first process is the collapse of a quantum-mechanical wave function into one of the possible states in a superposition of states, which happens in any measurement process. A measurement produces one or more bits of information. Such quantum events involve irreducible indeterminacy and chance, but less often noted is the fact that quantum physics is directly responsible for the extraordinary temporal stability and adequate determinism of most information structures.

We can call the transfer of positive entropy, which stabilizes the new information from Process 1, Process 1b.
The second process is a local decrease in the entropy (which appears to violate the second law of thermodynamics) corresponding to the increase in information. Entropy greater than the information increase must be transferred away from the new information, ultimately to the night sky and the cosmic background, to satisfy the second law.

Given this new stable information, to the extent that the resulting quantum system can be approximately isolated, the system will deterministically evolve according to von Neumann's Process 2, the unitary time evolution described by the Schrödinger equation.

The first two physical processes (1 and 1b) are parts of the information solution to the "problem of measurement," to which must be added the role of the "observer." We shall see that the observer involves a mental Process 3.

The discovery and elucidation of the first two as steps in the cosmic creation process casts light on some classical problems in philosophy and physics, since it is the same two-step process that creates new biological species and explains the freedom and creativity of the human mind.

The cosmic creation process generates the conditions without which there could be nothing of value in the universe, nothing to be known, and no one to do the knowing. Information itself is the ultimate sine qua non.


The Three Kinds of Information Emergence
Note there are three distinct kinds of emergence:
  1. the order out of chaos when the randomly distributed matter in the early universe first gets organized into information structures.

This was not possible before the first atoms formed, about 400,000 years after the Big Bang. Information structures like the stars and galaxies did not exist until about 400 million years after it. As we saw, gravitation was the principal driver creating information structures.

    Nobel prize winner Ilya Prigogine discovered another ergodic process that he described as the "self-organization" of "dissipative structures." He popularized the slogan "order out of chaos" in an important book. Unfortunately, the "self" in self-organization led to some unrealizable hopes in cognitive psychology. There is no self, in the sense of a person or agent, in these physical phenomena.

    Both gravitation and Prigogine's dissipative systems produce a purely physical/material kind of order. The resulting structures contain information. There is a "steady state" flow of information-rich matter and energy through them. But they do not process information. They have no purpose, no "telos."

Order out of chaos can explain how these emergent structures exert downward causation on their atomic and molecular components. But this is a gross kind of downward causal control. Explaining life and mind as "complex adaptive systems" has not been successful. We need to go beyond "chaos and complexity" theories to teleonomic theories.

  2. the order out of order when the material information structures form self-replicating biological information structures. Some become information processing systems.

    In his famous essay, "What Is Life?," Erwin Schrödinger noted that life "feeds on negative entropy" (or information). He called this "order out of order."

    This kind of biological processing of information first emerged about 3.5 billion years ago on the earth. It continues today on multiple emergent biological levels, e.g., single-cells, multi-cellular systems, organs, etc., each level creating new information structures and information processing systems not reducible to (caused by) lower levels and exerting downward causation on the lower levels.

    And this downward causal control is extremely fine, managing the motions and arrangements of individual atoms and molecules.

    Biological systems are cognitive systems, using internal "subjective" knowledge to recognize and interact with their "objective" external environment, communicating meaningful messages to their internal components and to other individuals of their species with a language of arbitrary symbols, taking actions to maintain themselves and to expand their populations by learning from experience.

    With the emergence of life, "purpose" also entered the universe. It is not the pre-existent "teleology" of many idealistic philosophies (the idea of "essence" before "existence"), but it is the "entelechy" of Aristotle, who saw that living things have within them a purpose, an end, a "telos." To distinguish this evolved telos in living systems from teleology, modern biologists use the term "teleonomy."

  3. the pure information out of order when organisms with minds generate, store (in the brain), replicate, utilize, and then externalize some non-biological information, communicating it to other minds and storing it in the environment. Communication can be by hereditary genetic transmission or by an advanced organism capable of learning and then teaching its contemporaries directly by signaling, by speaking, or indirectly by writing and publishing the knowledge for future generations.

    This kind of information can be highly abstract mind-stuff, pure Platonic ideas, the stock in trade of philosophers. It is neither matter nor energy (though embodied in the material brain), a kind of pure spirit or ghost in the machine. It is a candidate for the immaterial dualist "substance" of René Descartes, though it is probably better thought of as a "property dualism," since information is an immaterial property of all matter.

    The information stored in the mind is not only abstract ideas. It contains a recording of the experiences of the individual. In principle every experience may be recorded, though not all may be reproducible/recallable.

The negative entropy (order, or potential information) generated by the universe expansion is a tiny amount compared to the increase in positive entropy (disorder). Sadly, this is always the case when we try to get "order out of order," as can be seen by studying entropy flows at different levels of emergent phenomena.

In any process, the positive entropy increase is always at least equal to, and generally orders of magnitude larger than, the negative entropy in any created information structures, to satisfy the second law of thermodynamics. The positive entropy is named for Boltzmann, since it was his "H-Theorem" that proved entropy can only increase overall - the second law of thermodynamics. And the negative entropy is named for Shannon, since his theory of information communication has exactly the same mathematical formula as Boltzmann's famous principle:

S = k log W,

where S is the entropy, k is Boltzmann's constant, and W is the number of microstates corresponding to the given macrostate of the system.
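The Boltzmann-Shannon correspondence can be checked numerically: for W equally probable microstates (or messages), Shannon's H = -Σ p log₂ p reduces to log₂ W, the same logarithm of a count that appears in S = k log W; the constant k and the base of the logarithm are just unit choices.

```python
# Numerical check that Boltzmann's S = k ln W and Shannon's
# H = -sum(p log2 p) are the same formula up to units, for a uniform
# distribution over W microstates/messages.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w):
    """S = k ln W for W equally probable microstates."""
    return k_B * math.log(w)

def shannon_entropy(probabilities):
    """H = -sum(p log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

W = 16
uniform = [1.0 / W] * W
print(shannon_entropy(uniform))                    # 4.0 bits = log2(16)
print(boltzmann_entropy(W) / (k_B * math.log(2)))  # also 4.0, after unit conversion
```

Dividing out k and converting the logarithm base turns thermodynamic entropy directly into bits, which is the identity von Neumann pointed out to Shannon.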

Material particles are the first information structures to form in the universe. They are quarks, baryons, and atomic nuclei, which eventually combine with electrons to form atoms and then molecules, when the falling temperature becomes low enough. These material particles are attracted by the force of universal gravitation to form the gigantic information structures of the galaxies, stars, and planets.

Microscopic quantum mechanical particles and huge self-gravitating systems are stable and have extremely long lifetimes, thanks in large part to quantum stability. Stars are another source of radiation, after the original Big Bang cosmic source, which has cooled down to 3 degrees Kelvin (3°K) and shines as the cosmic microwave background radiation.

Our solar radiation has a high color temperature (5000K) and a low energy-content temperature (273K). It is out of equilibrium and it is the source of all the information-generating negative entropy that drives biological evolution on the Earth. Note that the fraction of the light falling on Earth is less than a billionth of that which passes by and is lost in space.

A tiny fraction of the solar energy falling on the earth gets converted into the information structures of plants and animals. Most of it gets converted to heat and is radiated away as waste energy to the night sky.

Every biological structure is a quantum mechanical structure. Quantum cooperative phenomena allow DNA to maintain its stable information structure over billions of years in the constant presence of chaos and noise. And biological structures contain astronomical numbers of particles, allowing them to average over the random noise of individual quantum events, becoming "adequately determined."

The stable information content of a human being survives many changes in the material content of the body during a person’s lifetime. Only with death does the mental information (spirit, soul) dissipate - unless it is saved somewhere.

The total mental information in a living human is orders of magnitude less than the information content and information processing rate of the body. But the cultural information structures created by humans outside the body, in the form of external knowledge like this book, and the enormous collection of human artifacts, now rival the total biological information content.


The Shannon Principle - No Information Without Possibilities
In his development of the mathematical theory of the communication of information, Claude Shannon showed that there can be no new information in a message unless there are multiple possible messages. If only one message is possible, there is no information in that message.

We can simplify this to define the Shannon Principle. No new information can be created in the universe unless there are multiple possibilities, only one of which can become actual.
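Shannon's formula makes the principle quantitative. A minimal sketch: the information in a set of possible messages is H = -Σ p log₂ p bits, which is zero when only one message is possible and grows with the number of equally likely alternatives.

```python
import math

def shannon_bits(probabilities):
    """Shannon information H = -sum(p * log2 p), in bits."""
    h = -sum(p * math.log2(p) for p in probabilities if p > 0)
    return h + 0.0  # + 0.0 normalizes IEEE negative zero to plain 0.0

print(shannon_bits([1.0]))         # 0.0 -- one certain message: no information
print(shannon_bits([0.5, 0.5]))    # 1.0 -- two equal possibilities: one bit
print(shannon_bits([0.25] * 4))    # 2.0 -- four possibilities: two bits
```

With a single possible message the probability is 1, its logarithm is 0, and no new information can be communicated, exactly as the Shannon Principle states.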

An alternative statement of the Shannon principle is that in a deterministic system, information is conserved, unchanging with time. Classical mechanics is a conservative system that conserves not only energy and momentum but also the total information. Information is a "constant of the motion" in a deterministic world.

Quantum mechanics, by contrast, is indeterministic. It involves irreducible ontological chance.

An isolated quantum system is described by a wave function ψ which evolves - deterministically - according to the unitary time evolution of the linear Schrödinger equation,

(ih/2π) ∂ψ/∂t = Hψ,

where H is the Hamiltonian (energy) operator.

The possibilities of many different outcomes evolve deterministically, but the individual actual outcomes are indeterministic.

This sounds a bit contradictory, but it is not. It is the essence of the highly non-intuitive quantum theory, which combines a deterministic "wave" aspect with an indeterministic "particle" aspect.

In his 1932 Mathematical Foundations of Quantum Mechanics, John von Neumann explained that two fundamentally different processes are going on in quantum mechanics (in a temporal sequence for a given particle - not at the same time).

  1. Process 1. A non-causal process, in which the measured electron winds up randomly in one of the possible physical states (eigenstates) of the measuring apparatus plus electron.

    The probability for each eigenstate is given by the squared modulus |cn|² of the coefficients cn in the expansion of the original system state (wave function ψ) in an infinite set of wave functions φn that represent the eigenfunctions of the measuring apparatus plus electron.

    cn = < φn | ψ >

    This is as close as we get to a description of the motion of the "particle" aspect of a quantum system. According to von Neumann, the particle simply shows up somewhere as a result of a measurement.

    Information physics says that the particle shows up whenever a new stable information structure is created, information that can be observed.

    Process 1b. The information created in von Neumann's Process 1 will only be stable if an amount of positive entropy greater than the negative entropy in the new information structure is transported away, in order to satisfy the second law of thermodynamics.

  2. Process 2. A causal process, in which the electron wave function ψ evolves deterministically according to Schrödinger's equation of motion for the "wave" aspect. This evolution describes the motion of the probability amplitude wave ψ between measurements. The wave function exhibits interference effects. But interference is destroyed if the particle has a definite position or momentum. The particle path itself can never be observed.
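The two processes can be contrasted in a toy two-state system. In this hedged sketch (the particular rotation unitary and starting state are illustrative assumptions, not von Neumann's own example), Process 2 is a deterministic unitary evolution of the possibilities, while Process 1 picks one actual outcome at random with the Born-rule probabilities |cn|²:

```python
import math
import random

def evolve(state, theta):
    """Process 2: deterministic unitary evolution (a 2x2 rotation)."""
    a, b = state
    return (math.cos(theta) * a - math.sin(theta) * b,
            math.sin(theta) * a + math.cos(theta) * b)

def measure(state, rng=random):
    """Process 1: indeterministic choice of eigenstate n, probability |c_n|^2."""
    probs = [abs(c) ** 2 for c in state]
    return 0 if rng.random() < probs[0] else 1

psi = (1.0, 0.0)                 # start in eigenstate |0>
psi = evolve(psi, math.pi / 6)   # now a superposition: c0 = cos 30°, c1 = sin 30°
print([round(abs(c) ** 2, 3) for c in psi])  # [0.75, 0.25] -- the possibilities
print(measure(psi))              # 0 or 1, at random -- the single actuality
```

Running `evolve` repeatedly always gives the same state (deterministic, information-preserving); running `measure` gives different outcomes on different runs (indeterministic, information-creating).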

Von Neumann claimed there is another major difference between these two processes. Process 1 is thermodynamically irreversible. Process 2 is in principle reversible. This confirms the fundamental connection between quantum mechanics and thermodynamics that is explainable by information physics.

Information physics establishes that process 1 may create information. It is always involved when information is created.

Process 2 is deterministic and information preserving.

The first of these processes has come to be called the collapse of the wave function.

It gave rise to the so-called problem of measurement, because its randomness prevents it from being a part of the deterministic mathematics of process 2.

But isolation is an ideal that can only be approximately realized. Because the Schrödinger equation is linear, a wave function | ψ > can be a linear combination (a superposition) of another set of wave functions | φn >,

| ψ > = Σn cn | φn >,

where the squared moduli |cn|² of the coefficients are the probabilities of finding the system in the possible state | φn > as the result of an interaction with another quantum system.

|cn|² = |< φn | ψ >|².

Quantum mechanics introduces real possibilities, each with a calculable probability of becoming an actuality, as a consequence of one quantum system interacting (for example colliding) with another quantum system.

It is quantum interactions that lead to new information in the universe - both new information structures and information processing systems. But that new information cannot subsist unless a compensating amount of entropy is transferred away from the new information.

Even more important, it is only in cases where information persists long enough for a human being to observe it that we can properly describe the observation as a "measurement" and the human being as an "observer." So, following von Neumann's "process" terminology, we can complete his admittedly unsuccessful attempt at a theory of the measuring process by adding an anthropomorphic

Process 3 - a conscious observer recording new information in a mind. This is only possible if the local reductions in the entropy (the first in the measurement apparatus, the second in the mind) are both balanced by even greater increases in positive entropy that must be transported away from the apparatus and the mind, so the overall change in entropy can satisfy the second law of thermodynamics.
An Information Interpretation of Quantum Mechanics
Our emphasis on the importance of information suggests an "information interpretation" of quantum mechanics that eliminates the need for a conscious observer, as in the "standard orthodox" Copenhagen Interpretation. An information interpretation also dispenses with the need for a separate "classical" measuring apparatus.

There is only one world, the quantum world.
We can say it is ontologically indeterministic, but epistemically deterministic, because human ignorance of individual microscopic events hides the underlying chance.
Information physics claims there is only one world, the quantum world, and the "quantum to classical transition" occurs for any large macroscopic object that contains a large number of atoms. In this case, independent quantum events are "averaged over," and the uncertainty in position and momentum of the object becomes less than the observational accuracy, since Δv Δx ≈ h / m, and h / m goes to zero as the mass m becomes large.

The classical laws of motion, with their implicit determinism and strict causality, emerge when microscopic events can be ignored.
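This "averaging over" is just the law of large numbers. A minimal simulation (with standard normal noise standing in for microscopic fluctuations) shows the mean of N independent random events shrinking toward zero roughly as 1/√N - the statistical root of "adequate determinism":

```python
import random
import statistics

random.seed(0)  # fixed seed for reproducibility

def mean_of_noise(n):
    """Average of n independent microscopic fluctuations (mean 0, std 1)."""
    return statistics.fmean(random.gauss(0.0, 1.0) for _ in range(n))

# The residual fluctuation of the average shrinks as roughly 1/sqrt(n).
for n in (100, 10_000, 1_000_000):
    print(n, abs(mean_of_noise(n)))
```

A macroscopic object with ~10²³ atoms averages over so many events that its residual "quantum noise" is far below any observational accuracy.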

Information philosophy interprets the wave function ψ as a "possibilities" function. With this simple change in terminology, the mysterious process of a wave function "collapsing" becomes a much more intuitive discussion of possibilities, with mathematically calculable probabilities, turning into a single actuality, faster than the speed of light.

Information physics is standard quantum physics. It accepts the Schrödinger equation of motion, the principle of superposition, the axiom of measurement (now including the actual information "bits" measured), and - most important - the projection postulate of standard quantum mechanics (the "collapse" so many interpretations deny).

But a conscious observer is not required for a projection, for the wave-function "collapse", for one of the possibilities to become an actuality. What it does require is an interaction between (quantum) systems that creates irreversible information.


In less than two decades of the mid-twentieth century, the word information was transformed from a synonym for knowledge into a mathematical, physical, and biological quantity that can be measured and studied scientifically.

In 1929, Leo Szilard connected an increase in thermodynamic (Boltzmann) entropy with any increase in information that results from a measurement, solving the problem of "Maxwell's Demon," a thought experiment suggested by James Clerk Maxwell, in which a local reduction in entropy is possible when an intelligent being interacts with a thermodynamic system.
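Szilard's connection between information and entropy has a concrete price tag: each bit acquired (or erased, in Landauer's later formulation) is tied to at least kT ln 2 of entropy exchange with the environment. A quick calculation:

```python
import math

# Minimum energy associated with one bit of information at temperature T:
# E = k * T * ln 2 (Szilard 1929; cf. Landauer's erasure principle).
k_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # room temperature, K

energy_per_bit = k_B * T * math.log(2)
print(f"{energy_per_bit:.2e} J")  # about 2.87e-21 J per bit at 300 K
```

The number is tiny, which is why the Demon seems plausible; but it is strictly nonzero, which is why the Demon cannot beat the second law.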

In the early 1940s, digital computers were invented by von Neumann, Shannon, Alan Turing, and others. Their machines could run a stored program to manipulate stored data, processing information, as biological organisms had been doing for billions of years.

Then in the late 1940s, the problem of communicating digital data signals in the presence of noise was first explored by Shannon, who developed the modern mathematical theory of the communication of information. Norbert Wiener wrote in his 1948 book Cybernetics that "information is the negative of the quantity usually defined as entropy," and in 1949 Leon Brillouin coined the term "negentropy."

Finally, in the early 1950s, inheritable characteristics were shown by Francis Crick, James Watson, and George Gamow to be transmitted from generation to generation in a digital code.


Information is Immaterial
Information is neither matter nor energy, but it needs matter for its embodiment and energy for its communication.

A living being is a form through which passes a flow of matter and energy (with low or negative entropy). Genetic information is used to build the information-rich matter into an information-processing structure that contains a very large number of hierarchically organized information structures.

All biological systems are cognitive, using their internal information structure to guide their actions. Even some of the simplest organisms may learn from experience. The most primitive minds are experience recorders and reproducers.

In humans, the information-processing structures create new actionable information (knowledge) by consciously and unconsciously reworking and reusing the experiences stored in the mind.

Emergent higher levels exert downward causation on the contents of the lower levels, ultimately supporting mental causation and free will.

When ribosomes assemble about 140 amino acids into each of four polypeptide chains (globins), and each globin traps an iron atom in a heme group at its center, the result is the hemoglobin protein. This is downward causal control of the amino acids, the heme groups, and the iron atoms by the ribosome. The ribosome is an example of Erwin Schrödinger's emergent "order out of order," life "feeding on the negative entropy" of digested food.

Notice the absurdity of the idea that the random motions of the transfer RNA molecules, each holding a single amino acid, are carrying pre-determined information about where they belong in the protein being built.

Determinism is an emergent property and an ideal philosophical concept, unrealizable except approximately in the kind of adequate determinism that we experience in the macroscopic world, where the determining information is part of the higher-level control system.

The total information in multi-cellular living beings can develop to be many orders of magnitude more than the information present in the original cell. The creation of this new information would be impossible for a deterministic universe, in which information is constant.

Immaterial information is perhaps as close as a physical or biological scientist can get to the idea of a soul or spirit that departs the body at death. When a living being dies, it is the maintenance of biological information that ceases. The matter remains.

Biological systems are different from purely physical systems primarily because they create, store, and communicate information. Living things store information in a memory of the past that they use to shape their future. Fundamental physical objects like atoms have no history.

And when human beings export some of their personal information to make it a part of human culture, that information moves closer to becoming immortal.

Human beings differ from other animals in their extraordinary ability to communicate information and store it in external artifacts. In the last decade the amount of external information per person may have grown to exceed an individual's purely biological information.

Since the 1950's, the science of human behavior has changed dramatically from a "black box" model of a mind that started out as a "blank slate" conditioned by environmental stimuli. Today's mind model contains many "functions" implemented with stored programs, all of them information structures in the brain. The new "computational model" of cognitive science likens the brain to a computer, with some programs and data inherited and others developed as appropriate reactions to experience.

The Experience Recorder and Reproducer
The brain is not a digital computer doing symbolic logic, with one or more central processing units addressing multiple data storage systems. It is more like a multi-channel and multi-track experience recorder and reproducer with an extremely high data rate. Information about an experience - the sights, sounds, smells, touch, and taste - is recorded along with the emotions - feelings of pleasure, pain, hopes, and fears - that accompany the experience. When confronted with similar experiences later, the brain can reproduce or re-present information about the original experience (an instant replay) that helps to guide current actions.

The ERR model stands in contrast to the popular cognitive science or “computational” model of a mind as a digital computer. No algorithms, data addressing schemes, or stored programs are needed for the ERR model.

The physical metaphor is a non-linear random-access data recorder, where data is stored using content-addressable memory (the memory address is the data content itself). Simpler than a computer with stored algorithms, a better technological metaphor might be a video and sound recorder, enhanced with the ability to record - and replay - smells, tastes, touches, and critically essential, feelings.

The biological model is neurons that wire together during an organism’s experiences, in multiple sensory and limbic systems, such that later firing of even a part of the wired neurons can stimulate firing of all or part of the original complex.

A conscious being is constantly recording information about its perceptions of the external world, and most importantly for ERR, it is simultaneously recording its feelings. Sensory data such as sights, sounds, smells, tastes, and tactile sensations are recorded in a sequence along with pleasure and pain states, fear and comfort levels, etc.

All these experiential and emotional data are recorded in association with one another. This means that when the experiences are reproduced (played back in a temporal sequence), the accompanying emotions are once again felt, in synchronization.

The ability to reproduce an experience is critical to learning from past experiences, so as to make them guides for action in future experiences. The ERR model is the minimal mind model that provides for such learning by living organisms.

The ERR model does not need computer search, retrieval, and decision algorithms to reproduce past experiences. All that is required is that relevant past experiences “play back” whenever they are stimulated by present experiences that resemble the past experiences in one or more ways.
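The content-addressing idea can be sketched in a few lines. In this hypothetical toy (the experiences, features, and feelings are invented for illustration), recall is nothing but overlap between a present cue and stored feature sets; no index, search algorithm, or central processor intervenes:

```python
# Toy ERR-style recall: experiences are stored as (features, feeling) pairs.
# A present cue "plays back" every past experience sharing any feature --
# the memory address is the content itself.
experiences = [
    ({"smoke", "heat", "crackle"}, "fear"),
    ({"bread", "warmth", "kitchen"}, "comfort"),
    ({"smoke", "bread", "toaster"}, "alarm"),
]

def replay(cue):
    """Reproduce all experiences whose features overlap the present cue."""
    return [(feat, feeling) for feat, feeling in experiences if feat & cue]

for feat, feeling in replay({"smoke"}):
    print(sorted(feat), feeling)  # both smoke-related episodes, with feelings
```

Note that the feelings come back automatically with the sensory features, because they were recorded in association, and that a single shared feature suffices to stimulate playback.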

All or most of these relevant past experiences appear before the mind as alternative possibilities for evaluation as thoughts and actions. Decisions can be made based on the relative values of past outcomes.

Neuropsychologist Donald Hebb's insight that "neurons that fire together wire together" is widely accepted today. The ERR model of information philosophy is built on the simple consequence of Hebb's work that "neurons that have been wired together will fire together."

Neuroscientists and philosophers of mind have long asked how diverse signals from multiple locations in the brain over multiple pathways appear so unified in the brain. The ERR model offers a simple solution to this “binding” problem. Experiences are bound at their initial recording. They do not have to be re-associated by some central processing unit looking up where experiences may have been distributed among the various memory or sensory motor areas of the brain.

The ERR model may also throw some light on the problem of "qualia" and of "what it's like to be" a particular organism.

Information Philosophy and Modern Philosophy
Modern philosophy is a story about the discovery of timeless truths, laws of nature, a block universe in which the future is a logical and physical extension of the past. A primordial moment of creation is assumed to start a causal chain in which the entire future can be foreknown by an omniscient being.

Modern philosophy seeks knowledge in logical reasoning with clear and unchanging concepts. Its guiding lights are thinkers like Parmenides, Plato, and Kant, who sought unity and identity, being and universals.

Traditional, Modern, and Postmodern: In a traditional society, authoritative knowledge is that which has been handed down. Moderns are those who think that all knowledge must be based on reason. Postmoderns recognize that much knowledge has been invented, arbitrarily created.
In modern philosophy, the total amount of information in the conceptually closed universe is static, a physical constant of nature. The laws of nature allow no exceptions, they are perfectly causal. Everything that happens is said to have a physical cause. This is called "causal closure".   Chance and change - in a deep philosophical sense - are said to be illusions. If every event has a predetermined cause, or reason, even free will is an illusion.

Information philosophy, by contrast, is a story about invention, about novelty, about biological emergence and new beginnings unseen and unseeable beforehand, a past that is fixed but an ambiguous future that can be shaped by teleonomic changes in the present.

Its model thinkers are Heraclitus, Protagoras, Aristotle, and Hegel, for whom time, place, and particular situations mattered.

Information philosophy is built on probabilistic laws of nature. The fundamental challenge for information philosophy is to explain the emergence of stable information structures from primordial and ever-present chaos, to account for the phenomenal success of deterministic laws when the material substrate of the universe is irreducibly chaotic, noisy, and random, and to understand the concepts of truth, necessity, and certainty in a universe of chance, contingency, and indeterminacy.

Determinism and the exceptionless causal and deterministic laws of classical physics are the real illusions. Determinism is information-preserving. In an ideal deterministic Laplacian universe, the present state of the universe is implicitly contained in its earliest moments. There is "nothing new under the sun."

This ideal determinism does not exist. The "adequate determinism" behind the laws of nature emerged from the early years of the universe when there was only the indeterministic chaos of "thermodynamic equilibrium" and its maximal entropy or disorder.

In a random noisy environment, how can anything be regular and appear determined? It is because the macroscopic consequences of the law of large numbers average out microscopic quantum fluctuations to provide us with a very adequate determinism for large objects.

Information Philosophy is an account of continuous information creation, a story about the origin and evolution of the universe, of life, and of intelligence from an original quantal chaos that is still present in the microcosmos. More than anything else, it is the creation and maintenance of stable information structures, despite the destructive entropic requirements of the second law of thermodynamics. Creation of living information structures distinguishes biology from physics and chemistry.

Living things store useful information in a memory of the past that they can use to shape the future. The "meaning" in the information is their use of it. Some get their information "built-in" via heredity. Some learn it from experience. Others invent it!

Ancient philosophy, before the advent of medieval theology with Thomas Aquinas and John Duns Scotus, and medieval philosophy, before the beginning of modern philosophy with René Descartes, covered the same wide range of questions now addressable by information philosophy.

The Development of Information Philosophy
Our earliest work on information philosophy dates from the 1950's, based on suggestions made thirty years earlier by Arthur Stanley Eddington. In his 1928 Nature of the Physical World, Eddington argued that quantum indeterminacy had "opened the door of human freedom," and that the second law of thermodynamics might have some bearing on the question of objective good.

In the 1950's, we studied the then leading philosophies of positivism and existentialism.

Bertrand Russell, with the help of G. E. Moore, Alfred North Whitehead, and Ludwig Wittgenstein, proposed logic and language as the proper foundational basis, not only of philosophy, but also of mathematics and science. Wittgenstein's Tractatus imagined that a set of all true propositions could capture all the knowledge of modern science.

4.11 The totality of true propositions is the whole of natural science
        (or the whole corpus of the natural sciences)

Their logical positivism and the variation called logical empiricism developed by Rudolf Carnap and the Vienna Circle proved to be failures in grounding philosophy, mathematics, or science.

On the continent, existentialism was the rage. We read Friedrich Nietzsche, Martin Heidegger, and Jean-Paul Sartre.

The existentialist continentals argued that freedom exists, but there are no objective values. The utilitarian English argued that values exist, but human freedom does not.

We wrote that "Values without freedom are useless. Freedom without values is absurd."

This was a chiasmus like the great figure of Immanuel Kant, rephrased by Charles Sanders Peirce as "Idealism without Materialism is Empty. Materialism without Idealism is Blind."

In the 1960's, we formulated arguments that cited "pockets of low entropy," in apparent violation of the second law, as the possible basis for anything with objective value. We puzzled over the origin of "negative entropy," since the universe was believed to have started in thermodynamic equilibrium and the second law of thermodynamics says that (positive) entropy can only increase.

In the late 1960's, we developed a two-stage model of free will and called it Cogito, a term often associated with the mind and with thought. With deference to Descartes, the first modern philosopher, we called "negative entropy" Ergo. While thermodynamics calls it "negative," information philosophy sees it as the ultimate "positive" and deserving of a better name. We thought that Ergo etymologically suggests a fundamental kind of energy ("erg" zero), e.g., the "Gibbs free energy," G0, that is available to do work because it has low entropy.

In the early 70's, we decided to call the sum of human knowledge the Sum, to complete the triple wordplay on Descartes' proof of his existence.

We saw a great battle going on in the universe - between originary chaos and emergent cosmos. The struggle is between destructive chaotic processes that drive a microscopic underworld of random events versus constructive cosmic processes that create information structures with extraordinary emergent properties that include adequately determined scientific laws - despite, and in many cases making use of, the microscopic chaos.

Since the destructive chaos is entropic, we repurposed a term from statistical mechanics and called the anti-entropic processes creating information structures ergodic. The embedded Ergod resonated.

Created information structures range from galaxies, stars, and planets, to molecules, atoms, and subatomic particles. They are the structures of terrestrial life from viruses and bacteria to sentient and intelligent beings. And they are the constructed ideal world of thought, of intellect, of spirit, including the laws of nature, in which we humans play a role as co-creator.

Information is constant in a deterministic universe. There is "nothing new under the sun." The creation of new information is not possible without the random chance and uncertainty of quantum mechanics, plus the extraordinary temporal stability of quantum mechanical structures.

It is of the deepest philosophical significance that information is based on the mathematics of probability. If all outcomes were certain, there would be no "surprises" in the universe. Information would be conserved and a universal constant, as some mathematicians and physicists mistakenly believe. Information philosophy requires the ontological uncertainty and probabilistic outcomes of modern quantum physics to produce new information.

But at the same time, without the extraordinary stability of quantized information structures over cosmological time scales, life and the universe we know would not be possible. That stability is the consequence of an underlying digital nature. Quantum mechanics reveals the architecture of the universe to be discrete rather than continuous, to be digital rather than analog. Digital information transfers are essentially perfect. All analog transfers are "lossy."
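The contrast between lossy analog and essentially perfect digital transfer can be simulated directly. In this sketch (noise level and generation count are arbitrary illustrative choices), every copy adds a little Gaussian noise; the digital copy is re-quantized to the nearest discrete level each generation, so errors never accumulate:

```python
import random

random.seed(1)  # fixed seed for reproducibility

def noisy(x, sigma=0.02):
    """One generation of copying, with a little Gaussian noise added."""
    return x + random.gauss(0.0, sigma)

analog, digital = 1.0, 1.0
for _ in range(1000):
    analog = noisy(analog)            # analog: every generation's noise is kept
    digital = round(noisy(digital))   # digital: re-quantized to 0 or 1 each copy
print(round(analog, 2), digital)      # analog has drifted; digital is still 1
```

The quantized copy survives a thousand generations unchanged because the noise never approaches the half-unit threshold, which is the same reason quantized information structures are stable over cosmological time.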

Moreover, the "correspondence principle" of quantum mechanics and the "law of large numbers" of statistics ensure that macroscopic objects can normally average out microscopic uncertainties and probabilities to provide the "adequate determinism" that shows up in all our "Laws of Nature."

Information philosophy explores some classical problems in philosophy with deeper and more fundamental insights than are possible with the logic and language approach of modern analytic philosophy.

By exploring the origins and evolution of structure in the universe, information philosophy transcends humanity and even life itself, though it is not a mystical metaphysical transcendence.

Information philosophy uncovers the creative process working in the universe
to which we owe our existence, and therefore perhaps our reverence for its "providence".

Information philosophy locates the fundamental source of all values not in humanity ("man the measure"), not in bioethics ("life the ultimate good"), but in the origin and evolution of information in the cosmos.

Information philosophy is an idealistic philosophy, a process philosophy, and a systematic philosophy, the first in many decades. It provides important new insights into the Kantian transcendental problems of epistemology, ethics, freedom of the will, god, and immortality, as well as the mind-body problem, consciousness, and the problem of evil.

In physics, information philosophy (or information physics) provides new insights into the problem of measurement, the paradox of Schrödinger's Cat, and the two paradoxes of microscopic reversibility and macroscopic recurrence that Josef Loschmidt and Ernst Zermelo used to criticize Ludwig Boltzmann's explanation of the entropy increase required by the second law of thermodynamics. Finally, information provides a better understanding of the entanglement and nonlocality phenomena that are the basis for modern quantum cryptography and quantum computing. More recently, we have been developing an explanation for the puzzling phenomenon of entanglement.

Finally, a new philosophy of biology should be based on the deep understanding of organisms as information users, information creators, information communicators, and, at the higher levels, information processors, including humans, who have learned to store information externally and transfer it between the generations culturally. Except for organisms that extract information by photosynthesis from the negative entropy (free or available energy) streaming from the sun, most living things destroy other cells to extract the information needed to maintain their own low-entropy state of organization. Most life feeds on other life.

And most life communicates with other life. Even single cells, before the emergence of multicellular organisms, developed communication systems between the cells that are still visible in slime molds and social amoebae today. In a multicellular organism, every cell has some level of communication with all the others. Most higher level organisms share communal information that makes them stronger as a social group than as independent individuals. The sum of human knowledge has amplified the power of humanity, for better or worse, to a level that can control the environmental conditions on all of planet Earth.

Information biology is the hypothesis that all biological evolution should be viewed primarily as the development of more and more powerful users, creators, and communicators of information. Seen through the lens of information, humans are the current end product of these information processing systems. With the emergence of life and mind, purpose (telos) appeared in the universe. The teleonomic goal of each cell is to become two cells, which replicates its information content. The purpose of each species is to improve its reproductive success relative to other populations. The purpose of human populations then is to use, to add to, and to communicate human knowledge in order to maximize the human capital per person.

Like love, the information that is shared by educating others is not used up. Information is not a scarce economic good. The more that information is communicated, the more of it there is, in human minds (not brains), and in the external stores of human knowledge. These include books of course, but in the future they will be the interconnected knowledge bases of the world wide web, including www.informationphilosopher.com, since books are expensive and inaccessible for many.

The first thing we must do for the young is to teach them how to teach themselves by accessing these knowledge systems with handheld devices that will some day be available for all the world's children, beyond one laptop per child to one smartphone per child.


Based on insights into the discovery of the cosmic creation process, the Information Philosopher proposes three primary ideas that are new approaches to perennial problems in philosophy. They are likely to change some well-established philosophical positions. Even more important, they may reconcile idealism and materialism and provide a new view of how humanity fits into the universe.

The three ideas are

  • An explanation or epistemological model of knowledge formation and communication. Knowledge and information are neither matter nor energy, but they require matter for expression and energy for communication. They seem to be metaphysical.
    Briefly, we identify knowledge with actionable information in the brain-mind. We justify knowledge by behavioral studies that demonstrate the existence of information structures implementing functions in the brain. And we verify knowledge scientifically.

  • A basis for objective value, a metaethics beyond humanism and bioethics, grounded in the fundamental information creation processes behind the structure and evolution of the universe and the emergence of life.
    Briefly, we find positive value (or good) in information structures. We see negative value (or evil) in disorder and entropy tearing down such structures. We call energy with low entropy "Ergo" and call anti-entropic processes "ergodic." We recognize that "ergodic" is itself too esoteric and thus not likely to be widely accepted. Perhaps the most positive term for what we value is just "information" itself!
    Our first categorical imperative is then "act in such a way as to create, maintain, and preserve information as much as possible against destructive entropic processes."

    Our second ethical imperative is "share knowledge/information to the maximum extent." Like love, our own information is not diminished when we share it with others.

    Our third moral imperative is "educate (share the knowledge of what is right) rather than punish." Knowledge is virtue. Punishment wastes human capital and provokes revenge.

  • A scientific model for free will and creativity, informed by the complementary roles of microscopic randomness and adequate macroscopic determinism in a temporal sequence that generates new information. (A ten-minute animated tutorial on this two-stage solution to the free will problem is available on the site.)
    Briefly, we separate "free" and "will" in a two-stage process - first the free generation of alternative possibilities for action (which creates new information), then an adequately determined decision by the will. We call this two-stage view our Cogito model and trace the idea of a two-stage model in the work of two dozen thinkers back to William James in 1884.

    This model is a synthesis of adequate determinism and limited indeterminism, a coherent and complete compatibilism that reconciles
    free will with both determinism and indeterminism.

    David Hume thought he had reconciled freedom with determinism. We reconcile free will with indeterminism and an "adequate" determinism.

    Because it makes free will compatible both with a form of determinism (really determination) and with an indeterminism that is limited and controlled by the mind, the leading libertarian philosopher Robert Kane suggested we call this model "Comprehensive Compatibilism."

    The problem of free will cannot be solved by logic, language, or even by physics. Man is not a machine and the mind is not a computer.
    Free will is a property of a biophysical information processing system.
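
The temporal sequence in the Cogito model can be sketched as a toy program. This is only an illustration of the two-stage logic under stated assumptions - the function names and the value table are hypothetical, and a real agent's evaluation would be vastly richer - not the model itself.

```python
import random

def two_stage_decision(options, evaluate, n_alternatives=3, seed=None):
    """Toy sketch of a two-stage decision:
    stage 1 freely (randomly) generates alternative possibilities;
    stage 2 makes an adequately determined choice among them."""
    rng = random.Random(seed)
    # Stage 1: indeterministic generation of alternative possibilities.
    # This is where new information enters the process.
    alternatives = rng.sample(options, k=min(n_alternatives, len(options)))
    # Stage 2: an adequately determined evaluation by the "will" --
    # given the same alternatives and values, the choice is reproducible.
    return max(alternatives, key=evaluate)

# Hypothetical actions scored by a fixed (determined) value function.
actions = ["help", "wait", "flee", "explore", "share"]
value = {"help": 5, "wait": 1, "flee": 0, "explore": 3, "share": 4}
choice = two_stage_decision(actions, value.get, seed=42)
```

Note the division of labor: chance proposes, but only the determined second stage disposes, so randomness never directly causes the action.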

All three ideas depend on understanding modern cosmology, physics, biology, and neuroscience, but especially the intimate connection between quantum mechanics and the second law of thermodynamics that allows for the creation of new information structures.

All three are based on the theory of information, which alone can establish the existential status of ideas, not just the ideas of knowledge, value, and freedom, but other-worldly speculations in natural religion like God and immortality.

All three have been anticipated by earlier thinkers, but can now be defended on strong empirical grounds. Our goal is less to innovate than to reach the best possible consensus among philosophers living and dead, an intersubjective agreement between philosophers that is the surest sign of a knowledge advance.

This Information Philosopher website aims to be an open resource for the best thinking of philosophers and scientists on these three key ideas and a number of lesser ideas that remain challenging problems in philosophy - on which information philosophy can shed some light.

Among these are the mind-body problem (the mind can be seen as the realm of information in its free thoughts, the body an adequately determined biological system creating and maintaining information); the common sense intuition of a cosmic creative process often anthropomorphized as a God or divine Providence; the problem of evil (chaotic entropic forces are the devil incarnate); and the "hard problem" of consciousness (agents responding to their environment, and originating new causal chains, based on information processing).

Philosophy is the love of knowledge or wisdom. Information philosophy (I-Phi or ΙΦ) qualifies and quantifies knowledge as meaningful actionable information. Information philosophy reifies information as an immaterial entity that has causal power over the material world!

What is information that merits its use as the foundation of a new method of inquiry?

Abstract information is neither matter nor energy, yet it needs matter for its concrete embodiment and available usable energy for its communication. Information is the modern spirit, the ghost in the machine. It is the stuff of thought, the immaterial substance of philosophy.

Information is a powerful diagnostic tool. It is a better abstract basis for philosophy, and for science as well, especially physics, biology, and neuroscience. It is capable of answering questions about metaphysics (the ontology of things themselves), epistemology (the existential status of ideas and how we know them), and idealism itself.

Information philosophy is now more than the solution to the three fundamental problems we identified in the 1960s and '70s. I-Phi is a new philosophical method, capable of solving multiple problems in both philosophy and physics. It needs young practitioners who will investigate the problems they are now tackling with this new methodology.

Note that, just as the philosophy of language is not linguistic philosophy, information philosophy is not the philosophy of information, which is mostly about computers and cognitive science, especially the computational theory of mind.

Philosophers like Ludwig Wittgenstein labeled many of our problems “philosophical puzzles.” Bertrand Russell called them “pseudo-problems.” Analytic language philosophers thought many of these problems could be “dis-solved,” revealing them to be conceptual errors caused by the misuse of language.

Information philosophy takes us past logical puzzles and language games without diminishing philosophy or replacing it with science.

Russell insisted that

“questions which are already capable of definite answers are placed in the sciences, while those only to which, at present, no definite answer can be given, remain to form the residue which is called philosophy.”
Information philosophy aims to show that problems in philosophy should not be reduced to “Russell’s Residue.”

The language philosophers of the twentieth century thought that they could solve (or at least dis-solve) the classical problems of philosophy. They did not succeed. Information philosophy, by comparison, has now cast a great deal of light on some of those problems. We need more information philosophers to join us and make further progress.


To recap, when information is stored in any structure, two fundamental physical processes occur. The first is a "collapse" of a quantum mechanical wave function, reducing multiple possibilities to a single actuality. The second is a local decrease in entropy corresponding to the increase in information. Entropy greater than this decrease must be transferred away from the new information structure to satisfy the second law of thermodynamics.
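
The quantitative connection can be made concrete with the standard Shannon and Boltzmann formulas. This is a numerical sketch, not a derivation from the text: it assumes N equally probable possibilities collapsing to one actuality, which yields log2 N bits of information and a matching local entropy decrease of k_B ln N.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, joules per kelvin

def bits_gained(n):
    """Shannon information gained when n equiprobable possibilities
    collapse to a single actuality."""
    return math.log2(n)

def local_entropy_decrease(n):
    """Matching Boltzmann entropy decrease (J/K). At least this much
    entropy must be carried away from the new information structure
    to satisfy the second law of thermodynamics."""
    return K_B * math.log(n)

# Collapsing 8 equiprobable possibilities to one actuality gains
# 3 bits of information, with k_B * ln 8 of entropy to export.
print(bits_gained(8), local_entropy_decrease(8))
```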

These quantum level processes are susceptible to noise. Information stored may have errors. When information is retrieved, it is again susceptible to noise. This may garble the information content. In information science, noise is generally the enemy of information. But some noise is the friend of freedom, since it is the source of novelty, of creativity and invention, and of variation in the biological gene pool.

Biological systems have maintained and increased their invariant information content over billions of generations, coming as close to immortality as living things can. Philosophers and scientists have increased our knowledge of the external world, despite logical, mathematical, and physical uncertainty. They have created and externalized information (knowledge) that can in principle become immortal. Both life and mind create information in the face of noise. Both do it with sophisticated error detection and correction schemes. The scheme we use to correct human knowledge is science, a two-stage combination of freely invented theories and adequately determined experiments. Information philosophy follows that example.
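
The error-correction idea can be illustrated with the simplest possible scheme, a triple-repetition code. This is a minimal sketch: the schemes actually used by cells and by engineered systems are far more sophisticated, but the principle - redundancy lets stored information survive noise - is the same.

```python
from collections import Counter

def encode(bits):
    """Store each bit three times (redundancy against noise)."""
    return [b for b in bits for _ in range(3)]

def decode(stored):
    """Majority vote over each triple corrects any single
    bit-flip per triple introduced by noise."""
    triples = [stored[i:i + 3] for i in range(0, len(stored), 3)]
    return [Counter(t).most_common(1)[0][0] for t in triples]

message = [1, 0, 1, 1]
stored = encode(message)
stored[4] ^= 1                     # noise flips one stored bit
assert decode(stored) == message   # the flip is corrected
```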


If you have read this far, you already know that the Information Philosopher website itself is an exercise in information sharing. It has ten parts, each with multiple chapters, that include nearly 3,000 web pages.

Teacher and Scholar links display additional material and reveal hidden footnotes on some pages. The footnotes themselves are often in the Scholar section.

Our goal is for the website to contain all the great philosophical discussions of the three original problem areas we identified in the 1970's - COGITO (freedom), ERGO (value), and SUM (knowledge) - plus potential solutions for several classic problems in philosophy and physics, many of which had been designated "pseudo-problems" or relegated to "metaphysics."

We have now shown that information philosophy is a powerful diagnostic tool for addressing metaphysical problems. See The Metaphysicist.

In the left-hand column of all I-Phi pages are links to over five hundred philosophers and scientists who have made contributions to these great problems. Each thinker's I-Phi page may include his or her original contributions, with examples of the thinker's thought, usually in the original words rather than in paraphrase, and where possible in the original language.

All original content on Information Philosopher is available for your use, without requesting permission, under a Creative Commons Attribution (CC BY) license.

Copyrights for all excerpted and quoted works remain with their authors and publishers.



For Teachers
A web page may contain two extra levels of material. The Normal page contains material for newcomers and students of information philosophy. Two hidden levels contain material for teachers (e.g., secondary sources) and for scholars (e.g., footnotes and original-language quotations).
Teacher materials on a page will typically include references to secondary sources and more extended explanations of the concepts and arguments. Secondary sources will include books, articles, and online resources. Extended explanations should be more suitable for teaching others about the core philosophical ideas, as seen from an information perspective.


For Scholars
Scholarly materials will generally include more primary sources, more in-depth technical and scientific discussions where appropriate, original language versions of quotations, and references to all sources.

Footnotes for a page appear in the Scholar materials. The footnote indicators themselves are only visible in Scholar mode.
