The Freedom section of Information Philosopher is now a book.
What is information? How is it created? Why is it a better tool for examining philosophical problems than traditional logic or linguistic analysis? Has information philosophy actually solved any problems?
What is information?
The simple definition of information is the act of informing - the communication of knowledge from a sender to a receiver that informs (literally shapes) the receiver.
A message that is certain to tell you something you already know contains no new information.
If everything that happens was certain to happen, as determinist philosophers claim, no new information would ever enter the universe. Information would be a universal constant. There would be "nothing new under the sun." Every past and future event can in principle be known by a super-intelligence with access to such a fixed totality of information (Laplace's Demon).
The total amount of mass and energy in the universe is a constant. A fundamental law of nature is the conservation of mass and energy.
But information is neither matter nor energy, though it needs matter to be embodied and energy to be communicated. Information can be created and destroyed. It is the modern spirit, the ghost in the machine, the mind in the body. It is the soul, and when we die, it is our information that perishes. The matter remains. Information is a potential objective value, the ultimate sine qua non.
How is information created?
Ex nihilo, nihil fit, said the ancients, Nothing comes from nothing. But information is no (material) thing. Information is physical, but it is not material. Information is a property of material. It is the form that matter can take. We can thus create something (immaterial) from nothing!
But we shall find that it takes a special kind of energy (free or available energy, with negative entropy) to do so.
Information is not a universal constant. Matter and energy are conserved quantities in physics, but information is not conserved. We know that information is being created because the universe began some thirteen billion years ago in a state of minimal information. The "Big Bang" was formless radiation, pure energy, no material particles. How matter formed into information structures like the galaxies, stars, and planets is the beginning of a story that will end with understanding how human minds emerged to understand our place in the astrophysical universe.
We identify three fundamental processes of information creation - the purely material, the biological, and the mental. The first was the "order out of chaos" when matter formed from radiation and the expansion of the early universe led to the gravitational attraction of randomly distributed matter into highly organized galaxies, stars, and planets. The expansion - the increased space between material objects - drives the universe away from thermodynamic equilibrium (maximum entropy) and creates negative entropy, a quantitative measure of the order that is the basis for all information.
A second kind of information creation was when the first molecule on earth replicated itself and went on to duplicate its information exponentially. Accidental errors in the duplication provided variations in reproductive success. Most important, besides creating information structures, biological systems are also information processors. Living things use information to guide their actions.
The third process of information creation, and the most important to philosophy, is human creativity. Almost every philosopher since philosophy began has considered the mind as something distinct from the body. We can now explain that distinction. The mind is the immaterial information in the brain. The brain, part of the material body, is a biological information processor. As some philosophers have speculated, the mind is software in the brain hardware.
The most important information created in a mind is a recording of an individual's experiences (sensations). Recordings are played back (automatically, and perhaps mostly unconsciously) as a guide for evaluating future actions (volitions) in similar situations. The particular past experiences reproduced are those stored in parts of the brain near elements of the current experience (the association of ideas). Just as neurons that fire together wire together, neurons that have been wired together will later fire together.
Sensations are recorded as the mental effects of physical causes.
Sensations are stored as retrievable information in the mind of an individual self. Recordings include not only the five afferent senses but also the internal emotions - feelings of pleasure, pain, hopes, and fears - that accompany an experience. They constitute "what it's like" for a particular being to have an experience.
Volitions are the mental causes of physical effects. Volitions begin with 1) the reproduction of past experiences that are similar to the current experience. These become thoughts about possible actions and the (partly random) generation of other alternative possibilities for action. They continue with 2) the evaluation of those freely generated thoughts followed by a willful selection (sometimes habitual) of one of those actions.
Volitions are followed by 3) new sensations coming back to the mind indicating that the self has caused the action to happen (or not). This feedback is recorded as further retrievable information, reinforcing the knowledge stored in the mind that the individual self can cause this kind of action (or sometimes not).
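The three-step cycle just described (reproduce similar past experiences, evaluate and select, record feedback) can be sketched as a toy program. This is a hypothetical illustration; the class, the feature-overlap similarity test, and the outcome scores are inventions for the example, not claims about neural machinery:

```python
class ExperienceRecorder:
    """Toy sketch of the record / reproduce / evaluate / feedback cycle."""

    def __init__(self):
        self.experiences = []  # (features, action, outcome) triples

    def record(self, features, action, outcome):
        # Sensations and their outcomes are stored as retrievable information.
        self.experiences.append((frozenset(features), action, outcome))

    def volition(self, current_features, novel_actions=()):
        current = frozenset(current_features)
        # 1) Reproduce past experiences similar to the current one
        #    (here, "similar" = sharing at least one feature).
        similar = [(f, a, o) for f, a, o in self.experiences if f & current]
        candidates = {a: o for f, a, o in similar}
        # ...plus alternative possibilities for action (generated externally here).
        for a in novel_actions:
            candidates.setdefault(a, 0.0)
        if not candidates:
            return None
        # 2) Evaluate the generated options and select the best-scoring one.
        return max(candidates, key=candidates.get)

err = ExperienceRecorder()
err.record({"hot", "stove"}, "withdraw hand", outcome=+1.0)
err.record({"hot", "stove"}, "touch", outcome=-1.0)
choice = err.volition({"hot", "kettle"}, novel_actions=["wait"])
# 3) Feedback: record that the self caused this action to happen.
err.record({"hot", "kettle"}, choice, outcome=+1.0)
```

The selection step is deliberately habitual here (highest past outcome wins); a richer sketch would add the partly random generation of alternatives the text describes.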
Why is information better than logic and language for solving philosophical problems?
The theory of communication of information is the foundation of our "information age." To understand how we know things is to understand how knowledge represents the material world of embodied "information structures" in the mental world of immaterial ideas.
All knowledge starts with the recording of experiences. The experiences of thinking, perceiving, knowing, feeling, desiring, deciding, and acting may be bracketed by philosophers as "mental" phenomena, but they are no less real than other "physical" phenomena. They are themselves physical phenomena. They are just not material things.
All science begins with information gathered from experimental observations, which are mental phenomena. So all knowledge of the physical world rests on the mental. All scientific knowledge is shared information and as such science is immaterial and mental, some might say fundamental. Recall Descartes' argument that the experience of thinking is that which for him is the most certain.
The analysis of language, particularly the analysis of philosophical concepts, which dominated philosophy in the twentieth century, has failed to solve the most ancient philosophical problems. At best, it claims to "dis-solve" some of them as conceptual puzzles. The "problem of knowledge" itself, traditionally framed as "justified true belief," is recast by information philosophy as the degree of isomorphism between the information in the physical world and the information in our minds. Psychology can be defined as the study of this isomorphism.
We shall see how information processes in the natural world use arbitrary symbols (e.g., nucleotide sequences) to refer to something, to communicate messages about it, and to give the symbol meaning in the form of instructions for another process to do something (e.g., create a protein). These examples provide support for both theories of meaning as reference and meaning as use.
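The genetic code itself is such a symbol system: arbitrary nucleotide triplets (codons) whose "meaning" is an instruction to add a particular amino acid. A minimal sketch using a few real codon assignments from the standard genetic code:

```python
# A small fragment of the standard genetic code: arbitrary triplet symbols
# (codons) whose "meaning" is an instruction to add one amino acid.
CODON_TABLE = {
    "AUG": "Met",  # start codon
    "UUU": "Phe",
    "GGC": "Gly",
    "GAA": "Glu",
    "UAA": None,   # stop codon: "end of message"
}

def translate(mrna):
    """Read an mRNA string three letters at a time, building a peptide."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        amino = CODON_TABLE[mrna[i:i + 3]]
        if amino is None:  # stop codon terminates translation
            break
        peptide.append(amino)
    return peptide

print(translate("AUGUUUGGCUAA"))  # ['Met', 'Phe', 'Gly']
```

The same triplet symbols are both reference (they stand for amino acids) and use (they are instructions to the ribosome), which is the dual sense of meaning noted above.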
Note that just as language philosophy is not the philosophy of language, so information philosophy is not the philosophy of information. It is rather the use of information as a tool to study philosophical problems, some of which are today yielding tentative solutions. It is time for philosophy to move beyond logical puzzles and language games.
What problems has information philosophy solved?
Why has philosophy made so little progress? Is it because philosophers prefer problems, while scientists seek solutions? Must a philosophical problem solved become science and leave philosophy? The information philosopher thinks not.
But in order to remain philosophy, interested philosophers must themselves examine the proposed information-based solutions and consider them as part of the critical philosophical dialogue.
The full story of cosmic, biological, and mental information creation involves learning some basic physics, particularly quantum mechanics and thermodynamics, along with some information theory. The information philosopher website provides animated visualizations of the most basic concepts that you will need to become an information philosopher.
When you are ready to consider them, among the proposed solutions are:
It turns out that the methodology of information philosophy can be productively applied to some outstanding problems in physics. Philosophers of science might take an interest in the proposed information-based solutions to these problems.
Our fundamental philosophical question is cosmological and ultimately metaphysical.
What are the processes that create emergent information structures in the universe?
Given the second law of thermodynamics, which says that any system will over time approach a thermodynamic equilibrium of maximum disorder or entropy, in which all information is lost, and given the best current model for the origin of the universe, which says everything began in a state of thermodynamic equilibrium some 13.75 billion years ago, how can it be that living beings are creating and communicating vast amounts of new information every day?
Why are we not still in that original state of equilibrium?
Broadly speaking, there are four major phenomena or processes that can reduce the entropy locally, while of course increasing it globally to satisfy the second law of thermodynamics. Three of these do it "blindly"; the fourth does it with a built-in "purpose," or "telos."
Universal Gravitation
Quantum Cooperative Phenomena (e.g., crystallization, the formation of atoms and molecules)
"Dissipative" Chaos (Non-linear Dynamics)
Life
None of these processes can work unless they have a way to get rid of the positive entropy (disorder) and leave behind a pocket of negative entropy (order or information). The positive entropy is either conducted, convected, or radiated away as waste matter and energy, as heat, or as pure radiation. At the quantum level, it is always the result of interactions between matter and radiation (photons). Whenever photons interact with material particles, the outcomes are inherently unpredictable. As Albert Einstein discovered ten years before the founding of quantum mechanics, these interactions involve irreducible ontological chance.
Negative entropy is an abstract thermodynamic concept that describes energy with the ability to do work, to make something happen. This kind of energy is often called free energy or available energy. In a maximally disordered state (called thermodynamic equilibrium) there can be matter in motion, the motion we call heat. But the average properties - density, pressure, temperature - are the same everywhere. Equilibrium is formless. Departures from equilibrium are when the physical situation shows differences from place to place. These differences are information.
The second law of thermodynamics is then simply that isolated systems will eliminate differences from place to place until the various properties are uniform. Natural processes spontaneously destroy information. Consider the classic case of what happens when we open a perfume bottle.
The perfume molecules dissipate until they are uniformly distributed. Statistical physics mistakenly claims that if the velocities of all the particles were reversed at an instant, the molecules would return to the bottle. It assumes that the complete path information needed to return to the bottle is preserved. But information is not conserved. It can be created and it can be destroyed. We shall show why such microscopic reversibility is extremely unlikely.
Ludwig Boltzmann derived a mathematical formula for entropy as a summation over the probabilities of finding a system in each of its possible states. When every state is equally probable, entropy is at a maximum, and no differences (information) are visible. The formula for negative entropy is just the maximum possible entropy minus the actual entropy (when there are differences from place to place).
Claude Shannon derived the mathematical formula for information and found it to be identical to the formula for negative entropy - a summation of the probabilities of all the possible messages that can be communicated.
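These twin formulas can be checked numerically. A minimal Python sketch, using base-2 logarithms (bits) and treating negative entropy, as above, as maximum possible entropy minus actual entropy:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)), in bits; 0*log(0) is taken as 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

def negentropy(probs):
    """Maximum possible entropy (uniform distribution) minus actual entropy."""
    h_max = log2(len(probs))
    return h_max - shannon_entropy(probs)

uniform = [0.25] * 4                 # equilibrium: every state equally probable
ordered = [0.97, 0.01, 0.01, 0.01]   # differences from place to place

print(shannon_entropy(uniform))   # maximum entropy: no visible information
print(negentropy(uniform))        # zero negative entropy at equilibrium
print(negentropy(ordered) > 0)    # order carries positive information
```

Boltzmann's thermodynamic entropy uses natural logarithms and the constant k rather than bits, but the functional form of the summation is the same, which is the identity the text refers to.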
Because "negative" entropy (order or information) is such a positive quantity, we chose many years ago to give it a new name - "Ergo," and to call the four phenomena or processes that create it "ergodic," for reasons that will become clear. But today, the positive name "information" is all that we need to do philosophical work.
How exactly has the universe escaped from the total disorder of thermodynamic equilibrium and produced a world full of information?
It begins with the expansion of the universe. If the universe had not expanded, it would have remained in the original state of thermodynamic equilibrium. We would not be here.
To visualize the departure from equilibrium that made us possible, remember that equilibrium is when particles are distributed evenly in all possible locations in space, and with their velocities distributed by a normal law - the Maxwell-Boltzmann velocity distribution. (The combination of position space and velocity or momentum space is called phase space). When we open the perfume bottle, the molecules now have a much larger phase space to distribute into. There are a much larger number of phase space "cells" in which molecules could be located. It of course takes them time to spread out and come to a new equilibrium state (the Boltzmann "relaxation time.")
When the universe expands, say grows to ten times its volume, it is just like the perfume bottle opening. The matter particles must redistribute themselves to get back to equilibrium. But suppose the expansion is much faster than the relaxation time allows. The universe is then out of equilibrium, and it will never get back!
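The entropy created by such an expansion can be estimated with the standard ideal-gas result ΔS = N k ln(V₂/V₁); the molecule count below is an arbitrary illustrative number:

```python
from math import log

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def expansion_entropy(n_particles, volume_ratio):
    """Ideal-gas entropy increase when the available volume grows:
    each particle gains ln(volume_ratio) worth of phase-space 'cells'."""
    return n_particles * K_B * log(volume_ratio)

# Perfume-bottle example: 1e20 molecules spread into 10x the volume.
delta_s = expansion_entropy(1e20, 10)
print(delta_s)  # ~3.18e-3 J/K of new entropy once equilibrium is reached
```

If the volume keeps growing faster than the molecules can redistribute, the actual entropy lags behind this maximum, and the gap is the negative entropy the text describes.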
In the earliest moments of the universe, material particles were not yet stable. Pure radiation energy was in equilibrium at extraordinarily high temperatures. When material particles appeared, they were blasted back into radiation by photon collisions. As the universe expanded, the temperature cooled, the space per photon increased, and the mean free time between photon collisions increased, giving particles a better chance to survive. The expansion red-shifted the photons, and the average energy per photon decreased, eventually reducing the number of high-energy photons that could destroy the matter. Quarks and electrons became more common. The mean free path of photons remained very short, however, because they were still being scattered by collisions with electrons.
As temperatures continued to decline, quarks combined into nuclear particles, protons and neutrons. When the temperature declined further, to a few thousand degrees, about 400,000 years after the "Big Bang," the electrons and protons combined to make hydrogen atoms.
At this time, a major event occurred that we can still see today, the farthest and earliest event visible. When the electrons combined into atoms, the electrons could no longer scatter the photons as easily. The universe became transparent for the photons. Some of those photons are still arriving at the earth today. They are now red-shifted and cooled down to the cosmic microwave background radiation. While this radiation is almost perfectly uniform, it shows very small fluctuations that may be caused by random differences in the local density of the original radiation or even by random quantum fluctuations.
These fluctuations mean that there were slight differences in density of the newly formed hydrogen gas clouds. The force of universal gravitation then worked to pull relatively formless matter into spherically symmetric stars and planets, the original order out of chaos (although this phrase is now most associated with work on deterministic chaos theory and complexity theory, as we shall see).
How information creation and negative entropy flows appear to violate the second law of thermodynamics
In our open and rapidly expanding universe, the maximum possible entropy (if the particles were "relaxed" into a uniform distribution among the new phase-space cells) is increasing faster than the actual entropy. The difference between maximum possible entropy and the current entropy is called negative entropy. There is an intimate connection between the physical quantity negative entropy and abstract immaterial information, first established by Leo Szilard in 1929.
As pointed out by Harvard cosmologist David Layzer, the Arrow of Time points not only to increasing disorder but also to increasing information.
Two of our "ergodic" phenomena - gravity and quantum cooperative phenomena - pull matter together that was previously separated. Galaxies, stars, and planets form out of inchoate clouds of dust and gas. Gravity binds the matter together. Subatomic particles combine to form atoms. Atoms combine to form molecules. They are held together by quantum mechanics. In all these cases, a new visible information structure appears.
In order for these structures to stay together, the motion (kinetic) energy of their parts must be radiated away. This is why the stars shine. When atoms join to become molecules, they give off photons. The new structure is now in a (negative) bound energy state. It is the radiation that carries away the positive entropy (disorder) needed to balance the new order (information) in the visible structure.
In the cases of chaotic dissipative structures and life, the ergodic phenomena are more complex, but the result is similar, the emergence of visible information. (More commonly it is simply the maintenance of high-information, low-entropy structures.) These cases appear in far-from-equilibrium situations where there is a flow of matter and energy with negative entropy through the information structure. The flow comes in with low entropy but leaves with high entropy. Matter and energy are conserved in the flow, but information in the structure can increase (information is not a conserved quantity).
Information is neither matter nor energy, though it uses matter when it is embodied and energy when it is communicated. Information is immaterial.
This vision of life as a visible form through which matter and energy flow was first seen by Ludwig von Bertalanffy in 1939, though it was made more famous by Erwin Schrödinger's landmark essay What Is Life? in 1944, where he claimed that "life feeds on negative entropy."
Both Bertalanffy and Schrödinger knew that the source of negative entropy was our Sun. Neither knew that the ultimate cosmological source of negative entropy is the expansion of the universe, which allowed ergodic gravitational forces to form the Sun. Note that the positive entropy leaving the Sun becomes diluted as it expands, creating a difference between its color temperature and its energy-content temperature. This difference is information (negative entropy) that planet Earth uses to generate and maintain biological life.
Note that the 273K (the average earth temperature) photons are dissipated into the dark night sky, on their way to the cosmic microwave background. The Sun-Earth-night sky is a heat engine, with a hot energy source and cold energy sink, that converts the temperature difference not into mechanical energy (work) but into biological energy (life).
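The heat-engine picture can be quantified with the Carnot bound, 1 - T_cold/T_hot, using the round temperatures quoted in the text (these are illustrative figures, not precise astrophysical values):

```python
def carnot_efficiency(t_hot, t_cold):
    """Upper bound on the fraction of a heat flow between two reservoirs
    that can be converted to work (or, loosely, to low-entropy structure)."""
    return 1.0 - t_cold / t_hot

# The text's round numbers: ~5000 K solar photons in, ~273 K photons out.
eff = carnot_efficiency(5000.0, 273.0)
print(round(eff, 3))  # most of the incoming energy's "quality" is available
```

The large temperature gap between source and sink is exactly the "difference" the text identifies with information: a reservoir pair at equal temperatures (efficiency zero) could drive nothing at all.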
When information is embodied in a physical structure, two physical processes must occur.
The first process is the collapse of a quantum-mechanical wave function into one of the possible states in a superposition of states, which happens in any measurement process. A measurement produces one or more bits of information. Such quantum events involve irreducible indeterminacy and chance, but less often noted is the fact that quantum physics is directly responsible for the extraordinary temporal stability and adequate determinism of most information structures.
We can call the transfer of positive entropy, which stabilizes the new information from Process 1, Process 1b.
The second process is a local decrease in the entropy (which appears to violate the second law of thermodynamics) corresponding to the increase in information. Entropy greater than the information increase must be transferred away from the new information, ultimately to the night sky and the cosmic background, to satisfy the second law.
Given this new stable information, to the extent that the resulting quantum system can be approximately isolated, the system will deterministically evolve according to von Neumann's Process 2, the unitary time evolution described by the Schrödinger equation.
The first two physical processes (1 and 1b) are parts of the information solution to the "problem of measurement," to which must be added the role of the "observer."
The discovery and elucidation of the first two as steps in the cosmic creation process casts light on some classical problems in philosophy and physics, since it is the same two-step process that creates new biological species and explains the freedom and creativity of the human mind.
The cosmic creation process generates the conditions without which there could be nothing of value in the universe, nothing to be known, and no one to do the knowing. Information itself is the ultimate sine qua non.
The first is the order out of chaos, when the randomly distributed matter in the early universe first gets organized into information structures.
This was not possible before the first atoms formed, about 400,000 years after the Big Bang. Information structures like the stars and galaxies did not exist until several hundred million years later. As we saw, gravitation was the principal driver creating information structures.
Nobel prize winner Ilya Prigogine discovered another ergodic process that he described as the "self-organization" of "dissipative structures." He popularized the slogan "order out of chaos" in an important book. Unfortunately, the "self" in self-organization led to some unrealizable hopes in cognitive psychology. There is no self, in the sense of a person or agent, in these physical phenomena.
Both gravitation and Prigogine's dissipative systems produce a purely physical/material kind of order. The resulting structures contain information. There is a "steady state" flow of information-rich matter and energy through them. But they do not process information. They have no purpose, no "telos."
Order out of chaos can explain how emergent structures exert downward causation on their atomic and molecular components. But this is a gross kind of downward causal control. Explaining life and mind as "complex adaptive systems" has not been successful. We need to go beyond "chaos and complexity" theories to teleonomic theories.
The second is the order out of order, when the material information structures form self-replicating biological information structures. These are information-processing systems.
In his famous essay, "What Is Life?," Erwin Schrödinger noted that life "feeds on negative entropy" (or information). He called this "order out of order."
This kind of biological processing of information first emerged about 3.5 billion years ago on the earth. It continues today on multiple emergent biological levels, e.g., single-cells, multi-cellular systems, organs, etc., each level creating new information structures and information processing systems not reducible to (caused by) lower levels and exerting downward causation on the lower levels.
And this downward causal control is extremely fine, managing the motions and arrangements of individual atoms and molecules.
Biological systems are cognitive systems, using internal "subjective" knowledge to recognize and interact with their "objective" external environment, communicating meaningful messages to their internal components and to other individuals of their species with a language of arbitrary symbols, taking actions to maintain themselves and to expand their populations by learning from experience.
With the emergence of life, "purpose" also entered the universe. It is not the pre-existent "teleology" of many idealistic philosophies (the idea of "essence" before "existence"), but it is the "entelechy" of Aristotle, who saw that living things have within them a purpose, an end, a "telos." To distinguish this evolved telos in living systems from teleology, modern biologists use the term "teleonomy."
The third is pure information out of order, when organisms with minds generate, store (in the brain), replicate, utilize, and then externalize some non-biological information, communicating it to other minds and storing it in the environment. Communication can be by hereditary genetic transmission, or by an advanced organism capable of learning and then teaching its contemporaries directly, by signaling or speaking, or indirectly, by writing and publishing the knowledge for future generations.
This kind of information can be highly abstract mind-stuff, pure Platonic ideas, the stock in trade of philosophers. It is neither matter nor energy (though embodied in the material brain), a kind of pure spirit or ghost in the machine. It is a candidate for the immaterial dualist "substance" of René Descartes, though it is probably better thought of as a "property dualism," since information is an immaterial property of all matter.
The information stored in the mind is not only abstract ideas. It contains a recording of the experiences of the individual. In principle every experience may be recorded, though not all may be reproducible/recallable.
The negative entropy (order, or potential information) generated by the universe expansion is a tiny amount compared to the increase in positive entropy (disorder). Sadly, this is always the case when we try to get "order out of order," as can be seen by studying entropy flows at different levels of emergent phenomena.
In any process, the positive entropy increase is always at least equal to, and generally orders of magnitude larger than, the negative entropy in any created information structures, to satisfy the second law of thermodynamics. The positive entropy is named for Boltzmann, since it was his "H-Theorem" that proved entropy can only increase overall - the second law of thermodynamics. And negative entropy is called Shannon, since his theory of information communication has exactly the same mathematical formula as Boltzmann's famous principle:
S = k log W
where S is the entropy, k is Boltzmann's constant, and W is the number of microstates consistent with the given macrostate of the system (Boltzmann's "thermodynamic probability").
Information flows into Boltzmann and Shannon Entropy
Material particles are the first information structures to form in the universe. They are quarks, baryons, and atomic nuclei, which combine with electrons to form atoms and eventually molecules, when the temperature is low enough. These particles are attracted by the force of universal gravitation to form the gigantic information structures of the galaxies, stars, and planets.
Cosmological information flows
Microscopic quantum mechanical particles and huge self-gravitating systems are stable and have extremely long lifetimes, thanks in large part to quantum stability.
Stars are another source of radiation, after the original Big Bang cosmic source, which has cooled down to about 3 kelvin (3 K) and shines as the cosmic microwave background radiation.
Sun to Earth Entropy Flows
Our solar radiation has a high color temperature (5000 K) and a low energy-content temperature (273 K). It is out of equilibrium, and it is the source of all the information-generating negative entropy that drives biological evolution on the Earth. Note that the fraction of the light falling on Earth is less than a billionth of that which passes by and is lost in space.
A tiny fraction of the solar energy falling on the earth gets converted into the information structures of plants and animals. Most of it gets converted to heat and is radiated away as waste energy to the night sky.
Entropy Flows into Life
Every biological structure is a quantum mechanical structure. DNA has maintained its stable information structure over billions of years in the constant presence of chaos and noise.
Entropy Flows in a Human Being
The stable information content of a human being survives many changes in the material content of the body during a person's lifetime. Only with death does the mental information (spirit, soul) dissipate - unless it is saved somewhere.
The total mental information in a living human is orders of magnitude less than the information content and information processing rate of the body. But the information structures created by humans outside the body, in the form of external knowledge like this book, and the enormous collection of human artifacts, rival the total biological information content.
The Shannon Principle
In his development of the mathematical theory of the communication of information, Claude Shannon showed that there can be no new information in a message unless there are multiple possible messages. If only one message is possible, there is no information in that message.
We can simplify this to define the Shannon Principle. No new information can be created in the universe unless there are multiple possibilities, only one of which can become actual.
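A minimal sketch of the Shannon Principle in Python: a source with only one possible message carries zero bits, while a source with two equally likely messages carries one bit:

```python
from math import log2

def source_information(message_probs):
    """Shannon entropy of a message source, in bits per message."""
    return 0.0 - sum(p * log2(p) for p in message_probs if p > 0)

# One certain message: no new information in receiving it.
print(source_information([1.0]))       # 0.0 bits

# Two equally likely messages: receiving one resolves one bit.
print(source_information([0.5, 0.5]))  # 1.0 bit
```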
An alternative statement of the Shannon principle is that in a deterministic system, information is conserved, unchanging with time. Classical mechanics is a conservative system that conserves not only energy and momentum but also conserves the total information. Information is a "constant of the motion" in a determinist world.
Quantum mechanics, by contrast, is indeterministic. It involves irreducible ontological chance. An isolated quantum system is described by a wave function ψ which evolves according to the unitary time evolution of the linear Schrödinger equation,
iℏ d|ψ⟩/dt = H|ψ⟩.
But isolation is an ideal that can only be approximately realized. Because the Schrödinger equation is linear, a wave function |ψ⟩ can be a linear combination (a superposition) of another set of wave functions |φₙ⟩,

|ψ⟩ = ∑ₙ cₙ |φₙ⟩,

where the squared magnitudes |cₙ|² of the coefficients are the probabilities of finding the system in the possible state |φₙ⟩ as the result of an interaction with another quantum system,

|cₙ|² = |⟨φₙ|ψ⟩|².
Quantum mechanics introduces real possibilities, each with a calculable probability of becoming an actuality, as a consequence of one quantum system interacting (for example colliding) with another quantum system.
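The arithmetic of possibilities and probabilities can be verified numerically. A sketch with arbitrary illustrative amplitudes, showing that the squared magnitudes sum to one and survive unitary (phase) evolution unchanged:

```python
import cmath

# Arbitrary illustrative amplitudes c_n for a two-state superposition.
c = [complex(0.6, 0.0), complex(0.0, 0.8)]

# Born rule: the squared magnitudes |c_n|^2 are the probabilities.
probs = [abs(cn) ** 2 for cn in c]
print([round(p, 6) for p in probs])  # [0.36, 0.64]
print(round(sum(probs), 12))         # 1.0: the possibilities exhaust the outcomes

# A phase rotation exp(-iEt/hbar) (unitary evolution of an energy eigenstate)
# changes the amplitudes but leaves the probabilities untouched.
evolved = [cn * cmath.exp(-1j * 0.7) for cn in c]
assert all(abs(abs(a) ** 2 - p) < 1e-12 for a, p in zip(evolved, probs))
```

This is the contrast the text draws: the deterministic Process 2 (unitary evolution) conserves the probability information, while an actual interaction selects just one of the possibilities.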
It is quantum interactions that lead to new information in the universe - both new information structures and information processing systems. But that new information cannot subsist unless a compensating amount of entropy is transferred away from the new information.
And it is only in cases where information persists long enough for a human being to observe it that we can properly describe the observation as a "measurement" and the human being as an "observer." Following von Neumann's "process" terminology, we might complete his admittedly unsuccessful attempt at a theory of the measuring process with the anthropomorphic Process 3 - a conscious observer recording new information (knowledge) in a human mind.
In less than two decades of the mid-twentieth century, the word information was transformed from a synonym for knowledge into a mathematical, physical, and biological quantity that can be measured and studied scientifically.
In 1929, Leo Szilard connected an increase in thermodynamic (Boltzmann) entropy with any increase in information that results from a measurement, solving the problem of "Maxwell's Demon," a thought experiment suggested by James Clerk Maxwell, in which a local reduction in entropy is possible when an intelligent being interacts with a thermodynamic system.
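Szilard's result attaches a definite thermodynamic price to a bit: at least k ln 2 of entropy per bit acquired or erased, hence kT ln 2 of free energy at temperature T. A quick calculation (the temperature value is an arbitrary example):

```python
from math import log

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def min_energy_per_bit(temperature):
    """Szilard's bound: k*T*ln(2) joules of free energy per bit at temperature T."""
    return K_B * temperature * log(2)

print(min_energy_per_bit(300.0))  # ~2.87e-21 J per bit near room temperature
```

This is how the demon is exorcised: the entropy cost of the demon's one-bit measurements always at least balances the entropy reduction it achieves.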
In the early 1940s, digital computers that could run a stored program to manipulate stored data were invented by Alan Turing, Claude Shannon, John von Neumann, and others.
Then in the late 1940s, the problem of communicating digital data signals in the presence of noise was first explored by Shannon, who developed the modern mathematical theory of the communication of information. Norbert Wiener wrote in his 1948 book Cybernetics that "information is the negative of the quantity usually defined as entropy," and in 1949 Leon Brillouin coined the term "negentropy."
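Shannon's measure can be illustrated with a short sketch (the standard definition, applied to toy messages): the entropy of a message is zero when its symbols are certain and maximal when they are equally surprising.

```python
import math
from collections import Counter

# Shannon entropy of a message's symbol frequencies, in bits per symbol:
# H = sum_i p_i * log2(1 / p_i), the standard definition.
def shannon_entropy(message):
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A message certain to repeat one symbol tells you nothing new:
print(shannon_entropy("aaaa"))   # 0.0
# Two equally likely symbols carry one bit per symbol:
print(shannon_entropy("abab"))   # 1.0
```

This is the precise sense in which a message certain to tell you what you already know contains no information.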
Finally, in the early 1950s, inheritable characteristics were shown by Francis Crick, James Watson, and George Gamow to be transmitted from generation to generation in a digital code.
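The digital character of that code can be seen in a small sketch (a handful of standard codon assignments; the sequence translated is a toy example):

```python
# A glimpse of the digital code of heredity: triplets of DNA bases (codons)
# map to amino acids. Entries below are from the standard genetic code.
CODON_TABLE = {
    "ATG": "Met",   # methionine, also the "start" signal
    "TTT": "Phe",   # phenylalanine
    "AAA": "Lys",   # lysine
    "GGC": "Gly",   # glycine
    "TAA": "STOP",  # a stop codon
}

def translate(dna):
    """Translate a DNA coding sequence codon by codon until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE[dna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate("ATGTTTAAATAA"))   # ['Met', 'Phe', 'Lys']
```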
Information is Immaterial
Information is neither matter nor energy, but it needs matter for its embodiment and energy for its communication.
A living being is a form through which passes a flow of matter and energy (with low entropy). Genetic information is used to build the information-rich matter into an information-processing structure that contains a very large number of hierarchically organized information structures.
All biological systems are cognitive, using their internal information structure to guide their actions. Even some of the simplest organisms can learn from experience. The most primitive minds are experience recorders and reproducers.
In humans, the information-processing structures create new actionable information (knowledge) by consciously and unconsciously reworking the experiences stored in the mind.
Emergent higher levels exert downward causation on the contents of the lower levels, ultimately supporting mental causation and free will.
When ribosomes assemble some 574 amino acids into hemoglobin's four polypeptide chains (two α-globins and two β-globins), each globin traps an iron atom in a heme group at its center. This is downward causal control of the amino acids, the heme groups, and the iron atoms by the ribosome. The ribosome is an example of Erwin Schrödinger's emergent "order out of order," life "feeding on the negative entropy" of digested food.
Notice the absurdity of the idea that the random motions of the transfer RNA molecules, each holding a single amino acid, are carrying pre-determined information about where that amino acid belongs in the protein being built.
Determinism is an emergent property and an ideal philosophical concept, unrealizable except approximately in the kind of adequate determinism that we experience in the macroscopic world, where the determining information is part of the higher-level control system.
The total information in multi-cellular living beings can develop to be many orders of magnitude more than the information present in the original cell. The creation of this new information would be impossible for a deterministic universe, in which information is constant.
Immaterial information is perhaps as close as a physical or biological scientist can get to the idea of a soul or spirit that departs the body at death. When a living being dies, it is the maintenance of biological information that ceases. The matter remains.
Biological systems are different from purely physical systems primarily because they create, store, and communicate information. Living things store information in a memory of the past that they use to shape their future. Fundamental physical objects like atoms have no history.
And when human beings export some of their personal information to make it a part of human culture, that information moves closer to becoming immortal.
Human beings differ from other animals in their extraordinary ability to communicate information and store it in external artifacts. In the last decade the amount of external information per person may have grown to exceed an individual's purely biological information.
Since the 1950's, the science of human behavior has changed dramatically from a "black box" model of a mind that started out as a "blank slate" conditioned by environmental stimuli. Today's mind model contains many "functions" implemented with stored programs, all of them information structures in the brain. The new "computational model" of cognitive science likens the brain to a computer, with some programs and data inherited and others developed as appropriate reactions to experience.
The Experience Recorder and Reproducer
The brain should be regarded less as an algorithmic computer with one or more central processing units than as a multi-channel and multi-track experience recorder and reproducer with an extremely high data rate. Information about an experience - the sights, sounds, smells, touch, and taste - is recorded along with the emotions - feelings of pleasure, pain, hopes, and fears - that accompany the experience. When confronted with similar experiences later, the brain can reproduce information about the original experience (an instant replay) that helps to guide current actions.
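A toy sketch may make the model concrete (all names and data structures here are hypothetical illustrations, not the author's implementation): experiences are recorded across channels along with an emotion, and a later cue that matches any channel reproduces the stored experience.

```python
# A toy "experience recorder and reproducer": multi-channel experiences are
# stored with their emotions; a later cue sharing any channel value replays them.
class ExperienceRecorder:
    def __init__(self):
        self.records = []   # each record: (channels dict, emotion tag)

    def record(self, channels, emotion):
        """Store a multi-channel experience (sight, sound, smell...) with its emotion."""
        self.records.append((channels, emotion))

    def reproduce(self, cue):
        """Replay every past experience that shares a channel value with the cue."""
        return [(ch, emo) for ch, emo in self.records
                if any(ch.get(k) == v for k, v in cue.items())]

err = ExperienceRecorder()
err.record({"smell": "smoke", "sight": "flame"}, emotion="fear")
err.record({"sound": "birdsong", "sight": "trees"}, emotion="calm")

# A whiff of smoke replays the fire experience, fear included:
replayed = err.reproduce({"smell": "smoke"})
# replayed == [({'smell': 'smoke', 'sight': 'flame'}, 'fear')]
```

The point of the sketch is the associative retrieval: a partial cue on one channel reproduces the whole recorded experience, emotions and all.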
Information is constant in a deterministic universe. There is "nothing new under the sun." The creation of new information is not possible without the random chance and uncertainty of quantum mechanics, plus the extraordinary temporal stability of quantum mechanical structures.
It is of the deepest philosophical significance that information is based on the mathematics of probability. If all outcomes were certain, there would be no "surprises" in the universe. Information would be conserved and a universal constant, as some mathematicians mistakenly believe. Information philosophy requires the ontological uncertainty and probabilistic outcomes of modern quantum physics to produce new information.
But at the same time, without the extraordinary stability of quantized information structures over cosmological time scales, life and the universe we know would not be possible. Quantum mechanics reveals the architecture of the universe to be discrete rather than continuous, to be digital rather than analog.
Moreover, the "correspondence principle" of quantum mechanics and the "law of large numbers" of statistics ensure that macroscopic objects can normally average out microscopic uncertainties and probabilities to provide the "adequate determinism" that shows up in all our "Laws of Nature."
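This averaging can be seen in a toy simulation (coin flips standing in for microscopic chance events; all numbers are illustrative): the fluctuation of the average shrinks like 1/√N as more events are combined.

```python
import random

# Sketch of "adequate determinism": individual events are random, but the
# average of n of them fluctuates like 1/sqrt(n), so macroscopic quantities
# look deterministic. (A toy model, not a physical simulation.)
random.seed(0)

def average_fluctuation(n, trials=200):
    """Typical deviation of the mean of n coin flips from its expected 0.5."""
    deviations = []
    for _ in range(trials):
        mean = sum(random.random() < 0.5 for _ in range(n)) / n
        deviations.append(abs(mean - 0.5))
    return sum(deviations) / trials

# The deviation shrinks roughly tenfold as n grows a hundredfold:
print(average_fluctuation(100))
print(average_fluctuation(10_000))
```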
Information philosophy explores some classical problems in philosophy with deeper and more fundamental insights than is possible with the logic and language approach of modern analytic philosophy.
By exploring the origins of structure in the universe, information philosophy transcends humanity and even life itself, though it is not a mystical metaphysical transcendence.
Information philosophy uncovers the providential creative process working in the universe to which we owe our existence, and therefore perhaps our reverence.
It locates the fundamental source of all values not in humanity ("man the measure"), not in bioethics ("life the ultimate good"), but in the origin and evolution of the cosmos.
Information philosophy is an idealistic philosophy, a process philosophy, and a systematic philosophy, the first in many decades. It provides important new insights into the Kantian transcendental problems of epistemology, ethics, freedom of the will, god, and immortality, as well as the mind-body problem, consciousness, and the problem of evil.
In physics, information philosophy provides new insights into the problem of measurement, the paradox of Schrödinger's Cat, the two paradoxes of microscopic reversibility and macroscopic recurrence that Josef Loschmidt and Ernst Zermelo used to criticize Ludwig Boltzmann's explanation of the entropy increase required by the second law of thermodynamics, and finally information provides a better understanding of the entanglement and nonlocality phenomena that are the basis for modern quantum cryptography and quantum computing.
Information Philosophers, like all who would make an advance in knowledge, stand on the shoulders of the giant philosophers and scientists of the past and present as we try to make modest advances on the great philosophical problems of knowledge, value, and freedom.
In the left-hand column of all pages are links to nearly three hundred philosophers and scientists who have made contributions to these great problems. Their web pages include the original contributions of each thinker, with examples of their thought, usually in their own words, and where possible in their original languages as well.
Traditional philosophy is a story about discovery of timeless truths, laws of nature, a block universe in which the future is a logical extension of the past, a primal moment of creation that starts a causal chain in which everything can be foreknown by an omniscient being. Traditional philosophy seeks knowledge in logical reasoning with clear and unchanging concepts.
Its guiding lights are thinkers like Parmenides, Plato, and Kant, who sought unity and identity, being and universals.
In traditional philosophy, the total amount of information in the conceptually closed universe is static, a physical constant of nature. The laws of nature allow no exceptions, they are perfectly causal. Everything that happens is said to have a physical cause. This is called "causal closure". Chance and change - in a deep philosophical sense - are said to be illusions.
Information philosophy, by contrast, is a story about invention, about novelty, about biological emergence and new beginnings unseen and unseeable beforehand, a past that is fixed but an ambiguous future that can be shaped by teleonomic changes in the present.
Its model thinkers are Heraclitus, Protagoras, Aristotle, and Hegel, for whom time, place, and particular situations mattered.
Information philosophy is built on probabilistic laws of nature. The fundamental challenge for information philosophy is to explain the emergence of stable information structures from primordial and ever-present chaos, to account for the phenomenal success of deterministic laws when the material substrate of the universe is irreducibly chaotic, noisy, and random, and to understand the concepts of truth, necessity, and certainty in a universe of chance, contingency, and indeterminacy.
Determinism and the exceptionless causal and deterministic laws of classical physics are the real illusions. Determinism is information-preserving. In an ideal deterministic Laplacian universe, the present state of the universe is implicitly contained in its earliest moments.
This ideal determinism does not exist. The "adequate determinism" behind the laws of nature emerged from the early years of the universe when there was only indeterministic chaos.
In a random noisy environment, how can anything be regular and appear determined? It is because the macroscopic consequences of the law of large numbers average out microscopic quantum fluctuations to provide us with a very adequate determinism.
Information Philosophy is an account of continuous information creation, a story about the origin and evolution of the universe, of life, and of intelligence from an original quantal chaos that is still present in the microcosmos. More than anything else, it is the creation and maintenance of stable information structures that distinguishes biology from physics and chemistry.
Living things maintain information in a memory of the past that they can use to shape the future. Some get it via heredity. Some learn it from experience. Others invent it!
Information Philosophy is a story about knowledge and ignorance, about good and evil, about freedom and determinism.
There is a great battle going on - between originary chaos and emergent cosmos. The struggle is between destructive chaotic processes that drive a microscopic underworld of random events and constructive cosmic processes that create information structures with extraordinary emergent properties, including adequately determined scientific laws - despite, and in many cases making use of, the microscopic chaos.
Created information structures range from galaxies, stars, and planets, to molecules, atoms, and subatomic particles. They are the structures of terrestrial life from viruses and bacteria to sentient and intelligent beings. And they are the constructed ideal world of thought, of intellect, of spirit, including the laws of nature, in which we humans play a role as co-creator.
Based on insights into these cosmic creation processes, the Information Philosopher proposes three primary ideas that are new approaches to perennial problems in philosophy. They are likely to change some well-established philosophical positions. Even more important, they may reconcile idealism and materialism and provide a new view of how humanity fits into the universe.
The three ideas are
An explanation or epistemological model of knowledge formation and communication. Knowledge and information are neither matter nor energy, but they require matter for expression and energy for communication. They seem to be metaphysical.
Briefly, we identify knowledge with actionable information in the brain-mind. We justify knowledge by behavioral studies that demonstrate the existence of information structures implementing functions in the brain. And we verify knowledge scientifically.
A basis for objective value beyond humanism and bioethics, grounded in the fundamental information creation processes behind the structure and evolution of the universe and the emergence of life.
Briefly, we find positive value (or good) in information structures. We see negative value (or evil) in disorder and entropy tearing down such structures. We call energy with low entropy "Ergo" and call anti-entropic processes "ergodic."
Our first categorical imperative is then "act in such a way as to create, maintain, and preserve information as much as possible against destructive entropic processes."
Our second ethical imperative is "share knowledge/information to the maximum extent." Like love, our own information is not diminished when we share it with others.
Our third moral imperative is "educate (share the knowledge of what is right) rather than punish." Knowledge is virtue. Punishment wastes human capital and provokes revenge.
A solution to the classical problem of free will.
Briefly, we separate "free" and "will" in a two-stage process - first the free generation of alternative possibilities for action, then an adequately determined decision by the will. We call this two-stage view our Cogito model and trace the idea of a two-stage model back through the work of a dozen thinkers to William James in 1884.
This model is a synthesis of adequate determinism and limited indeterminism, a coherent and complete compatibilism that reconciles free will with both determinism and indeterminism.
David Hume reconciled freedom with determinism. We reconcile free will with indeterminism.
Because it makes free will compatible with both a form of determinism (really determination) and with an indeterminism that is limited and controlled by the mind, the leading libertarian philosopher Bob Kane suggested we call this model "Comprehensive Compatibilism."
The problem of free will cannot be solved by logic, language, or even by physics. Man is not a machine and the mind is not a computer. Free will is a biophysical information problem.
All three ideas depend on understanding modern cosmology, physics, biology, and neuroscience, but especially the intimate connection between quantum mechanics and the second law of thermodynamics that allows for the creation of new information structures.
All three are based on the theory of information, which alone can establish the existential status of ideas, not just the ideas of knowledge, value, and freedom, but other-worldly speculations in natural religion like God and immortality.
All three have been anticipated by earlier thinkers, but can now be defended on strong empirical grounds. Our goal is less to innovate than to reach the best possible consensus among philosophers living and dead, an intersubjective agreement between philosophers that is the surest sign of a knowledge advance in natural science.
This Information Philosopher website aims to be an open resource for the best thinking of philosophers and scientists on these three key ideas and a number of lesser ideas that remain challenging problems in philosophy - on which information philosophy can shed some light.
Among these are the mind-body problem (the mind can be seen as the realm of information in its free thoughts, the body an adequately determined biological system creating and maintaining information); the common sense intuition of a cosmic creative process often anthropomorphized as a God or divine Providence; the problem of evil (chaotic entropic forces are the devil incarnate); and the "hard problem" of consciousness (agents responding to their environment, and originating new causal chains, based on information processing).
Philosophy is the love of knowledge or wisdom. Information philosophy (I-Phi or ΙΦ) quantifies knowledge as actionable information.
What is information that merits its use as the foundation of a new method of inquiry?
Abstract information is neither matter nor energy, yet it needs matter for its concrete embodiment and energy for its communication. Information is the modern spirit, the ghost in the machine. It is the stuff of thought, the immaterial substance of philosophy.
Over 100 years ago, Bertrand Russell, with the help of G. E. Moore, Alfred North Whitehead, and Ludwig Wittgenstein, proposed logic and language as the proper foundational basis, not only of philosophy, but also of mathematics and science. Their logical positivism and the variation called logical empiricism developed by Rudolf Carnap and the Vienna Circle have proved to be failures in grounding philosophy, mathematics, or science.
Information is a powerful diagnostic tool. It is a better abstract basis for philosophy, and for science as well, especially physics, biology, and neuroscience. It is capable of answering questions about metaphysics (the ontology of things themselves), epistemology (the existential status of ideas and how we know them), and idealism itself.
Information philosophy is not a solution to specific problems in philosophy. I-Phi is a new philosophical method, capable of solving multiple problems in both philosophy and physics. It needs young practitioners, presently tackling some problem, who might investigate that problem using this new methodology. Note that, just as the philosophy of language is not linguistic philosophy, I-Phi is not the philosophy of information, which is mostly about computers and cognitive science.
The language philosophers of the twentieth century thought that they could solve (or at least dissolve) the classical problems of philosophy. They did not succeed. Information philosophy, by comparison, has now cast a great deal of light on some of those problems. It needs more information philosophers to make more progress.
To recap, when information is stored in any structure, two fundamental physical processes occur. First is a "collapse" of a quantum mechanical wave function, reducing multiple possibilities to a single actuality. Second is a local decrease in the entropy corresponding to the increase in information. Entropy greater than that must be transferred away from the new information structure to satisfy the second law of thermodynamics.
These quantum level processes are susceptible to noise. Information stored may have errors. When information is retrieved, it is again susceptible to noise. This may garble the information content. In information science, noise is generally the enemy of information. But some noise is the friend of freedom, since it is the source of novelty, of creativity and invention, and of variation in the biological gene pool.
Biological systems have maintained and increased their invariant information content over billions of generations, coming as close to immortality as living things can. Philosophers and scientists have increased our knowledge of the external world, despite logical, mathematical, and physical uncertainty. They have created and externalized information (knowledge) that can in principle become immortal. Both life and mind create information in the face of noise. Both do it with sophisticated error detection and correction schemes. The scheme we use to correct human knowledge is science, a two-stage combination of freely invented theories and adequately determined experiments. Information philosophy follows that example.
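Such an error-correction scheme can be sketched with the simplest of codes (a 3× repetition code, chosen purely for illustration, not the text's own mechanism): redundancy plus a majority vote preserves a message against occasional noise.

```python
# Error correction in miniature: a 3x repetition code with majority voting
# recovers a message bit despite a noise flip, the way life and mind
# preserve information against entropy.
def encode(bits):
    """Repeat each message bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(channel_bits):
    """Majority-vote each group of three channel bits back to one message bit."""
    return [int(sum(channel_bits[i:i + 3]) >= 2)
            for i in range(0, len(channel_bits), 3)]

message = [1, 0, 1, 1]
stored = encode(message)           # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
stored[4] ^= 1                     # noise garbles one stored bit...
assert decode(stored) == message   # ...but the majority vote corrects it
```

Real genomes and real communication channels use far more sophisticated codes, but the principle is the same: spend some redundancy to buy persistence.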
If you have read this far, you probably already know that the Information Philosopher website is an exercise in information sharing. It has seven parts, each with multiple chapters. Navigation at the bottom of each page will take you to the next or previous part or chapter.
Teacher and Scholar links display additional material on some pages, and reveal hidden footnotes on some pages. The footnotes themselves are in the Scholar section.
Our goal is for the website to contain all the great philosophical discussions of our three main ideas, plus preliminary solutions for several classic problems in philosophy and physics, with primary source materials (in the original languages) where possible.
Philosophers who would like to develop their expertise in information philosophy should inquire into support possibilities by writing Bob Doyle, the founder of information philosophy.
Support options include online training sessions by Skype or Google Hangouts, perhaps published to YouTube.
Preferences will be given to current graduate students in philosophy or science - physics, biology, psychology, especially - and current post-docs.
All original content on Information Philosopher is available for your use, without requesting permission, under a Creative Commons Attribution License.
Copyrights for all excerpted and quoted works remain with their authors and publishers.
A web page may contain two extra levels of material. The Normal page is material for newcomers and students of information philosophy. Two hidden levels contain material for teachers (e.g., secondary sources) and for scholars (e.g., footnotes and original-language quotations).
Teacher materials on a page will typically include references to secondary sources and more extended explanations of the concepts and arguments. Secondary sources will include books, articles, and online resources. Extended explanations should be more suitable for teaching others about the core philosophical ideas, as seen from an information perspective.
For Scholars
Scholarly materials will generally include more primary sources, more in-depth technical and scientific discussions where appropriate, original language versions of quotations, and references to all sources.
Footnotes for a page appear in the Scholar materials. The footnote indicators themselves are only visible in Scholar mode.