Information is the fundamental metaphysical connection between idealism and materialism. Information philosophy replaces the metaphysical necessity of reductionist naturalism and eliminative materialism with genuine metaphysical possibility.
Information is the form in all concrete objects as well as the content in non-existent, merely possible, abstract entities.
It is the disembodied, de-materialized essence of anything.
Information is neither matter nor energy, although it needs matter to be embodied and energy to be communicated.
Matter and energy are conserved. There is just the same total amount of matter and energy today as there was at the universe's origin.
But information is not conserved. It has been increasing since the beginning of time. Everything emergent is new information. What idealists and holists see is the emergence of immaterial information.
Living things are dynamic and growing information structures, forms through which matter and energy continuously flow. And it is information that controls those flows!
If we don't remember the past, we don't deserve to be remembered by the future. This is especially true for philosophers.
Information philosophy goes beyond a priori logic and its puzzles, beyond analytic language and its paradoxes, beyond philosophical claims of necessary truths, to a contingent physical world that is best represented as made of dynamic, interacting information structures.
Knowledge begins with information in minds that is a partial isomorphism (mapping) of the information structures in the external world. Information philosophy is a correspondence theory.
There is no isomorphism, no information in common, between words and objects. This is the major failing of much philosophy today.
Although language is an excellent tool for human communication, its arbitrary and ambiguous nature makes it ill-suited to represent the world directly. Language does not picture reality.
The extraordinarily sophisticated connection between words and objects is made in human minds, mediated by the brain's experience recorder and reproducer (ERR). Words stimulate neurons to start firing and to play back relevant experiences that include the objects.
By contrast, a dynamic information model of an information structure in the world is presented immediately to the mind as a look-alike and act-alike simulation, which is experienced for itself, not through words.
Without words and related experiences previously recorded in your mental experience recorder, you could not comprehend spoken or written words. They would be mere noise, with no meaning.
By comparison, a photograph or a moving picture with sound can be seen and mostly understood by human beings, independent of their native tongue. The elements of information philosophy are dynamical models of information structures. They go far beyond logic and language as a representation of the fundamental, metaphysical, nature of reality. Visual and interactive models "write" directly into our mental experience recorders.
Computer animated models must incorporate all the laws of nature, from the differential equations of quantum physics to the myriad processes of biology. At their best, simulations are not only our most accurate knowledge of the physical world, they are the best teaching tools ever devised. We can transfer knowledge non-verbally to coming generations and most of the world's population via the Internet and ubiquitous smartphones.
Consider the dense information in Drew Berry's real-time animations of molecular biology. These are the kind of dynamic models of information structures that can best explain the fundamental nature of reality.
If you think about it, everything you know is pure abstract information. Everything you are is an information structure, a combination of matter and energy that embodies, communicates, and most important, processes your information. Everything that you value contains information.
And while the atoms, molecules, and cells of your body are important, many only last a few minutes and most are completely replaced in just a few years. But your immaterial information, from your original DNA to your latest experiences, will be with you for your lifetime.
You are a creator of new information, part of the cosmic creation process. Your free will depends on your unique ability to create freely generated thoughts, multiple ideas in your mind as alternative possibilities for your willed decisions and responsible actions.
Anyone with a serious interest in philosophy should understand how information is created and destroyed, because information is much more fundamental than the logic and language tools philosophers use today. Information philosophy goes "beyond logic and language." The goal of the information philosopher is to add information analysis to every philosopher's toolbox. This I-Phi website aims to contain all that you need to learn about information.
We will show why information should actually be the preferred basis for the critical analysis of current problems in a wide range of disciplines - from information creation in cosmology to information in quantum physics, from information in biology (especially evolution) to psychology, where it offers a solution to the classic mind-body problem and the problem of consciousness. And of course in philosophy, where failed language analysis can be replaced or augmented by immaterial information analysis as a basis for justified knowledge, objective values, human free will, and a surprisingly large number of problems in metaphysics.
Above all, information philosophy hopes to replace beliefs with knowledge. We hope to replace the idea of an other-worldly creator with an explanation of the creation of this world that has evolved into the human creativity that invents such ideas. The "miracle of creation" is happening now, in the universe and in you and by you.
But what is information? How is it created? Why is it a better tool for examining philosophical problems than traditional logic or linguistic analysis? And what are some examples of classic problems in philosophy, in physics, and in metaphysics, for which information philosophy proposes solutions?
What problems has information philosophy solved?
Why has philosophy made so little progress? Is it because philosophers prefer problems, while scientists seek solutions? Must a philosophical problem once solved become science and leave philosophy? Bertrand Russell thought so. The information philosopher thinks not.
Russell said:
"as soon as definite knowledge concerning any subject becomes possible, this subject ceases to be called philosophy, and becomes a separate science...while those only to which, at present, no definite answer can be given, remain to form the residue which is called philosophy."
(The Problems of Philosophy, 1912, pp.89-90)
But in order to remain philosophy, interested philosophers must examine our proposed information-based solutions and evaluate them as part of the philosophical dialogue.
Proposed solutions to a number of classic philosophical problems are presented throughout this site.
It also turns out that the methodology of information philosophy can be productively applied to some outstanding problems in physics. Philosophers of science might take an interest in the proposed information-based solutions to these problems in the "foundations" of physics.
A common definition of information is the act of informing - the communication of knowledge from a sender to a receiver that informs (literally shapes) the receiver. As a synonym for knowledge, information traditionally implies that the sender and receiver are human beings, but many animals clearly communicate. Information theory studies the communication of information.
Information philosophy extends that study to the communication of information content between material objects, including how it is changed by energetic interactions with the rest of the universe.
We call a material object with information content an information structure. While information is communicated between inanimate objects, they do not process information, which we will show is the defining characteristic of living beings and their artifacts.
The sender of information need not be a person, an animal, or even a living thing. It might be a purely material object, a rainbow, for example, sending color information to your eye.
The receiver, too, might be merely physical, a molecule of water in that rainbow that receives too few photons and cools to join the formation of a crystal snowflake, increasing its information content.
Information theory, the mathematical theory of the communication of information, says little about meaning in a message, which is roughly the use to which the information received is put. Information philosophy extends the information flows in human communications systems and digital computers to the natural information carried in the energy and material flows between all the information structures in the observable universe.
A message that is certain to tell you something you already know contains no new information. It does not increase your knowledge, or reduce the uncertainty in what you know, as information theorists put it.
If everything that happens was certain to happen, as determinist philosophers claim, no new information would ever enter the universe. Information would be a universal constant. There would be "nothing new under the sun." Every past and future event could in principle be known by a god-like super-intelligence with access to such a fixed totality of information (Laplace's Demon).
Physics tells us that the total amount of mass and energy in the universe is a constant. The conservation of mass and energy is a fundamental law of nature. Some mathematical physicists erroneously think that information should also be a conserved quantity, that information is a constant of nature.
But information is neither matter nor energy, though it needs matter to be embodied and energy to be communicated. Information can be created and destroyed. The material universe creates it. The biological world creates it and utilizes it. Above all, human minds create, process, and preserve information, the sum of human knowledge that distinguishes humanity from all other biological species and that provides the extraordinary power humans have over our planet.
Information is the modern spirit, the ghost in the machine, the mind in the body. It is the soul, and when we die, it is our information that perishes. The matter remains.
We propose information as an objective value, the ultimate sine qua non.
Information philosophy claims that man is not a machine and the brain is not a computer. Living things process information in ways far more complex, if not faster, than the most powerful information processing machines. What biological systems and computing systems have in common is the processing of information, as we must explain.
Whereas machines are assembled, living things assemble themselves. Both are information structures, patterns, through which matter and energy flow, thanks to flows of negative entropy coming from the Sun and the expanding universe. And both can create new information, build new structures, and maintain their integrity against the destructive influence of the second law of thermodynamics.
Biological evolution began when the first molecule replicated itself, that is, duplicated the information it contained. But duplication is mere copying. Biological reproduction is a much more sophisticated process in which the germ or seed information of a new living thing is encoded in a data or information structure (a genetic code) that can be communicated to processing systems that produce another instance of the given genus and species.
Ontologically random imperfections in the processing systems, along with the deliberate introduction of random noise, produce the variations that are selected by evolution based on their reproductive success. Errors are not restricted to the genetic code. They occur throughout the development of each individual up to the moment of creation of a new individual.
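As a toy illustration in Python (the alphabet, error rate, and population size here are invented for the sketch, not drawn from any real genetic model), a few lines show how random copying errors generate the variations that selection can then filter:

    import random

    random.seed(42)  # reproducible toy run

    ALPHABET = "ACGT"
    ERROR_RATE = 0.01  # hypothetical per-base copying error probability

    def replicate(template):
        """Copy a sequence, occasionally substituting a randomly chosen base."""
        return "".join(
            random.choice(ALPHABET) if random.random() < ERROR_RATE else base
            for base in template
        )

    ancestor = "ACGT" * 10  # a 40-base toy "genome"
    population = [replicate(ancestor) for _ in range(100)]
    variants = sum(1 for seq in population if seq != ancestor)
    print(f"{variants} of 100 copies carry at least one new variation")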
Cultural evolution is the creation and communication of new information that adds to the sum of human knowledge. The creation and evolution of information processing systems in the universe has culminated in minds that can understand and reflect on what we call the cosmic creation process.
How is information created?
Ex nihilo, nihil fit, said the ancients: "Nothing comes from nothing." But information is no (material) thing. Information is physical, but it is not material. Information is a property of matter. It is the form that matter can take. We can thus create something (immaterial) from nothing!
But we shall find that it takes a special kind of energy (free or available energy, with negative entropy) to do so, because it involves the rearrangement of matter.
Energy transfer to or from an object increases or decreases the heat in the object. Entropy transfer does not change the heat content; it represents only a different organization or distribution of the matter in the body. Increasing entropy represents a loss of organization or order, or, more precisely, of information. Maximum entropy is maximum disorder and minimal information.
As you read this sentence, new information is (we hope) being encoded/embodied in your mind/brain. Permanent changes in the synapses between your neurons store the new information. New synapses are made possible by free energy and material flows in your metabolic system, a tiny part of the negative entropy flows that are coursing throughout the universe. Information philosophy will show you how these tiny mental flows allow you to comprehend and some day control at least part of the cosmic information flows in the universe.
Cosmologists know that information is being created because the universe began some thirteen billion years ago in a state of minimal information. The "Big Bang" started with the most elementary particles and radiation. How matter formed into information structures, first atoms, then the galaxies, stars, and planets, is the beginning of a story that will end with understanding how human minds emerged to understand our place in the astrophysical universe.
The relation between matter and information is straightforward. The embodied information is the organization or arrangement of the matter plus the laws of nature that describe the motions of matter in terms of the fundamental forces that act between all material particles.
The relation between information and energy is more complex, and has led to confusion about how to apply mathematical information theory to the physical and biological sciences. Material systems in an equilibrium state are maximally disordered; they have maximum entropy, no negative entropy, and no information other than the bulk parameters of the system. In the case of the universe, the initial parameters were very few: the amount of radiant energy (the temperature), the number of elementary particles (quarks, electrons) per unit volume, and the total volume (possibly infinite). These parameters, and their changes (as a function of time, as the temperature falls), are all the information needed to describe a statistically uniform, isotropic universe and its evolution.
Information philosophy will explain the process of information creation in three fundamental realms - the purely material, the biological, and the mental.
The first information creation was a kind of "order out of chaos," when matter in the early universe opened up spaces allowing gravitational attraction to condense otherwise randomly distributed matter into highly organized galaxies, stars, and planets. It was the expansion - the increasing space between material objects - that drove the universe away from thermodynamic equilibrium (maximum entropy and disorder) and in some places created negative entropy, a quantitative measure of orderly arrangements that is the basis for all information.
Purely material objects react to one another following laws of nature, but they do not in an important sense create or process the information that they contain. It was the expansion, moving faster than the re-equilibration time, and the gravitational forces, that were responsible for the new structures.
A qualitatively different kind of information creation was when the first molecule on earth to replicate itself went on to duplicate its information exponentially. Here the prototype of life was the cause for the creation of the new information structure. Accidental errors in the duplication provided variations in replicative success. Most important, besides creating their information structures, biological systems are also information processors. Living things use information to guide their actions.
With the appearance of life, agency and purpose appeared in the universe, although some philosophers hold that life gives us only the "appearance of purpose."
The third process of information creation, and the most important to philosophy, is human creativity. Almost every philosopher since philosophy began has considered the mind as something distinct from the body. Information philosophy can now explain that distinction. The mind can be considered the immaterial information in the brain. The brain, part of the material body, is a biological information processor. The stuff of mind is the information being processed and the new information being created. As some philosophers have speculated, mind is the software in the brain hardware.
Material things are information structures.
Living things are information structures that actively process information. They communicate it between their parts to build, maintain and repair their (material) information structures. Resisting the second law locally, they actually increase entropy globally much faster than non-living things. But most important, living things increase their information content as they develop. Humans learn from their experiences.
Mental things (ideas) are pure abstractions from the material world, but they have power over the material and biological worlds that enables agent causality. Human minds create information structures, but their unique creation is the collection of abstract ideas that are the sum of human knowledge. It is these ideas that give humanity unparalleled control over the material and biological worlds.
It may come as a surprise for many thinkers to learn that the physics involved in the creation of all three types of information - material, biological, and mental - includes the same two-step sequence of quantum physics and thermodynamics at the core of the cosmic creation process.
The most important information created in a mind is a recording of an individual's experiences (sensations). Recordings are played back (automatically and perhaps mostly unconsciously) as a guide to evaluate future actions (volitions) in similar situations. The particular past experiences reproduced are those stored in the brain located near elements of the current experience (association of ideas). Just as neurons that fire together wire together, neurons that have been wired together will later fire together.
Sensations are recorded as the mental effects of physical causes.
Sensations are stored as retrievable information in the mind of an individual self. Recordings include not only the five afferent senses but also the internal emotions - feelings of pleasure, pain, hopes, and fears - that accompany an experience. They constitute "what it's like" for a particular being to have an experience.
Volitions are the mental causes of physical effects. Volitions begin with 1) the reproduction of past experiences that are similar to the current experience. These become thoughts about possible actions and the (partly random) generation of other alternative possibilities for action. They continue with 2) the evaluation of those freely generated thoughts followed by a willful selection (sometimes habitual) of one of those actions.
Volitions are followed by 3) new sensations coming back to the mind indicating that the self has caused the action to happen (or not). This feedback is recorded as further retrievable information, reinforcing the knowledge stored in the mind that the individual self can cause this kind of action (or sometimes not).
Many philosophers and most scientists have held that all knowledge is based on experience. Experience is ultimately the product of human sensations, and sensations are just electrical and chemical interactions with human skin and sense organs. But what of knowledge that is claimed to be mind-independent and independent of experience?
Why is information better than logic and language for solving philosophical problems?
Broadly speaking, modern philosophy has been a search for truth, for a priori, analytic, certain, necessary, and provable truth.
But all these concepts are mere ideas, invented by humans, some aspects of which have been discovered to be independent of the minds that invented them, notably formal logic and mathematics. Logic and mathematics are systems of thought, inside which the concept of demonstrable (apodeictic) truth is useful, but with limits set by Kurt Gödel's incompleteness theorem. The truths of logic and mathematics appear to exist "outside of space and time." Gottfried Leibniz called them "true in all possible worlds," meaning their truth is independent of the physical world. We call them a priori because their proofs are independent of experience, although they were initially abstracted from concrete human experiences.
Analyticity is the idea that some statements, propositions in the form of sentences, can be true by the definitions or meanings of the words in the sentences. This is correct, though limited by verbal difficulties such as Russell's paradox and numerous other puzzles and paradoxes. Analytic language philosophers claim to connect the words with objects, material things, and thereby tell us something about the world. Some modal logicians (cf. Saul Kripke) claim that words that are names of things are necessary a posteriori, "true in all possible worlds." But this is nonsense, because we invented all those words and worlds. They are mere ideas.
Perhaps the deepest of all these philosophical ideas is necessity. Information philosophy can now tell us that there is no such thing as absolute necessity. There is of course an adequate determinism in the macroscopic world that explains the appearance of deterministic laws of nature, of cause and effect, for example. This is because macroscopic objects consist of vast numbers of atoms and their individual random quantum events average out. But there is no metaphysical necessity. At the fundamental microscopic level of material reality, there is an irreducible contingency and indeterminacy. Everything that we know, everything we can say, is fundamentally empirical, based on factual evidence, the analysis of experiences that have been recorded in human minds.
So information philosophy is not what we can logically know about the world, nor what we can analytically say about the world, nor what is necessarily the case in the world. There is nothing that is the case that is necessary and perfectly determined by logic, by language, or by the physical laws of nature. Our world and its future are open and contingent, with possibilities that are the source of new information creation in the universe and source of human freedom.
For the most part, philosophers and scientists do not believe in ontological possibilities, despite their invented "possible worlds," which are on inspection merely multiple "actual worlds." They are "actualists." This is because they cannot accept the idea of ontological chance. They hope to show that the appearance of chance is the result of human ignorance, that chance is merely an epistemic phenomenon.
Now chance, like truth, is just another idea, just some more information. But what an idea! In a self-referential virtuous circle, it turns out that without the real possibilities that result from ontological chance, there can be no new information. Information philosophy offers cosmological and biological evidence for the creation of new information in the universe. So it follows that chance is real, fortunately something that we can keep under control. We are biological beings that have evolved, thanks to chance, from primitive single-cell communicating information structures to multi-cellular organisms whose defining aspect is the creation and communication of information.
The theory of communication of information is the foundation of our "information age." To understand how we know things is to understand how knowledge represents the material world of embodied "information structures" in the mental world of immaterial ideas.
All knowledge starts with the recording of experiences. The experiences of thinking, perceiving, knowing, believing, feeling, desiring, deciding, and acting may be bracketed by philosophers as "mental" phenomena, but they are no less real than other "physical" phenomena. They are themselves physical phenomena. They are just not material things.
Information philosophy defines human knowledge as immaterial information in a mind, or embodied in an external artifact that is an information structure (e.g., a book), part of the sum of all human knowledge. Information in the mind about something in the external world is a proper subset of the information in the external object. It is isomorphic to a small part of the total information in or about the object. The information in living things, artifacts, and especially machines, consists of much more than the material components and their arrangement (positions over time). It also consists of all the information processing (e.g., messaging) that goes on inside the thing as it realizes its entelechy or telos, its internal or external purpose.
All science begins with information gathered from experimental observations, which are themselves mental phenomena. Observations are experiences recorded in minds. So all knowledge of the physical world rests on the mental. All scientific knowledge is information shared among the minds of a community of inquirers. As such, science is a collection of thoughts by thinkers, immaterial and mental, some might say fundamental. Recall Descartes' argument that the experience of thinking is that which for him is the most certain.
Information philosophy is not the philosophy of information (the intersection of computer science, information science, information technology, and philosophy), just as linguistic philosophy - the idea that linguistic analysis can solve (or dis-solve) philosophical problems - is not the philosophy of language. Compare the philosophy of mathematics, philosophy of biology, etc.
The analysis of language, particularly the analysis of philosophical concepts, which dominated philosophy in the twentieth century, has failed to solve the most ancient philosophical problems. At best, it claims to "dis-solve" some of them as conceptual puzzles. The "problem of knowledge" itself, traditionally framed as "justifying true belief," is recast by information philosophy as the degree of isomorphism between the information in the physical world and the information in our minds. Information psychology can be defined as the study of this isomorphism.
We shall see how information processes in the natural world use arbitrary symbols (e.g., nucleotide sequences) to refer to something, to communicate messages about it, and to give the symbol meaning in the form of instructions for another process to do something (e.g., create a protein). These examples provide support for both theories of meaning as reference and meaning as use.
Information philosophy, rather, is the use of information as a tool to study philosophical problems, some of which are today yielding tentative solutions. It is time for philosophy to move beyond logical puzzles and language games.
Our fundamental philosophical question is cosmological and ultimately metaphysical.
What are the processes that create emergent information structures in the universe?
Given the second law of thermodynamics, which says that any system will over time approach a thermodynamic equilibrium of maximum disorder or entropy, in which all information is lost, and given the best current model for the origin of the universe, which says everything began in a state of thermodynamic equilibrium some 13.75 billion years ago, how can it be that living beings are creating and communicating vast amounts of new information every day?
Why are we not still in that original state of equilibrium?
Broadly speaking, there are four major phenomena or processes that can reduce the entropy locally, while of course increasing it globally to satisfy the second law of thermodynamics. Three of these do it "blindly"; the fourth does it with a built-in "purpose," or telos.
Universal Gravitation
Quantum Cooperative Phenomena (e.g., crystallization, the formation of atoms and molecules)
"Dissipative" Chaos (Non-linear Thermodynamics)
Life
None of these processes can work unless they have a way to get rid of the positive entropy (disorder) and leave behind a pocket of negative entropy (order or information). The positive entropy is either conducted, convected, or radiated away as waste matter and energy, as heat, or as pure radiation. At the quantum level, it is always the result of interactions between matter and radiation (photons). Whenever photons interact with material particles, the outcomes are inherently unpredictable. As Albert Einstein discovered ten years before the founding of quantum mechanics, these interactions involve irreducible ontological chance.
Negative entropy is an abstract thermodynamic concept that describes energy with the ability to do work, to make something happen. This kind of energy is often called free energy or available energy.
In a maximally disordered state (called thermodynamic equilibrium) there can be matter in motion, the motion we call heat. But the average properties - density, pressure, temperature - are the same everywhere. Equilibrium is formless. Departures from equilibrium are when the physical situation shows differences from place to place. These differences are information.
The second law of thermodynamics then simply means that isolated systems will eliminate differences from place to place until all properties are uniformly distributed. Natural processes spontaneously destroy information. Consider the classic case of what happens when we open a perfume bottle.
The perfume molecules dissipate until they are uniformly distributed. Statistical physics mistakenly claims that if the velocities of all the particles were reversed at an instant, the molecules would return to the bottle. It assumes that the complete path information needed to return to the bottle is preserved. But information is not conserved. It can be created and it can be destroyed. We shall show why such microscopic reversibility is extremely unlikely.
In the late nineteenth century, Ludwig Boltzmann revolutionized thermodynamics with his kinetic theory of gases, based on the ancient assumption that matter is made up of collections of atoms. He derived a mathematical formula for entropy S as a function of the probabilities of finding a system in each of its possible microstates. When the actual macrostate is the one with the largest number W of microstates, entropy is at a maximum, and no differences (information) are visible.
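A minimal numerical sketch of Boltzmann's idea, using the Gibbs form S = -k Σ p ln p with k set to 1 for simplicity, shows that entropy is greatest when the probabilities of the microstates are uniform, i.e., when no differences are visible:

    import math

    def entropy(probs):
        """Gibbs/Boltzmann entropy S = -sum(p * ln p), with k set to 1."""
        return -sum(p * math.log(p) for p in probs if p > 0)

    uniform = [0.25, 0.25, 0.25, 0.25]  # maximum disorder over 4 microstates
    peaked = [0.97, 0.01, 0.01, 0.01]   # highly ordered: one microstate dominates

    print(entropy(uniform))  # ln 4 ~ 1.386, the maximum for four microstates
    print(entropy(peaked))   # ~ 0.168, far below the maximum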
Boltzmann could not prove his "H-Theorem" about entropy increase. His contemporaries challenged a "statistical" entropy increase on grounds of microscopic reversibility and macroscopic recurrence (both problems solved by information philosophy). He could not prove the existence of atoms.
In the early twentieth century, just before Boltzmann died, Albert Einstein formulated a statistical mechanics that put Boltzmann's law of increasing entropy on a firmer mathematical basis. Einstein's work predicted the size of the minuscule fluctuations around equilibrium that Boltzmann had expected. Einstein showed that entropy does not, in fact, continually increase; it can decrease randomly, in short bursts of locally higher densities or organized motions. Though such fluctuations are quickly extinguished, Einstein showed that the occasionally correlated motions of invisible atoms explain the visible "Brownian motion" of tiny particles like pollen grains.
Einstein's calculations led to predictions that were quickly confirmed, proving the existence of the discrete atoms that had been hypothesized for centuries. Sadly, Boltzmann may never have known of Einstein's vindication of his work. Einstein later saw the same fluctuations in radiation, supporting his revolutionary hypothesis of light quanta, now called photons. Although this is rarely appreciated, it was Einstein who showed that both matter and energy come in discrete, discontinuous particles. His most famous equation, E = mc², shows they are convertible into one another. He also showed that the interaction of matter and radiation, of atoms and photons, always involves ontological chance. This bothered Einstein greatly, because he thought his God should not "play dice."
Late in life, Einstein said that if matter and energy could not be described with the local, continuous analytical functions in space and time needed for his field theories, then all his work would be "castles in the air." But the loss of classical deterministic ideas - which have ossified much of philosophy, crippling philosophical progress - is more than offset by the indeterminism of an open future and Einstein's belief in the "free creation of new ideas."
In the middle twentieth century, Claude Shannon derived the mathematical formula for the communication of information. John von Neumann found it to be identical to Boltzmann's formula for entropy, though with a minus sign (negative entropy). Where Boltzmann entropy measures the number of possible microstates, Shannon entropy measures the number of possible messages that can be communicated.
Shannon found that new information cannot be created unless there are multiple possible messages. This in turn depends on the ontological chance discovered by Einstein. In a deterministic universe, the total information at all times would be a constant. Information would be a conserved quantity, like matter and energy. "Nothing new under the Sun." But it is not constant, though many philosophers, mathematical physicists, and theologians (God's foreknowledge) still think so. Information is being created constantly in our universe. And we are co-creators of the information, including Einstein's "new ideas."
Because "negative" entropy (order or information) is such a positive quantity, we chose in the 1970's to give it a new name - "Ergo," and to call the four phenomena or processes that create negative entropy "ergodic," for reasons that will become clear. But today, the positive name "information" is all that we need to do information philosophy.
How exactly has the universe escaped from the total disorder of thermodynamic equilibrium and produced a world full of information?
It begins with the expansion of the universe. If the universe had not expanded, it would have remained in the original state of thermodynamic equilibrium. We would not be here.
To visualize the departure from equilibrium that made us possible, remember that equilibrium is when particles are distributed evenly in all possible locations in space, and with their velocities distributed by a normal law - the Maxwell-Boltzmann velocity distribution. (The combination of position space and velocity or momentum space is called phase space). When we open the perfume bottle, the molecules now have a much larger phase space to distribute into. There are a much larger number of phase space "cells" in which molecules could be located. It of course takes them time to spread out and come to a new equilibrium state (the Boltzmann "relaxation time.")
When the universe expands, say grows to ten times its volume, it is just like the perfume bottle opening. The matter particles must redistribute themselves to get back to equilibrium. But suppose the universe expansion rate is much faster than the equilibration or relaxation time. The universe is out of equilibrium, and in a flat, ever-expanding, universe it will never get back!
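A back-of-the-envelope sketch in Python, assuming an ideal gas so that the standard free-expansion result ΔS = N k ln(V2/V1) applies, gives a feel for the numbers when the available volume grows tenfold:

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, J/K
    N = 6.022e23        # one mole of gas particles, for illustration

    # Ideal-gas entropy increase when the accessible volume grows tenfold:
    # delta_S = N * k_B * ln(V2 / V1)
    delta_S = N * k_B * math.log(10)
    print(f"Maximum-entropy increase: {delta_S:.2f} J/K")  # ~19.1 J/K per mole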
In the earliest moments of the universe, material particles were in equilibrium with radiation at extraordinarily high temperatures. When quarks first formed neutrons and protons, they were short-lived, blasted back into quarks by photon collisions. As the universe expanded, the temperature cooled, the space per photon increased, and the mean free time between photon collisions increased, giving larger particles a better chance to survive. The expansion also red-shifted the photons, decreasing the average energy per photon and eventually reducing the number of high-energy photons that can dissociate matter. The mean free path of photons nevertheless remained very short; they were constantly scattered by collisions with free electrons.
When the temperature declined further, to about 5000 degrees, roughly 400,000 years after the "Big Bang," the electrons and protons combined to make hydrogen and (with neutrons) helium atoms.
At this time, a major event occurred that we can still see today, the farthest and earliest event visible. When the electrons combined into atoms, the electrons could no longer scatter the photons so easily. The universe became transparent to the photons. Some of those photons are still arriving at the earth today; they are now the red-shifted and cooled-down cosmic microwave background radiation. While this radiation is almost perfectly uniform, it shows very small fluctuations that may be caused by random differences in the local density of the original radiation, or even by random quantum fluctuations.
These fluctuations mean that there were slight differences in the density of the newly formed hydrogen gas clouds. The force of universal gravitation then worked to pull relatively formless matter into spherically symmetric stars and planets. This is the original "order out of chaos," although that phrase is now most associated with work on deterministic chaos theory and complexity theory, as we shall see.
How information creation and negative entropy flows appear to violate the second law of thermodynamics
In our open and rapidly expanding universe, the maximum possible entropy (if the particles were "relaxed" into a uniform distribution among the new phase-space cells) is increasing faster than the actual entropy. The difference between maximum possible entropy and the current entropy is called negative entropy. There is an intimate connection between the physical quantity negative entropy and abstract immaterial information, first established by Leo Szilard in 1929.
As pointed out by Harvard cosmologist David Layzer, the Arrow of Time points not only to increasing disorder but also to increasing information.
Two of our "ergodic" phenomena - gravity and quantum cooperative phenomena - pull matter together that was previously separated. Galaxies, stars, and planets form out of inchoate clouds of dust and gas. Gravity binds the matter together. Subatomic particles combine to form atoms. Atoms combine to form molecules. They are held together by quantum mechanics. In all these cases, a new visible information structure appears.
In order for these structures to stay together, the motion (kinetic) energy of their parts must be radiated away. This is why the stars shine. When atoms join to become molecules, they give off photons. The new structure is now in a (negative) bound energy state. It is the radiation that carries away the positive entropy (disorder) needed to balance the new order (information) in the visible structure.
In the cases of chaotic dissipative structures and life, the ergodic phenomena are more complex, but the result is similar, the emergence of visible information. (More commonly it is simply the maintenance of high-information, low-entropy structures.) These cases appear in far-from-equilibrium situations where there is a flow of matter and energy with negative entropy through the information structure. The flow comes in with low entropy but leaves with high entropy. Matter and energy are conserved in the flow, but information in the structure can increase (information is not a conserved quantity).
Information is neither matter nor energy, though it uses matter when it is embodied and energy when it is communicated. Information is immaterial.
This vision of life as a visible form through which matter and free energy flow was first seen by Ludwig von Bertalanffy in 1939, though it was made more famous by Erwin Schrödinger's landmark essay What Is Life? (1944), where he claimed that life "feeds on negative entropy."
Both Bertalanffy and Schrödinger knew that the source of negative entropy was our Sun. Neither knew that the ultimate cosmological source of negative entropy is the expansion of the universe, which allowed ergodic gravitational forces to form the Sun. Note that the positive entropy radiation leaving the Sun becomes diluted as it expands, creating a difference between its color temperature and its energy-content temperature. This difference is information (negative entropy) that planet Earth uses to generate and maintain biological life.
Note that the 300K (the average earth temperature) photons are dissipated into the dark night sky, on their way to the cosmic microwave background. The Sun-Earth-night sky is a heat engine, with a hot energy source and cold energy sink, that converts the temperature difference not into mechanical energy (work) but into biological energy (life).
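A rough Python sketch of this heat-engine arithmetic, using the round numbers on this page (sunlight effectively at 5000K, waste heat at 300K) and ignoring the small radiation correction factors: every joule flowing through the Earth exports far more entropy than it imports, leaving negative entropy available to life.

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, J/K
    T_sun, T_earth = 5000.0, 300.0  # round figures used in the text, kelvins

    # Entropy carried per joule of heat at temperature T is roughly 1/T
    # (the 4/3 factor for radiation entropy is ignored in this sketch).
    entropy_in = 1.0 / T_sun     # entropy arriving with each joule of sunlight
    entropy_out = 1.0 / T_earth  # entropy leaving with each joule of 300 K heat

    negentropy = entropy_out - entropy_in    # J/K exported per joule of throughput
    bits = negentropy / (k_B * math.log(2))  # converted to bits (k ln 2 per bit)
    print(f"{negentropy:.5f} J/K per joule ~ {bits:.3e} bits per joule")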
When information is embodied in a physical structure, two physical processes must occur.
The first process is the collapse of a quantum-mechanical wave function into one of the possible states in a superposition of states, which happens in any measurement process. A measurement produces one or more bits of information. Such quantum events involve irreducible indeterminacy and chance, but less often noted is the fact that quantum physics is directly responsible for the extraordinary temporal stability and adequate determinism of most information structures.
We can call the transfer of positive entropy, which stabilizes the new information from Process 1, Process 1b.
The second process is a local decrease in the entropy (which appears to violate the second law of thermodynamics) corresponding to the increase in information. Entropy greater than the information increase must be transferred away from the new information, ultimately to the night sky and the cosmic background, to satisfy the second law.
Given this new stable information, to the extent that the resulting quantum system can be approximately isolated, the system will deterministically evolve according to von Neumann's Process 2, the unitary time evolution described by the Schrödinger equation.
The first two physical processes (1 and 1b) are parts of the information solution to the "problem of measurement," to which must be added the role of the "observer." We shall see that the observer involves a mental Process 3.
The discovery and elucidation of the first two as steps in the cosmic creation process casts light on some classical problems in philosophy and physics, since it is the same two-step process that creates new biological species and explains the freedom and creativity of the human mind.
The cosmic creation process generates the conditions without which there could be nothing of value in the universe, nothing to be known, and no one to do the knowing. Information itself is the ultimate sine qua non.
the order out of chaos when the randomly distributed matter in the early universe first gets organized into information structures.
This was not possible before the first atoms formed, about 400,000 years after the Big Bang. Information structures like the stars and galaxies did not exist until about 400 million years after the Big Bang. As we saw, gravitation was the principal driver creating information structures.
Nobel prize winner Ilya Prigogine discovered another ergodic process that he described as the "self-organization" of "dissipative structures." He popularized the slogan "order out of chaos" in an important book. Unfortunately, the "self" in self-organization led to some unrealizable hopes in cognitive psychology. There is no self, in the sense of a person or agent, in these physical phenomena.
Both gravitation and Prigogine's dissipative systems produce a purely physical/material kind of order. The resulting structures contain information. There is a "steady state" flow of information-rich matter and energy through them. But they do not process information. They have no purpose, no "telos."
Order out of chaos can explain the emergence of structures that exert downward causation on their atomic and molecular components. But this is a gross kind of downward causal control. Explaining life and mind as "complex adaptive systems" has not been successful. We need to go beyond "chaos and complexity" theories to teleonomic theories.
the order out of order when the material information structures form self-replicating biological information structures. Some become information processing systems.
In his famous essay, "What Is Life?," Erwin Schrödinger noted that life "feeds on negative entropy" (or information). He called this "order out of order."
This kind of biological processing of information first emerged about 3.5 billion years ago on the earth. It continues today on multiple emergent biological levels, e.g., single-cells, multi-cellular systems, organs, etc., each level creating new information structures and information processing systems not reducible to (caused by) lower levels and exerting downward causation on the lower levels.
And this downward causal control is extremely fine, managing the motions and arrangements of individual atoms and molecules.
Biological systems are cognitive systems, using internal "subjective" knowledge to recognize and interact with their "objective" external environment, communicating meaningful messages to their internal components and to other individuals of their species with a language of arbitrary symbols, and taking actions to maintain themselves and to expand their populations by learning from experience.
With the emergence of life, "purpose" also entered the universe. It is not the pre-existent "teleology" of many idealistic philosophies (the idea of "essence" before "existence"), but it is the "entelechy" of Aristotle, who saw that living things have within them a purpose, an end, a "telos." To distinguish this evolved telos in living systems from teleology, modern biologists use the term "teleonomy."
the pure information out of order when organisms with minds generate, store (in the brain), replicate, utilize, and then externalize some non-biological information, communicating it to other minds and storing it in the environment. Communication can be by hereditary genetic transmission or by an advanced organism capable of learning and then teaching its contemporaries directly by signaling, by speaking, or indirectly by writing and publishing the knowledge for future generations.
This kind of information can be highly abstract mind-stuff, pure Platonic ideas, the stock in trade of philosophers. It is neither matter nor energy (though embodied in the material brain), a kind of pure spirit or ghost in the machine. It is a candidate for the immaterial dualist "substance" of René Descartes, though it is probably better thought of as a "property dualism," since information is an immaterial property of all matter.
The information stored in the mind is not only abstract ideas. It contains a recording of the experiences of the individual. In principle every experience may be recorded, though not all may be reproducible/recallable.
The negative entropy (order, or potential information) generated by the universe expansion is a tiny amount compared to the increase in positive entropy (disorder). Sadly, this is always the case when we try to get "order out of order," as can be seen by studying entropy flows at different levels of emergent phenomena.
In any process, the positive entropy increase is always at least equal to, and generally orders of magnitude larger than, the negative entropy in any created information structures, in order to satisfy the second law of thermodynamics. The positive entropy is named for Boltzmann, since it was his "H-Theorem" that proved entropy can only increase overall - the second law of thermodynamics. And the negative entropy is called Shannon, since his theory of information communication has exactly the same mathematical formula as Boltzmann's famous principle:
S = k log W,
where S is the entropy, k is Boltzmann's constant, and W is the number of microstates corresponding to the given macrostate (what Boltzmann called the "thermodynamic probability" of the state).
Information flows into Boltzmann and Shannon Entropy
Material particles are the first information structures to form in the universe. They are quarks, baryons, and atomic nuclei, which eventually combine with electrons to form atoms, and eventually molecules, when the falling temperature becomes low enough. These material particles are attracted by the force of universal gravitation to form the gigantic information structures of the galaxies, stars, and planets.
Cosmological information flows
Microscopic quantum mechanical particles and huge self-gravitating systems are stable and have extremely long lifetimes, thanks in large part to quantum stability.
Stars are another source of radiation, after the original Big Bang cosmic source, which has cooled down to 3 degrees Kelvin (3°K) and now shines as the cosmic microwave background radiation.
Sun to Earth Entropy Flows
Our solar radiation has a high color temperature (5000K) and a low energy-content temperature (273K). It is out of equilibrium, and it is the source of all the information-generating negative entropy that drives biological evolution on the Earth. Note that the fraction of the light falling on Earth is less than a billionth of that which passes by and is lost in space.
A tiny fraction of the solar energy falling on the earth gets converted into the information structures of plants and animals. Most of it gets converted to heat and is radiated away as waste energy to the night sky.
Entropy Flows into Life
Every biological structure is a quantum mechanical structure. Quantum cooperative phenomena allow DNA to maintain its stable information structure over billions of years in the constant presence of chaos and noise. And biological structures contain astronomical numbers of particles, allowing them to average over the random noise of individual quantum events, becoming "adequately determined."
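A quick Python simulation, with coin-flip "events" standing in for random quantum events, illustrates the 1/√N averaging behind this adequate determinism: relative fluctuations shrink as the number of events grows.

    import random

    random.seed(0)  # reproducible toy run

    # Average N random +/-1 "events"; the typical size of the mean shrinks as 1/sqrt(N).
    for N in (100, 10_000, 1_000_000):
        trials = [abs(sum(random.choice((-1, 1)) for _ in range(N))) / N
                  for _ in range(10)]
        print(f"N = {N:>9}: typical relative fluctuation ~ {sum(trials) / len(trials):.5f}")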
Entropy Flows in a Human Being
The stable information content of a human being survives many changes in the material content of the body during a person's lifetime. Only with death does the mental information (spirit, soul) dissipate - unless it is saved somewhere.
The total mental information in a living human is orders of magnitude less than the information content and information processing rate of the body. But the cultural information structures created by humans outside the body, in the form of external knowledge like this book, and the enormous collection of human artifacts, now rival the total biological information content.
The Shannon Principle - No Information Without Possibilities
In his development of the mathematical theory of the communication of information, Claude Shannon showed that there can be no new information in a message unless there are multiple possible messages. If only one message is possible, there is no information in that message.
We can simplify this to define the Shannon Principle. No new information can be created in the universe unless there are multiple possibilities, only one of which can become actual.
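A minimal sketch in Python: the Shannon entropy H = -Σ p log2 p of a message source is zero when only one message is possible and grows with the number of equally likely alternatives.

    import math

    def shannon_entropy(probs):
        """H = -sum(p * log2 p), in bits."""
        return -sum(p * math.log2(p) for p in probs if 0 < p < 1)

    print(shannon_entropy([1.0]))       # one possible message: 0 bits, no information
    print(shannon_entropy([0.5, 0.5]))  # two equally likely messages: 1.0 bit
    print(shannon_entropy([0.25] * 4))  # four equally likely messages: 2.0 bits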
An alternative statement of the Shannon principle is that in a deterministic system, information is conserved, unchanging with time. Classical mechanics is a conservative system that conserves not only energy and momentum but also conserves the total information. Information is a "constant of the motion" in a determinist world.
Quantum mechanics, by contrast, is indeterministic. It involves irreducible ontological chance.
An isolated quantum system is described by a wave function ψ which evolves - deterministically - according to the unitary time evolution of the linear Schrödinger equation.
(ih/2π) ∂ψ/∂t = Hψ
The possibilities of many different outcomes evolve deterministically, but the individual actual outcomes are indeterministic.
This sounds a bit contradictory, but it is not. It is the essence of the highly non-intuitive quantum theory, which combines a deterministic "wave" aspect with an indeterministic "particle" aspect.
In his 1932 Mathematical Foundations of Quantum Mechanics, John von Neumann explained that two fundamentally different processes are going on in quantum mechanics (in a temporal sequence for a given particle - not at the same time).
Process 1. A non-causal process, in which the measured electron winds up randomly in one of the possible physical states (eigenstates) of the measuring apparatus plus electron.
The probability for each eigenstate is given by the absolute square |cn|2 of the corresponding coefficient cn in the expansion of the original system state (wave function ψ) in an infinite set of wave functions φ that represent the eigenfunctions of the measuring apparatus plus electron.
cn = < φn | ψ >
This is as close as we get to a description of the motion of the "particle" aspect of a quantum system. According to von Neumann, the particle simply shows up somewhere as a result of a measurement.
Information physics says that the particle shows up whenever a new stable information structure is created, information that can be observed.
Process 1b. The information created in Von Neumann's Process 1 will only be stable if an amount of positive entropy greater than the negative entropy in the new information structure is transported away, in order to satisfy the second law of thermodynamics.
Process 2. A causal process, in which the electron wave function ψ evolves deterministically according to Schrödinger's equation of motion for the "wave" aspect. This evolution describes the motion of the probability amplitude wave ψ between measurements. The wave function exhibits interference effects. But interference is destroyed if the particle has a definite position or momentum. The particle path itself can never be observed.
Von Neumann claimed there is another major difference between these two processes. Process 1 is thermodynamically irreversible. Process 2 is in principle reversible. This confirms the fundamental connection between quantum mechanics and thermodynamics that is explainable by information physics.
Information physics establishes that process 1 may create information. It is always involved when information is created.
Process 2 is deterministic and information preserving.
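A tiny numerical illustration, using a generic 2x2 rotation as a stand-in for unitary Schrödinger evolution (not any particular physical Hamiltonian): Process 2 preserves total probability and can be run backwards.

    import numpy as np

    # A generic 2x2 rotation stands in for unitary Schrodinger time evolution.
    theta = 0.7
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    psi = np.array([0.6, 0.8])  # normalized state: 0.36 + 0.64 = 1
    psi_later = U @ psi         # Process 2: deterministic, unitary evolution

    print(np.linalg.norm(psi_later))          # still 1.0: probability is preserved
    print(np.allclose(U.T @ psi_later, psi))  # True: the evolution is reversible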
The first of these processes has come to be called the collapse of the wave function.
It gave rise to the so-called problem of measurement, because its randomness prevents it from being a part of the deterministic mathematics of process 2.
But isolation is an ideal that can only be approximately realized. Because the Schrödinger equation is linear, a wave function | ψ > can be a linear combination (a superposition) of another set of wave functions | φn >,
| ψ > = ∑ cn | φn >,
where the absolute squares of the coefficients, |cn|2, are the probabilities of finding the system in the possible state | φn > as the result of an interaction with another quantum system.
|cn|2 = |< φn | ψ >|2.
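A sketch of this recipe in Python, with arbitrary toy amplitudes: square the magnitudes of the expansion coefficients to get probabilities, then let a random draw stand in for the indeterministic Process 1 outcome.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy expansion coefficients c_n of psi in the eigenfunctions phi_n
    c = np.array([0.6, 0.8j, 0.0])  # arbitrary illustrative amplitudes
    c = c / np.linalg.norm(c)       # normalize so the probabilities sum to 1

    probs = np.abs(c) ** 2                 # |c_n|^2: probability of each eigenstate
    outcome = rng.choice(len(c), p=probs)  # Process 1: one possibility actualized
    print(probs, "-> measured eigenstate", outcome)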
Quantum mechanics introduces real possibilities, each with a calculable probability of becoming an actuality, as a consequence of one quantum system interacting (for example colliding) with another quantum system.
It is quantum interactions that lead to new information in the universe - both new information structures and information processing systems. But that new information cannot subsist unless a compensating amount of entropy is transferred away from the new information.
Even more important, it is only in cases where information persists long enough for a human being to observe it that we can properly describe the observation as a "measurement" and the human being as an "observer." So, following von Neumann's "process" terminology, we can complete his admittedly unsuccessful attempt at a theory of the measuring process by adding an anthropomorphic
Process 3 - a conscious observer recording new information in a mind. This is only possible if the local reductions in the entropy (the first in the measurement apparatus, the second in the mind) are both balanced by even greater increases in positive entropy that must be transported away from the apparatus and the mind, so the overall change in entropy can satisfy the second law of thermodynamics.
An Information Interpretation of Quantum Mechanics
Our emphasis on the importance of information suggests an "information interpretation" of quantum mechanics that eliminates the need for a conscious observer as in the "standard orthodox" Copenhagen Interpretation. An information interpretation dispenses also with the need for a separate "classical" measuring apparatus.
There is only one world, the quantum world. We can say it is ontologically indeterministic, but epistemically deterministic, because of human ignorance of the microscopic details.
Information physics claims there is only one world, the quantum world, and the "quantum to classical transition" occurs for any large macroscopic object containing a large number of atoms. For an object of large mass m, independent quantum events are "averaged over," and the minimum uncertainty in position and velocity, Δv Δx ≥ ℏ / 2m, falls far below the observational accuracy, since ℏ / m goes to zero as m grows large.
The classical laws of motion, with their implicit determinism and strict causality, emerge when microscopic events can be ignored.
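A back-of-envelope calculation shows how quickly the quantum uncertainty vanishes with mass. A sketch using the standard constants and the ℏ/2m bound from the Heisenberg relation:

```python
# Back-of-envelope check of the quantum-to-classical transition:
# minimum uncertainty product dx * dv = hbar / (2 m) from Heisenberg's relation.
hbar = 1.054e-34   # Planck's constant / 2 pi, in J*s

for name, mass_kg in [("electron", 9.11e-31),
                      ("microgram dust grain", 1e-9)]:
    dx_dv = hbar / (2 * mass_kg)   # m^2 / s
    print(f"{name:22s}  dx * dv >= {dx_dv:.2e} m^2/s")

# electron:   ~5.8e-05 m^2/s -- enormous on atomic scales
# dust grain: ~5.3e-26 m^2/s -- utterly negligible at macroscopic accuracy
```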
Information philosophy interprets the wave function ψ as a "possibilities" function. With this simple change in terminology, the mysterious process of a wave function "collapsing" becomes a much more intuitive discussion of possibilities, with mathematically calculable probabilities, turning into a single actuality, faster than the speed of light.
Information physics is standard quantum physics. It accepts the Schrödinger equation of motion, the principle of superposition, the axiom of measurement (now including the actual information "bits" measured), and - most important - the projection postulate of standard quantum mechanics (the "collapse" so many interpretations deny).
But a conscious observer is not required for a projection, for the wave-function "collapse", for one of the possibilities to become an actuality. What it does require is an interaction between (quantum) systems that creates irreversible information.
In less than two decades of the mid-twentieth century, the word information was transformed from a synonym for knowledge into a mathematical, physical, and biological quantity that can be measured and studied scientifically.
In 1929, Leo Szilard connected an increase in thermodynamic (Boltzmann) entropy with any increase in information that results from a measurement, solving the problem of "Maxwell's Demon," a thought experiment suggested by James Clerk Maxwell, in which a local reduction in entropy is possible when an intelligent being interacts with a thermodynamic system.
In the early 1940s, digital computers were invented by von Neumann, Shannon, Alan Turing, and others. Their machines could run a stored program to manipulate stored data, processing information, as biological organisms had been doing for billions of years.
Then in the late 1940s, the problem of communicating digital data signals in the presence of noise was first explored by Shannon, who developed the modern mathematical theory of the communication of information. Norbert Wiener wrote in his 1948 book Cybernetics that "information is the negative of the quantity usually defined as entropy," and in 1949 Leon Brillouin coined the term "negentropy."
Finally, in the early 1950s, inheritable characteristics were shown by Francis Crick, James Watson, and George Gamow to be transmitted from generation to generation in a digital code.
Information is Immaterial
Information is neither matter nor energy, but it needs matter for its embodiment and energy for its communication.
A living being is a form through which passes a flow of matter and energy (with low entropy). Genetic information is used to build the information-rich matter into an information-processing structure that contains a very large number of hierarchically organized information structures.
All biological systems are cognitive, using their internal information structure to guide their actions. Even some of the simplest organisms can learn from experience. The most primitive minds are experience recorders and reproducers.
In humans, the information-processing structures create new actionable information (knowledge) by consciously and unconsciously reworking the experiences stored in the mind.
Emergent higher levels exert downward causation on the contents of the lower levels, ultimately supporting mental causation and free will.
When a ribosome assembles several hundred amino acids into the four polypeptide chains (globins) of the hemoglobin protein, each globin trapping an iron atom in a heme group at its center, this is downward causal control of the amino acids, the heme groups, and the iron atoms by the ribosome. The ribosome is an example of Erwin Schrödinger's emergent "order out of order," life "feeding on the negative entropy" of digested food.
Notice the absurdity of the idea that the random motions of the transfer RNA molecules, each carrying a single amino acid, follow predetermined information about where they belong in the protein being built.
Determinism is an emergent property and an ideal philosophical concept, unrealizable except approximately in the kind of adequate determinism that we experience in the macroscopic world, where the determining information is part of the higher-level control system.
The total information in multi-cellular living beings can develop to be many orders of magnitude more than the information present in the original cell. The creation of this new information would be impossible for a deterministic universe, in which information is constant.
Immaterial information is perhaps as close as a physical or biological scientist can get to the idea of a soul or spirit that departs the body at death. When a living being dies, it is the maintenance of biological information that ceases. The matter remains.
Biological systems are different from purely physical systems primarily because they create, store, and communicate information. Living things store information in a memory of the past that they use to shape their future. Fundamental physical objects like atoms have no history.
And when human beings export some of their personal information to make it a part of human culture, that information moves closer to becoming immortal.
Human beings differ from other animals in their extraordinary ability to communicate information and store it in external artifacts. In the last decade the amount of external information per person may have grown to exceed an individual's purely biological information.
Since the 1950's, the science of human behavior has changed dramatically from a "black box" model of a mind that started out as a "blank slate" conditioned by environmental stimuli. Today's mind model contains many "functions" implemented with stored programs, all of them information structures in the brain. The new "computational model" of cognitive science likens the brain to a computer, with some programs and data inherited and others developed as appropriate reactions to experience.
The Experience Recorder and Reproducer
The brain should be regarded less as an algorithmic computer, with one or more central processing units addressing multiple data storage systems, than as a multi-channel and multi-track experience recorder and reproducer with an extremely high data rate. Information about an experience - the sights, sounds, smells, touch, and taste - is recorded along with the emotions - feelings of pleasure, pain, hopes, and fears - that accompany the experience. When confronted with similar experiences later, the brain can reproduce information about the original experience (an instant replay) that helps to guide current actions.
The ERR model stands in contrast to the popular cognitive science or “computational” model of a mind as a digital computer. No algorithms, data addressing schemes, or stored programs are needed for the ERR model.
The physical metaphor is a non-linear random-access data recorder, where data is stored using content-addressable memory (the memory address is the data content itself). Simpler than a computer with stored algorithms, a better technological metaphor might be a video and sound recorder, enhanced with the ability to record - and replay - smells, tastes, touches, and critically essential, feelings.
The biological model is neurons that wire together during an organism’s experiences, in multiple sensory and limbic systems, such that later firing of even a part of the wired neurons can stimulate firing of all or part of the original complex.
A conscious being is constantly recording information about its perceptions of the external world, and most importantly for ERR, it is simultaneously recording its feelings. Sensory data such as sights, sounds, smells, tastes, and tactile sensations are recorded in a sequence along with pleasure and pain states, fear and comfort levels, etc.
All these experiential and emotional data are recorded in association with one another. This means that when the experiences are reproduced (played back in a temporal sequence), the accompanying emotions are once again felt, in synchronization.
The ability to reproduce an experience is critical to learning from past experiences, so as to make them guides for action in future experiences. The ERR model is the minimal mind model that provides for such learning by living organisms.
The ERR model does not need computer-like decision algorithms to reproduce past experiences. All that is required is that past experiences “play back” whenever they are stimulated by present experiences that resemble the past experiences in one or more ways.
Where neuroscientists have shown that "neurons that fire together wire together," the ERR model of information philosophy is simply "neurons that have been wired together will fire together."
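As a data structure, this is content-addressable storage: experiences are recorded whole, and any partial match with a present stimulus replays the complete record, feelings included. A toy sketch (all names and data here are hypothetical illustrations, not a neural model):

```python
# Toy experience recorder and reproducer: content-addressable playback.
# An "experience" is stored whole; any overlapping cue replays all of it.
experiences = []

def record(percepts, feeling):
    """Wire together: store percepts and the accompanying feeling as one record."""
    experiences.append((frozenset(percepts), feeling))

def reproduce(cue):
    """Fire together: replay every record sharing any element with the cue."""
    cue = set(cue)
    return [(sorted(p), f) for p, f in experiences if p & cue]

record({"dog", "bark", "park"}, feeling="fear")
record({"dog", "fetch", "yard"}, feeling="joy")

# A partial cue ("dog") replays both full experiences, emotions included.
print(reproduce({"dog"}))
```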
Neuroscientists and philosophers of mind have long asked how diverse signals from multiple locations in the brain over multiple pathways appear so unified in the brain. The ERR model offers a simple solution to this “binding” problem. Experiences are bound at their initial recording. They do not have to be re-associated by some central processing unit looking up where experiences may have been distributed among the various sensory or memory areas.
The ERR model may also throw some light on the problem of "qualia" and of "what it's like to be" a particular organism.
Information Philosophy and Modern Philosophy
Modern philosophy is a story about discovery of timeless truths, laws of nature, a block universe in which the future is a logical extension of the past, a primal moment of creation that starts a causal chain in which everything can be foreknown by an omniscient being. Modern philosophy seeks knowledge in logical reasoning with clear and unchanging concepts.
Its guiding lights are thinkers like Parmenides, Plato, and Kant, who sought unity and identity, being and universals.
Tradition, Modern, and Postmodern
In a traditional society, authoritative knowledge is that which has been handed down. Moderns are those who think that all knowledge must be based on reason. Postmoderns recognize that much knowledge has been invented, arbitrarily created.
In modern philosophy, the total amount of information in the conceptually closed universe is static, a physical constant of nature. The laws of nature allow no exceptions, they are perfectly causal. Everything that happens is said to have a physical cause. This is called "causal closure". Chance and change - in a deep philosophical sense - are said to be illusions. Every event must have a cause, a reason.
Information philosophy, by contrast, is a story about invention, about novelty, about biological emergence and new beginnings unseen and unseeable beforehand, a past that is fixed but an ambiguous future that can be shaped by teleonomic changes in the present.
Its model thinkers are Heraclitus, Protagoras, Aristotle, and Hegel, for whom time, place, and particular situations mattered.
Information philosophy is built on probabilistic laws of nature. The fundamental challenge for information philosophy is to explain the emergence of stable information structures from primordial and ever-present chaos, to account for the phenomenal success of deterministic laws when the material substrate of the universe is irreducibly chaotic, noisy, and random, and to understand the concepts of truth, necessity, and certainty in a universe of chance, contingency, and indeterminacy.
Determinism and the exceptionless causal and deterministic laws of classical physics are the real illusions. Determinism is information-preserving. In an ideal deterministic Laplacian universe, the present state of the universe is implicitly contained in its earliest moments.
This ideal determinism does not exist. The "adequate determinism" behind the laws of nature emerged from the early years of the universe when there was only indeterministic chaos.
In a random noisy environment, how can anything be regular and appear determined? It is because the macroscopic consequences of the law of large numbers average out microscopic quantum fluctuations to provide us with a very adequate determinism.
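A three-line simulation shows the averaging at work (NumPy assumed): the relative fluctuation of the mean of N independent random events falls off as 1/√N, which is the statistical root of adequate determinism.

```python
import numpy as np

# Law of large numbers: relative fluctuations of N averaged microscopic
# events shrink as 1 / sqrt(N), yielding "adequate determinism" at large N.
rng = np.random.default_rng(0)
for N in [100, 10_000, 1_000_000]:
    events = rng.random(N)                      # N independent random events
    rel_fluct = events.std() / (np.sqrt(N) * events.mean())
    print(f"N = {N:>9,}   relative fluctuation of the mean ~ {rel_fluct:.1e}")
```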
Information Philosophy is an account of continuous information creation, a story about the origin and evolution of the universe, of life, and of intelligence from an original quantal chaos that is still present in the microcosmos. More than anything else, it is the creation and maintenance of stable information structures, despite the destructive entropic requirements of the second law of thermodynamics, that distinguishes biology from physics and chemistry.
Living things maintain information in a memory of the past that they can use to shape the future. The "meaning" in the information is their use of it. Some get their information "built-in" via heredity. Some learn it from experience. Others invent it!
Ancient philosophy (before the advent of medieval theology with Thomas Aquinas and John Duns Scotus) and medieval philosophy (before the beginning of modern philosophy with René Descartes) covered the same wide range of questions now addressable by information philosophy.
The Development of Information Philosophy
Our earliest work on information philosophy dates from the 1950's, based on suggestions made thirty years earlier by Arthur Stanley Eddington. In his 1928 Nature of the Physical World, Eddington argued that quantum indeterminacy had "opened the door of human freedom," and that the second law of thermodynamics might have some bearing on the question of objective good.
In the 1950's, we studied the then leading philosophies of positivism and existentialism.
Bertrand Russell, with the help of G. E. Moore, Alfred North Whitehead, and Ludwig Wittgenstein, proposed logic and language as the proper foundational basis, not only of philosophy, but also of mathematics and science. Wittgenstein's Tractatus imagined that a set of all true propositions could capture all the knowledge of modern science.
4.11 The totality of true propositions is the whole of natural science
(or the whole corpus of the natural sciences)
Their logical positivism and the variation called logical empiricism developed by Rudolf Carnap and the Vienna Circle proved to be failures in grounding philosophy, mathematics, or science.
On the continent, existentialism was the rage. We read Friedrich Nietzsche, Martin Heidegger, and Jean-Paul Sartre.
The existentialist continentals argued that freedom exists, but there are no objective values. The utilitarian English argued that values exist, but human freedom does not.
We wrote that "Values without freedom are useless. Freedom without values is absurd."
This was a chiasmus like the great figure of Immanuel Kant, rephrased by Charles Sanders Peirce as "Idealism without Materialism is Empty. Materialism without Idealism is Blind."
In the 1960's, we formulated arguments that cited "pockets of low entropy," in apparent violation of the second law, as the possible basis for anything with objective value. We puzzled over the origin of "negative entropy," since the universe was believed to have started in thermodynamic equilibrium and the second law of thermodynamics says that (positive) entropy can only increase.
In the late 1960's, we developed a two-stage model of free will and called it Cogito, a term often associated with the mind and with thought.
With deference to Descartes, the first modern philosopher, we called "negative entropy" Ergo. While thermodynamics calls it "negative," information philosophy sees it as the ultimate "positive" and deserving of a better name. We thought that Ergo etymologically suggests a fundamental kind of energy ("erg" zero), e.g., the "Gibbs free energy," G0, that is available to do work because it has low entropy.
In the early 70's, we decided to call the sum of human knowledge the Sum, to complete the triple wordplay on Descartes' proof of his existence.
We saw a great battle going on in the universe - between originary chaos and emergent cosmos. The struggle pits destructive chaotic processes, which drive a microscopic underworld of random events, against constructive cosmic processes, which create information structures with extraordinary emergent properties that include adequately determined scientific laws - despite, and in many cases making use of, the microscopic chaos.
Since the destructive chaos is entropic, we repurposed a term from statistical mechanics and called the anti-entropic processes creating information structures ergodic. The embedded Ergod resonated.
Created information structures range from galaxies, stars, and planets, to molecules, atoms, and subatomic particles. They are the structures of terrestrial life from viruses and bacteria to sentient and intelligent beings. And they are the constructed ideal world of thought, of intellect, of spirit, including the laws of nature, in which we humans play a role as co-creator.
Information is constant in a deterministic universe. There is "nothing new under the sun." The creation of new information is not possible without the random chance and uncertainty of quantum mechanics, plus the extraordinary temporal stability of quantum mechanical structures.
It is of the deepest philosophical significance that information is based on the mathematics of probability. If all outcomes were certain, there would be no "surprises" in the universe. Information would be conserved and a universal constant, as some mathematicians mistakenly believe. Information philosophy requires the ontological uncertainty and probabilistic outcomes of modern quantum physics to produce new information.
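Shannon made "surprise" quantitative. In his standard definitions (a math sketch, not original to this page):

```latex
% Shannon's surprisal of an outcome and entropy of a distribution
I(x) = -\log_2 p(x), \qquad H = -\sum_i p_i \log_2 p_i
% If an outcome is certain (p = 1), then I = 0 and H = 0:
% a fully deterministic universe generates no new information.
```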
But at the same time, without the extraordinary stability of quantized information structures over cosmological time scales, life and the universe we know would not be possible. That stability is the consequence of an underlying digital nature. Quantum mechanics reveals the architecture of the universe to be discrete rather than continuous, to be digital rather than analog. Digital information transfers are essentially perfect. All analog transfers are "lossy."
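The contrast can be simulated in a few lines (a hedged toy model, not a claim about any particular medium): each copy generation adds small noise; the digital chain re-quantizes at every copy and so remains exact, while the analog chain drifts without bound.

```python
import random

# Digital vs. analog copying: noise accumulates in analog transfers,
# but re-quantizing at each digital copy restores the original bits.
random.seed(1)
signal = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0]
analog, digital = signal[:], signal[:]

for generation in range(1000):
    noise = [random.gauss(0, 0.02) for _ in signal]
    analog = [a + n for a, n in zip(analog, noise)]            # errors accumulate
    digital = [round(d + n) for d, n in zip(digital, noise)]   # re-quantized each copy

print("analog after 1000 copies: ", [f"{a:.2f}" for a in analog])
print("digital after 1000 copies:", digital)   # still exactly the original bits
```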
Moreover, the "correspondence principle" of quantum mechanics and the "law of large numbers" of statistics ensure that macroscopic objects can normally average out microscopic uncertainties and probabilities to provide the "adequate determinism" that shows up in all our "Laws of Nature."
Information philosophy explores some classical problems in philosophy with deeper and more fundamental insights than is possible with the logic and language approach of modern analytic philosophy.
By exploring the origins and evolution of structure in the universe, information philosophy transcends humanity and even life itself, though it is not a mystical metaphysical transcendence.
Information philosophy uncovers the creative process working in the universe to which we owe our existence, and therefore perhaps our reverence for its "providence".
Information philosophy locates the fundamental source of all values not in humanity ("man the measure"), not in bioethics ("life the ultimate good"), but in the origin and evolution of information in the cosmos.
Information philosophy is an idealistic philosophy, a process philosophy, and a systematic philosophy, the first in many decades. It provides important new insights into the Kantian transcendental problems of epistemology, ethics, freedom of the will, god, and immortality, as well as the mind-body problem, consciousness, and the problem of evil.
In physics, information philosophy (or information physics) provides new insights into the problem of measurement, the paradox of Schrödinger's Cat, the two paradoxes of microscopic reversibility and macroscopic recurrence that Josef Loschmidt and Ernst Zermelo used to criticize Ludwig Boltzmann's explanation of the entropy increase required by the second law of thermodynamics, and finally information provides a better understanding of the entanglement and nonlocality phenomena that are the basis for modern quantum cryptography and quantum computing.
Finally, a new philosophy of biology should be based on the deep understanding of organisms as information users, information creators, information communicators, and at the higher levels, information processors, including humans, who have learned to store information externally and transfer it between generations culturally. Except for organisms that extract negative entropy (free or available energy) from sunlight by photosynthesis, most living things destroy other cells to extract the information needed to maintain their own low-entropy state of organization. Most life feeds on other life.
And most life communicates with other life. Even single cells, before the emergence of multicellular organisms, developed communication systems between the cells that are still visible in slime molds and social amoebae today. In a multicellular organism, every cell has some level of communication with all the others. Most higher level organisms share communal information that makes them stronger as a social group than as independent individuals. The sum of human knowledge has amplified the power of humanity, for better or worse, to a level that can control the environmental conditions on all of planet Earth.
Information biology is the hypothesis that all biological evolution should be viewed primarily as the development of more and more powerful users, creators, and communicators of information. Seen through the lens of information, humans are the current end product of information processing systems. With the emergence of life, purpose (telos) appeared in the universe. The teleonomic goal of each cell is to become two cells, which replicates its information content. The purpose of each species is to improve its reproductive success relative to other populations. The purpose of human populations then is to use, to add to, and to communicate human knowledge in order to maximize the human capital per person.
Like love, the information that is shared by educating others is not used up. Information is not a scarce economic good. The more that information is communicated, the more of it there is, in human minds (not brains), and in the external stores of knowledge. These are books of course, but in the future they will be the interconnected knowledge bases of the world wide web, including www.informationphilosopher.com.
The first thing we must do for the young is to teach them how to teach themselves by accessing these knowledge systems with handheld devices that will some day be available for all the world's children, beyond one laptop per child to one smartphone per child.
Based on insights into the discovery of the cosmic creation process, the Information Philosopher proposes three primary ideas that are new approaches to perennial problems in philosophy. They are likely to change some well-established philosophical positions. Even more important, they may reconcile idealism and materialism and provide a new view of how humanity fits into the universe.
The three ideas are
An explanation or epistemological model of knowledge formation and communication. Knowledge and information are neither matter nor energy, but they require matter for expression and energy for communication. They seem to be metaphysical.
Briefly, we identify knowledge with actionable information in the brain-mind. We justify knowledge by behavioral studies that demonstrate the existence of information structures implementing functions in the brain. And we verify knowledge scientifically.
A basis for objective value, a metaethics beyond humanism and bioethics, grounded in the fundamental information creation processes behind the structure and evolution of the universe and the emergence of life.
Briefly, we find positive value (or good) in information structures. We see negative value (or evil) in disorder and entropy tearing down such structures. We call energy with low entropy "Ergo" and call anti-entropic processes "ergodic." We recognize that "ergodic" is itself too esoteric and thus not likely to be widely accepted. Perhaps the most positive term for what we value is just "information" itself!
Our first categorical imperative is then "act in such a way as to create, maintain, and preserve information as much as possible against destructive entropic processes."
Our second ethical imperative is "share knowledge/information to the maximum extent." Like love, our own information is not diminished when we share it with others.
Our third moral imperative is "educate (share the knowledge of what is right) rather than punish." Knowledge is virtue. Punishment wastes human capital and provokes revenge.
A solution to the classic problem of free will.
Briefly, we separate "free" and "will" in a two-stage process - first the free generation of alternative possibilities for action (which creates new information), then an adequately determined decision by the will. We call this two-stage view our Cogito model and trace the idea of a two-stage model in the work of two dozen thinkers back to William James in 1884.
This model is a synthesis of adequate determinism and limited indeterminism, a coherent and complete compatibilism that reconciles free will with both determinism and indeterminism.
David Hume thought he had reconciled freedom with determinism. We reconcile free will with indeterminism and an "adequate" determinism.
Because it makes free will compatible with both a form of determinism (really determination) and with an indeterminism that is limited and controlled by the mind, the leading libertarian philosopher Bob Kane suggested we call this model "Comprehensive Compatibilism."
The problem of free will cannot be solved by logic, language, or even by physics. Man is not a machine and the mind is not a computer. Free will is a property of a biophysical information processing system.
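How such a two-stage process might run can be sketched as a generate-then-choose loop (a toy sketch; the alternatives, evaluation function, and all names here are hypothetical):

```python
import random

def cogito_decision(situation, generate, evaluate, n_alternatives=5):
    """Two-stage model: free generation, then adequately determined choice."""
    # Stage 1 ("free"): indeterministic generation of possibilities.
    alternatives = [generate(situation) for _ in range(n_alternatives)]
    # Stage 2 ("will"): deterministic evaluation by the agent's own values.
    return max(alternatives, key=evaluate)

# Hypothetical example: choosing a route, valuing shorter distances.
routes = {"river path": 4.0, "main road": 3.2, "forest trail": 5.1}
choice = cogito_decision(
    situation=routes,
    generate=lambda s: random.choice(list(s)),   # chance proposes
    evaluate=lambda r: -routes[r],               # values dispose
)
print("decided:", choice)
```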
All three ideas depend on understanding modern cosmology, physics, biology, and neuroscience, but especially the intimate connection between quantum mechanics and the second law of thermodynamics that allows for the creation of new information structures.
All three are based on the theory of information, which alone can establish the existential status of ideas, not just the ideas of knowledge, value, and freedom, but other-worldly speculations in natural religion like God and immortality.
All three have been anticipated by earlier thinkers, but can now be defended on strong empirical grounds. Our goal is less to innovate than to reach the best possible consensus among philosophers living and dead, an intersubjective agreement between philosophers that is the surest sign of a knowledge advance.
This Information Philosopher website aims to be an open resource for the best thinking of philosophers and scientists on these three key ideas and a number of lesser ideas that remain challenging problems in philosophy - on which information philosophy can shed some light.
Among these are the mind-body problem (the mind can be seen as the realm of information in its free thoughts, the body an adequately determined biological system creating and maintaining information); the common sense intuition of a cosmic creative process often anthropomorphized as a God or divine Providence; the problem of evil (chaotic entropic forces are the devil incarnate); and the "hard problem" of consciousness (agents responding to their environment, and originating new causal chains, based on information processing).
Philosophy is the love of knowledge or wisdom. Information philosophy (I-Phi or ΙΦ) qualifies and quantifies knowledge as meaningful actionable information. Information philosophy reifies information as an immaterial entity that has causal power over the material world!
What is information that merits its use as the foundation of a new method of inquiry?
Abstract information is neither matter nor energy, yet it needs matter for its concrete embodiment and energy for its communication. Information is the modern spirit, the ghost in the machine. It is the stuff of thought, the immaterial substance of philosophy.
Information is a powerful diagnostic tool. It is a better abstract basis for philosophy, and for science as well, especially physics, biology, and neuroscience. It is capable of answering questions about metaphysics (the ontology of things themselves), epistemology (the existential status of ideas and how we know them), and idealism itself.
Information philosophy is now more than the solution to three fundamental problems we identified in the 1960's and '70's. I-Phi is a new philosophical method, capable of solving multiple problems in both philosophy and physics. It needs young practitioners, presently tackling some problem, who might investigate the problem using this new methodology.
Note that, just as the philosophy of language is not linguistic philosophy, information philosophy is not the philosophy of information, which is mostly about computers and cognitive science, the computational theory of mind.
Philosophers like Ludwig Wittgenstein labeled many of our problems “philosophical puzzles.” Bertrand Russell called them “pseudo-problems.” Analytic language philosophers thought many of these problems could be “dis-solved,” revealing them to be conceptual errors caused by the misuse of language.
Information philosophy takes us past logical puzzles and language games, not by diminishing philosophy and replacing it with science.
Russell insisted that
“questions which are already capable of definite answers are placed in the sciences, while those only to which, at present, no definite answer can be given, remain to form the residue which is called philosophy.”
(The Problems of Philosophy, 1912, p.155)
Information philosophy aims to show that problems in philosophy should not be reduced to “Russell’s Residue.”
The language philosophers of the twentieth century thought that they could solve (or at least dis-solve) the classical problems of philosophy. They did not succeed. Information philosophy, by comparison, has now cast a great deal of light on some of those problems. It needs more information philosophers to join us to make more progress.
To recap, when information is stored in any structure, two fundamental physical processes occur. First is a "collapse" of a quantum mechanical wave function, reducing multiple possibilities to a single actuality. Second is a local decrease in the entropy corresponding to the increase in information. Entropy greater than that must be transferred away from the new information structure to satisfy the second law of thermodynamics.
These quantum level processes are susceptible to noise. Information stored may have errors. When information is retrieved, it is again susceptible to noise. This may garble the information content. In information science, noise is generally the enemy of information. But some noise is the friend of freedom, since it is the source of novelty, of creativity and invention, and of variation in the biological gene pool.
Biological systems have maintained and increased their invariant information content over billions of generations, coming as close to immortality as living things can. Philosophers and scientists have increased our knowledge of the external world, despite logical, mathematical, and physical uncertainty. They have created and externalized information (knowledge) that can in principle become immortal. Both life and mind create information in the face of noise. Both do it with sophisticated error detection and correction schemes. The scheme we use to correct human knowledge is science, a two-stage combination of freely invented theories and adequately determined experiments. Information philosophy follows that example.
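The simplest such scheme shows the principle. A triple-repetition code with majority-vote decoding (a toy sketch standing in for the far more sophisticated codes used by cells and by engineers):

```python
import random

def encode(bits):
    """Repetition code: send each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each triple corrects any single flipped copy."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

random.seed(3)
message = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = [b ^ (random.random() < 0.05) for b in encode(message)]  # 5% bit flips
print("message:", message)
print("decoded:", decode(noisy))  # correct unless two copies of one bit were hit
```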
If you have read this far, you probably already know that the Information Philosopher website itself is an exercise in information sharing. It has seven parts, each with multiple chapters. Navigation at the bottom of each page will take you to the next or previous part or chapter.
Teacher and Scholar links display additional material and reveal hidden footnotes on some pages.
Our goal is for the website to contain all the great philosophical discussions of the three original problem areas we identified in the 1970's - COGITO (freedom), ERGO (value), and SUM (knowledge) - plus potential solutions for several classic problems in philosophy and physics, many of which had been designated "pseudo-problems" or relegated to "metaphysics."
We have now shown that information philosophy is a powerful diagnostic tool for addressing metaphysical problems. See The Metaphysicist.
In the left-hand column of all I-Phi pages are links to nearly three hundred philosophers and scientists who have made contributions to these great problems. Their web pages include the original contributions of each thinker, with examples of their thought, usually in their own words, and where possible in their original languages as well.
All original content on Information Philosopher is available for your use, without requesting permission, under a Creative Commons Attribution License.
Copyrights for all excerpted and quoted works remain with their authors and publishers.
A web page may contain two extra levels of material. The Normal page is material for newcomers and students of information philosophy. Two hidden levels contain material for teachers (e.g., secondary sources) and for scholars (e.g., footnotes and original-language quotations).
Teacher materials on a page will typically include references to secondary sources and more extended explanations of the concepts and arguments. Secondary sources will include books, articles, and online resources. Extended explanations should be more suitable for teaching others about the core philosophical ideas, as seen from an information perspective.
For Scholars
Scholarly materials will generally include more primary sources, more in-depth technical and scientific discussions where appropriate, original language versions of quotations, and references to all sources.
Footnotes for a page appear in the Scholar materials. The footnote indicators themselves are only visible in Scholar mode.