"The question now is, how does it really work? What machinery is actually producing this thing? Nobody knows any machinery."As I see it, the one mystery is how quantum waves of abstract and immaterial information can cause the motions and create the properties of concrete particles. An information explanation of the cosmic creation process shows how the expansion of the universe opened up new possibilities for different possible futures. My work is based on suggestions made by Arthur Stanley Eddington in the 1930's and by my Harvard colleague David Layzer in the 1970's. Information philosophy has shown that novelty in the universe ("something new under the sun") requires a temporal process that depends first on the existence of new possibilities and then second on the selection or choice of one possibility. The first step also decreases the entropy locally, requiring a compensating increase globally to satisfy the second law of thermodynamics. These two steps or two stages explain not only our fundamental cosmological question, but also two other great problems in science and in philosophy, the two-step process of biological evolution and the two-stage model of freedom of the human will. Finally, Claude Shannon's theory of the communication of information involves these two steps or stages (see the Shannon principle). The amount of information communicated (or created in our cases) depends on the number of possible messages. With eight possible messages, Shannon says one actual message communicates three bits of information (23 = 8). Multiple possible messages correspond to multiple possible futures. If there is only one possibility, there is only one possible future. Some scientists (e.g., Seth Lloyd) think that the total information in the universe is a conserved quantity, just like the conservation of energy and matter. The "block universe" of special relativity and four-dimensional space-time is interpreted by some as the one possible future that is "already out there." This flawed idea of a fixed amount of information in the universe supports the idea of Laplace's demon, a super-intelligent being who knows the positions and velocities of all the particles in the universe, one who could use Newton's laws of classical mechanics to know all the past and future of the universe. Such a universe is known as deterministic, pre-determined by the information at the start of the universe, or pre-ordained by an agent who created the universe. This conservation of total information since time zero also supports the much older idea of an omniscient and omnipotent God with foreknowledge of the future, which threatens the idea of human free will. Logically speaking, a god can not be both omniscient and omnipotent. In our study of how Albert Einstein invented most of quantum mechanics a decade before Werner Heisenberg, we showed that Einstein saw the existence of ontological chance whenever electromagnetic radiation interacts with atoms and molecules. This means that many future events, like Aristotle's famous "sea battle," are irreducibly contingent. Future events cannot be known until that future time when they either do or do not occur. The statement "the sea-battle will occur tomorrow" is neither true nor false, challenging Aristotle's bivalent logic and the "excluded middle." A contingent future means an omniscient being can not exist. The indeterminism of quantum mechanics invalidates the idea of physical determinism as well as the idea of an omniscient being. 
Our work on free will limits indeterminism to the first "free" stage, where it helps to generate alternative possibilities (new thoughts), and our model requires an adequate determinism in the second "will" stage, to ensure that our actions are caused by our motives, desires, and feelings. First "free", then "will." The great scientist and philosopher of biology Ernst Mayr described evolution as a "two-step process" involving chance.
Evolutionary change in every generation is a two-step process: the production of genetically unique new individuals and the selection of the progenitors of the next generation. The important role of chance at the first step, the production of variability, is universally acknowledged, but the second step, natural selection, is on the whole viewed rather deterministically: Selection is a non-chance process.
And I can easily show...that as a matter of fact the new conceptions, emotions, and active tendencies which evolve are originally produced in the shape of random images, fancies, accidental out-births of spontaneous variation in the functional activity of the excessively instable human brain, which the outer environment simply confirms or refutes, adopts or rejects, preserves or destroys, - selects, in short, just as it selects morphological and social variations due to molecular accidents of an analogous sort.

In my 2011 book Free Will, I report over two dozen other philosophers and scientists who independently invented this two-stage model of free will, both before and after my own independent idea while a graduate student at Harvard in the 1970's.
"a molecule of total spin zero consisting of two atoms, each of spin one-half. The two atoms are then separated by a method that does not influence the total spin. After they have separated enough so that they cease to interact, any desired component of the spin of the first particle (A) is measured. Then, because the total spin is still zero, it can immediately be concluded that the same component of the spin of the other particle (B) is opposite to that of A."An experimental apparatus to realize Bohm's proposal emits entangled particles in opposite directions. The particles could be Bohm's atoms, or simply electrons heading toward Stern-Gerlach devices that measure electron spins as up or down. The spin of each particle can be up or down, so a measurement provides a single bit of information, 1 or 0. We traditionally give the experimenters measuring particles A and B the names Alice and Bob. When Alice observes an up particle, she gets the bit 1. Instantly, Bob observes a down particle and gets the bit 0. These are well-established experimental facts. Once the two particles have separated to a great distance, the naive theory is that one bit of information about Alice's up particle must have been "communicated" to Bob's particle so it can quickly adjust its spin to down. Or more simply, that a "hidden variable" travels at faster than light speed to "act" on particle B, causing it to have spin down. These theories are flawed, leading to claims that entanglement "connects" everything instantly in a "holistic" universe. My Ph.D. thesis at Harvard in the 1960's solved the Schrödinger equation for the wave function Ψ12 of a two-atom hydrogen molecule, exactly what David Bohm proposed in the 1950's to explain "hidden variables." This provided me with a great insight into entanglement. Before a measurement, quantum objects like hydrogen atoms, electrons, or photons have two possible spin states. A measurement makes one of the two possible states an actual state. The founders of quantum mechanics described this process in terms that remain controversial a century later. Werner Heisenberg said the new state is indeterministic and comes randomly into "existence." Einstein hoped for a return to classical deterministic physics, in which objects have a "real" existence before we measure them. The quantum mechanical wave function describing a single-particle spin is a linear combination (also called a superposition) of an up state [
CAT |
|
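Here is a minimal simulation of the perfect anti-correlation that Alice and Bob observe (my own toy sketch; it reproduces only the measured statistics and says nothing about any mechanism or interpretation):

```python
import random

def measure_entangled_pair() -> tuple[int, int]:
    """One spin-zero pair: Alice's outcome is irreducibly random (up = 1, down = 0);
    Bob's outcome is always the opposite, because the total spin remains zero."""
    alice = random.randint(0, 1)
    bob = 1 - alice
    return alice, bob

results = [measure_entangled_pair() for _ in range(10)]
print(results)
print(all(a != b for a, b in results))  # True: the two bits are always opposite
```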
"as soon as definite knowledge concerning any subject becomes possible, this subject ceases to be called philosophy, and becomes a separate science...while those only to which, at present, no definite answer can be given, remain to form the residue which is called philosophy."This information philosopher thinks not. In order for problems to remain to remain philosophical, interested philosophers should consider our proposed information-based solutions as part of the philosophical dialogue. Among the proposed solutions to classic philosophical problems are:
A common definition of information is the act of informing - the communication of knowledge from a sender to a receiver that informs (literally shapes) the receiver. Often used as a synonym for knowledge, information traditionally implies that the sender and receiver are human beings, but many animals clearly communicate. Information theory studies the communication of information. Information philosophy extends that study to the communication of information content between material objects, including how it is changed by energetic interactions with the rest of the universe. We call a material object with information content an information structure.

While information is communicated between inanimate objects, they do not process information, which we will show is the defining characteristic of living beings and their artifacts. The sender of information need not be a person, an animal, or even a living thing. It might be a purely material object - a rainbow, for example, sending color information to your eye. The receiver, too, might be merely physical: a molecule of water in that rainbow that receives too few photons and cools to join the formation of a crystal snowflake, increasing its information content.

Information theory, the mathematical theory of the communication of information, says little about meaning in a message, which is roughly the use to which the information received is put. Information philosophy extends the information flows in human communications systems and digital computers to the natural information carried in the energy and material flows between all the information structures in the observable universe.

A message that is certain to tell you something you already know contains no new information. It does not increase your knowledge, or reduce the uncertainty in what you know, as information theorists put it. If everything that happens was certain to happen, as determinist philosophers claim, no new information would ever enter the universe. Information would be a universal constant. There would be "nothing new under the sun." Every past and future event could in principle be known by a god-like super-intelligence with access to that fixed totality of information (Laplace's Demon).

Physics tells us that the total amount of mass and energy in the universe is a constant. The conservation of mass and energy is a fundamental law of nature. Some mathematical physicists - including leading ones - erroneously think that information should also be a conserved quantity, a constant of nature. But information is neither matter nor energy, though it needs matter to be embodied and available energy to be communicated. Information can be created and destroyed. The material universe creates it. The biological world creates it and utilizes it. Above all, human minds create, process, and preserve abstract information - the sum of human knowledge that distinguishes humanity from all other biological species and that provides the extraordinary power humans have over our planet, for better or for worse.

Information is the modern spirit, the ghost in the machine, the mind in the body. It is the soul, and when we die, it is our information that perishes. The matter remains. We propose information as an objective value, the ultimate sine qua non. Information philosophy claims that man is not a machine and the brain is not a computer.
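To make an earlier claim quantitative - that a message you were already certain of carries no new information - here is a standard Shannon-style calculation (my own illustration, not from the text):

```python
import math

def surprisal_bits(probability: float) -> float:
    """Information content, in bits, of receiving a message that had the given probability."""
    return -math.log2(probability)

print(surprisal_bits(1.0))    # 0.0 bits: a certain message tells you nothing new
print(surprisal_bits(0.125))  # 3.0 bits: one of eight equally likely messages
```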
Living things process information in ways far more complex, if not faster, than the most powerful information processing machines. What biological systems and computing systems have in common is the processing of information, as we must explain. Whereas machines are assembled, living things assemble themselves. Both are information structures, patterns through which matter and energy flow, thanks to flows of negative entropy (available energy) coming from the Sun and the expanding universe. And both can create new information, build new structures, and maintain their integrity against the destructive influence of the second law of thermodynamics, with its increasing positive entropy or disorder.

Biological evolution began when the first molecule replicated itself, that is, duplicated the information it contained. But duplication is mere copying. Biological reproduction is a much more sophisticated process in which the germ or seed information of a new living thing is encoded in a data or information structure (a genetic code) that can be communicated to processing systems that produce another instance of the given genus and species. Ontologically random imperfections in the processing systems, along with the deliberate introduction of random noise (for example, in sexual recombination), produce the variations that are selected by evolution based on their reproductive success. Errors are not restricted to the genetic code; they occur throughout the development of each individual up to the present.

Cultural evolution is the creation and communication of new information that adds to the sum of human knowledge. The creation and evolution of information processing systems in the universe has culminated in minds that can understand and reflect on what we call the cosmic creation process.
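A toy sketch of this two-step process - chance variation in copying, then non-chance selection by reproductive success - using a made-up "environment" and mutation rate of my own (illustration only):

```python
import random

TARGET = "INFORMATION"                       # hypothetical environment that selection favors
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def replicate_with_errors(genome: str, error_rate: float = 0.05) -> str:
    """Step 1: duplication with random copying errors."""
    return "".join(random.choice(ALPHABET) if random.random() < error_rate else c
                   for c in genome)

def fitness(genome: str) -> int:
    """Step 2: selection is a non-chance process (here, letters matching the target)."""
    return sum(a == b for a, b in zip(genome, TARGET))

population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(50)]
for _ in range(300):
    offspring = [replicate_with_errors(random.choice(population)) for _ in range(200)]
    population = sorted(offspring, key=fitness, reverse=True)[:50]   # select the progenitors

print(population[0])  # typically converges on the target: new information has accumulated
```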
How is information created?

Ex nihilo, nihil fit, said the ancients: nothing comes from nothing. But information is no (material) thing. Information is physical, but it is not material. Information is a property of matter; it is the form that matter can take. We can thus create something (immaterial) from nothing! But we shall find that it takes a special kind of energy (free or available energy, with negative entropy) to do so, because it involves the rearrangement of matter.

Energy transfer to or from an object increases or decreases the heat in the object. Entropy transfer does not change the heat content; it represents only a different organization or distribution of the matter in the body. Increasing entropy represents a loss of organization or order - more precisely, a loss of information. Maximum entropy is maximum disorder and minimal information.

As you read this sentence, new information is (we hope) being encoded/embodied in your mind/brain. Permanent changes in the synapses between your neurons store the new information. New synapses are made possible by free energy and material flows in your metabolic system, a tiny part of the negative entropy flows that are coursing throughout the universe. Information philosophy will show you how these tiny mental flows allow you to comprehend and control at least part of the cosmic information flows in the universe.

Cosmologists know that information is being created because the universe began some thirteen billion years ago in a state of minimal information. The "Big Bang" started with the most elementary particles and radiation. How matter formed into information structures - first elementary particles, then atoms, then the galaxies, stars, and planets - is the beginning of a story that ends with human minds emerging to understand our place in the universe.

The relation between matter and information is straightforward. The embodied information is the organization or arrangement of the matter, plus the laws of nature that describe the motions of matter in terms of the fundamental forces that act between all material particles. The relation between information and energy is more complex, and has led to confusion about how to apply mathematical information theory to the physical and biological sciences.

Material systems in an equilibrium state are maximally disordered, have maximum entropy, no negative entropy, and no information other than the bulk parameters of the system. In the case of the universe, the initial parameters were very few: the amount of radiant energy (the temperature), the number of elementary particles (quarks, gluons, electrons, and photons) per unit volume, and the total volume (infinite?). These parameters, and their changes as a function of time as the temperature falls, are all the information needed to describe a statistically uniform, isotropic universe and its evolution.

Information philosophy will explain the process of information creation in three fundamental realms - the purely material, the biological, and the mental. The first information creation was a kind of "order out of chaos," when matter in the early universe opened up spaces allowing gravitational attraction to condense otherwise randomly distributed matter into highly organized galaxies, stars, and planets.
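One standard way to quantify the relation sketched above between entropy and information is Boltzmann's entropy together with the convention (due to Brillouin) of treating information as negative entropy. This is my own summary, not a formula from the text; S is the thermodynamic entropy, W the number of microscopic arrangements compatible with the macrostate, and I the information (negentropy):

```latex
\[
S = k_B \ln W , \qquad I = S_{\max} - S .
\]
```

At thermodynamic equilibrium S equals S_max and I is zero: maximum disorder carries no information beyond the bulk parameters, while a highly ordered arrangement has low entropy and correspondingly high information.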
It was the expansion - the increasing space between material objects - that drove the universe away from thermodynamic equilibrium (maximum entropy and disorder) and in some places created negative entropy, a quantitative measure of orderly arrangements that is the basis for all information. Purely material objects react to one another following laws of nature, but in an important sense they do not create or process the information that they contain. It was the expansion, moving faster than the re-equilibration time, and the four natural forces, especially gravitation, that were responsible for the new structures.

A qualitatively different kind of information creation came when the first molecule on earth to replicate itself went on to duplicate its information exponentially. Here the prototype of life was the cause of the creation of the new information structure. Accidental errors in the duplication provided variations in replicative success. Most important, besides creating their information structures, biological systems are also information processors. Living things use information to guide their actions. With the appearance of life, agency and purpose appeared in the universe - although some philosophers hold that life gives us only the "appearance of purpose."

The third process of information creation, and the most important to philosophy, is human creativity. Almost every philosopher since philosophy began has considered the mind as something distinct from the body. Information philosophy can now explain that distinction. The mind can be considered the immaterial information in the brain. The brain, part of the material body, is a biological information processor. The stuff of mind is the information being processed and the new information being created. As some philosophers have speculated, mind is the software in the brain hardware.
Most material objects are passive information structures. Living things are information structures that actively process information. They communicate it between their parts to build, maintain, and repair their (material) information structure, through which matter and energy flow under the control of the information structure itself. Resisting the second law of thermodynamics locally, living things increase entropy globally much faster than non-living things. But most important, living things increase their information content as they develop. Humans learn from their experiences, storing knowledge in an experience recorder and reproducer (ERR).

Mental things (ideas) are pure abstractions from the material world, but they have control (downward causation) over the material and biological worlds. This enables agent causality. Human minds create information structures, but their unique creation is the collection of abstract ideas that are the sum of human knowledge. It is these ideas that give humanity unparalleled control over the material and biological worlds. It may come as a surprise to many thinkers to learn that the physics involved in the creation of all three types of information - material, biological, and mental - includes the same two-step sequence of quantum physics and thermodynamics at the core of the cosmic creation process.

The most important information created in a mind is a recording of an individual's experiences (sensations). Recordings are played back (automatically, and perhaps mostly unconsciously) as a guide to evaluate future actions (volitions) in similar situations. The particular past experiences reproduced are those stored in the brain located near elements of the current experience (the association of ideas).
Just as neurons that fire together wire together, neurons that have been wired together will later fire together. Sensations are recorded as the mental effects of physical causes.
Sensations are stored as retrievable information in the mind of an individual self. Recordings include not only the five afferent senses but also the internal emotions - feelings of pleasure, pain, hopes, and fears - that accompany an experience. They constitute "what it's like" for a particular being to have an experience. Volitions are the mental causes of physical effects.
Volitions begin with 1) the reproduction of past experiences that are similar to the current experience. These become thoughts about possible actions and the (partly random) generation of other alternative possibilities for action. They continue with 2) the evaluation of those freely generated thoughts followed by a willful selection (sometimes habitual) of one of those actions. Volitions are followed by 3) new sensations coming back to the mind indicating that the self has caused the action to happen (or not). This feedback is recorded as further retrievable information, reinforcing the knowledge stored in the mind that the individual self can cause this kind of action (or sometimes not). Many philosophers and most scientists have held that all knowledge is based on experience. Experience is ultimately the product of human sensations, and sensations are just electrical and chemical interactions with human skin and sense organs. But what of knowledge that is claimed to be mind-independent and independent of experience?
Why is information better than logic and language for solving philosophical problems?

Broadly speaking, modern philosophy has been a search for truth - for a priori, analytic, certain, necessary, and provable truth. But all these concepts are mere ideas, invented by humans, some aspects of which have been discovered to be independent of the minds that invented them, notably formal logic and mathematics. Logic and mathematics are systems of thought, inside which the concept of demonstrable (apodeictic) truth is useful, but with limits set by Kurt Gödel's incompleteness theorem. The truths of logic and mathematics appear to exist "outside of space and time." Gottfried Leibniz called them "true in all possible worlds," meaning their truth is independent of the physical world. We call them a priori because their proofs are independent of experience, although they were initially abstracted from concrete human experiences.

Analyticity is the idea that some statements - propositions in the form of sentences - can be true by the definitions or meanings of the words in the sentences. This is correct, though limited by verbal difficulties such as Russell's paradox and numerous other puzzles and paradoxes. Analytic language philosophers claim to connect the words with objects, material things, and thereby tell us something about the world. Some modal logicians (cf. Saul Kripke) claim that words that are names of things are necessary a posteriori, "true in all possible worlds." But this is nonsense, because we invented all those words and worlds. They are mere ideas.

Perhaps the deepest of all these philosophical ideas is necessity. Information philosophy can now tell us that there is no such thing as absolute necessity. There is of course an adequate determinism in the macroscopic world that explains the appearance of deterministic laws of nature - of cause and effect, for example. This is because macroscopic objects consist of vast numbers of atoms, and their individual random quantum events average out. But there is no metaphysical necessity. At the fundamental microscopic level of material reality, there is an irreducible contingency and indeterminacy.

Everything that we know, everything we can say, is fundamentally empirical, based on factual evidence - the analysis of experiences that have been recorded in human minds. So information philosophy is not what we can logically know about the world, nor what we can analytically say about the world, nor what is necessarily the case in the world. There is nothing that is the case that is necessary and perfectly determined by logic, by language, or by the physical laws of nature. Our world and its future are open and contingent, with possibilities that are the source of new information creation in the universe and the source of human freedom.

For the most part, philosophers and scientists do not believe in ontological possibilities, despite their invented "possible worlds," which are on inspection merely multiple "actual worlds." They are "actualists." This is because they cannot accept the idea of ontological chance. They hope to show that the appearance of chance is the result of human ignorance, that chance is merely an epistemic phenomenon. Now chance, like truth, is just another idea, just some more information. But what an idea! In a self-referential virtuous circle, it turns out that without the real possibilities that result from ontological chance, there can be no new information. Information philosophy offers cosmological and biological evidence for the creation of new information in the universe.
So it follows that chance is real, fortunately something that we can keep under control. We are biological beings that have evolved, thanks to chance, from primitive single-cell communicating information structures to multi-cellular organisms whose defining aspect is the creation and communication of information. The theory of communication of information is the foundation of our "information age." To understand how we know things is to understand how knowledge represents the material world of embodied "information structures" in the mental world of immaterial ideas. All knowledge starts with the recording of experiences. The experiences of thinking, perceiving, knowing, believing, feeling, desiring, deciding, and acting may be bracketed by philosophers as "mental" phenomena, but they are no less real than other "physical" phenomena. They are themselves physical phenomena.
They are just not material things. Information philosophy defines human knowledge as immaterial information in a mind, or embodied in an external artifact that is an information structure (e.g., a book), part of the sum of all human knowledge. Information in the mind about something in the external world is a proper subset of the information in the external object. It is isomorphic to a small part of the total information in or about the object. The information in living things, artifacts, and especially machines consists of much more than the material components and their arrangement (positions over time). It also consists of all the information processing (e.g., messaging) that goes on inside the thing as it realizes its entelechy or telos, its internal or external purpose.

All science begins with information gathered from experimental observations, which are themselves mental phenomena. Observations are experiences recorded in minds. So all knowledge of the physical world rests on the mental. All scientific knowledge is information shared among the minds of a community of inquirers. As such, science is a collection of thoughts by thinkers - immaterial and mental, some might say fundamental. Recall Descartes' argument that the experience of thinking is that which, for him, is the most certain.

Information philosophy is not the philosophy of information (the intersection of computer science, information science, information technology, and philosophy), just as linguistic philosophy - the idea that linguistic analysis can solve (or dis-solve) philosophical problems - is not the philosophy of language. Compare the philosophy of mathematics, the philosophy of biology, etc.

The analysis of language, particularly the analysis of philosophical concepts, which dominated philosophy in the twentieth century, has failed to solve the most ancient philosophical problems. At best, it claims to "dis-solve" some of them as conceptual puzzles. The "problem of knowledge" itself, traditionally framed as "justified true belief," is recast by information philosophy as the degree of isomorphism between the information in the physical world and the information in our minds. Information psychology can be defined as the study of this isomorphism.

We shall see how information processes in the natural world use arbitrary symbols (e.g., nucleotide sequences) to refer to something, to communicate messages about it, and to give the symbol meaning in the form of instructions for another process to do something (e.g., create a protein). These examples provide support for both theories of meaning: meaning as reference and meaning as use.

Note that just as language philosophy is not the philosophy of language, so information philosophy is not the philosophy of information. It is rather the use of information as a tool to study philosophical problems, some of which are today yielding tentative solutions. It is time for philosophy to move beyond logical puzzles and language games.
Given the second law of thermodynamics, which says that any system will over time approach a thermodynamic equilibrium of maximum disorder or entropy, in which all information is lost, and given the best current model for the origin of the universe, which says everything began in a state of thermodynamic equilibrium some 13.75 billion years ago, how can it be that living beings are creating and communicating vast amounts of new information every day?

None of these information-creating processes can work unless they have a way to get rid of the positive entropy (disorder) and leave behind a pocket of negative entropy (order or information). The positive entropy is either conducted, convected, or radiated away as waste matter and energy, as heat, or as pure radiation. At the quantum level, it is always the result of interactions between matter and radiation (photons). Whenever photons interact with material particles, the outcomes are inherently unpredictable. As Albert Einstein discovered ten years before the founding of quantum mechanics, these interactions involve irreducible ontological chance.

Negative entropy is an abstract thermodynamic concept that describes energy with the ability to do work, to make something happen. This kind of energy is often called free energy or available energy. In a maximally disordered state (called thermodynamic equilibrium) there can be matter in motion, the motion we call heat. But the average properties - density, pressure, temperature - are the same everywhere. Equilibrium is formless. Departures from equilibrium occur when the physical situation shows differences from place to place. These differences are information. The second law of thermodynamics then simply means that isolated systems will eliminate differences from place to place until all properties are uniformly distributed. Natural processes spontaneously destroy information. Consider the classic case of what happens when we open a perfume bottle.
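Here is a toy sketch of that classic case (my own illustration): molecules released in one cell of a divided room random-walk until they are spread uniformly, and the Shannon entropy of their distribution rises while the information about where they started is destroyed.

```python
import math, random

def distribution_entropy(counts):
    """Shannon entropy, in bits, of the molecules' distribution over the cells."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

CELLS = 16
positions = [0] * 1000                        # every molecule starts in cell 0: the open bottle
for step in range(5001):
    if step % 1000 == 0:
        counts = [positions.count(c) for c in range(CELLS)]
        print(step, round(distribution_entropy(counts), 2))
    positions = [min(CELLS - 1, max(0, p + random.choice((-1, 1)))) for p in positions]
# entropy climbs from 0 toward log2(16) = 4 bits: maximum disorder,
# and no remaining trace of the information that the perfume began in one corner
```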
Why are we not still in that original state of equilibrium? Broadly speaking, there are three major phenomena or processes that can reduce the entropy locally, while of course increasing it globally to satisfy the second law of thermodynamics. Two of these do it "blindly"; the third does it with a built-in "purpose," or telos.
- Universal Gravitation
- Quantum Cooperative Phenomena (e.g., crystallization, the formation of atoms and molecules)
- Life
When the universe expands - say, grows to ten times its volume - it is just like the perfume bottle opening. The matter particles must redistribute themselves to get back to equilibrium. But suppose the universe expands much faster than the matter can re-equilibrate, that is, faster than the equilibration or relaxation time allows. The universe is then out of equilibrium, and in a flat, ever-expanding universe it will never get back!
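A toy numerical sketch of that race, with made-up timescales of my own: when the expansion "kicks" arrive more slowly than the relaxation time, the departure from equilibrium dies away; when they arrive faster, it never does.

```python
RELAXATION_TIME = 10.0        # hypothetical units: how fast equilibration erases differences
KICK = 0.5                    # departure from equilibrium added by each expansion step
DT = 1.0

for kick_interval in (100.0, 1.0):          # slow expansion vs. expansion faster than relaxation
    departure = 0.0
    t = 0.0
    while t < 200.0:
        departure -= (departure / RELAXATION_TIME) * DT   # equilibration smooths things out
        if t % kick_interval < DT:                        # an expansion "kick" arrives
            departure += KICK
        t += DT
    print(f"kick every {kick_interval:>5} time units -> residual departure {departure:.2f}")
```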
In the earliest moments of the universe, material particles were in equilibrium with radiation at extraordinarily high temperatures. When quarks formed neutrons and protons, these were short-lived, blasted back into quarks by photon collisions. As the universe expanded, the temperature fell, the space per photon increased, and the mean free time between photon collisions increased, giving larger particles a better chance to survive. The expansion red-shifted the photons, decreasing the average energy per photon and eventually reducing the number of high-energy photons that dissociate matter. The mean free path of photons was still very short, because they were being scattered by collisions with electrons. When the temperature declined further, to about 5,000 degrees some 400,000 years after the "Big Bang," the electrons and protons combined to make hydrogen and (with neutrons) helium atoms.

The indeterministic "collapse" of the wave function (von Neumann's process 1) gave rise to the so-called problem of measurement, because its randomness prevents it from being a part of the deterministic mathematics of process 2.
But isolation is an ideal that can only be approximately realized. Because the Schrödinger equation is linear, a wave function | ψ > can be a linear combination (a superposition) of another set of wave functions | φn >:

| ψ > = Σn cn | φn >,

where the cn are complex coefficients.

Process 3 - a conscious observer recording new information in a mind. This is only possible if the local reductions in the entropy (the first in the measurement apparatus, the second in the mind) are both balanced by even greater increases in positive entropy that must be transported away from the apparatus and the mind, so the overall change in entropy can satisfy the second law of thermodynamics.
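A minimal sketch of how one of the superposed possibilities becomes actual in a measurement, using standard Born-rule statistics (the coefficients are my own example, not from the text):

```python
import random

# A hypothetical two-term superposition | psi > = c0 | phi0 > + c1 | phi1 >
amplitudes = [0.6, 0.8]                      # |c0|^2 + |c1|^2 = 0.36 + 0.64 = 1
probabilities = [a * a for a in amplitudes]

def measure() -> int:
    """The indeterministic step: one possibility becomes actual, with probability
    given by its squared amplitude (the Born rule)."""
    return random.choices([0, 1], weights=probabilities)[0]

outcomes = [measure() for _ in range(10000)]
print(outcomes.count(1) / len(outcomes))     # close to 0.64: the statistics are lawful,
                                             # though each individual outcome is not predictable
```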
Briefly, we identify knowledge with actionable information in the brain-mind. We justify knowledge by behavioral studies that demonstrate the existence of information structures implementing functions in the brain. And we verify knowledge scientifically.
Briefly, we find positive value (or good) in information structures. We see negative value (or evil) in disorder and entropy tearing down such structures. We call energy with low entropy "Ergo" and call anti-entropic processes "ergodic." We recognize that "ergodic" is itself too esoteric and thus not likely to be widely accepted. Perhaps the most positive term for what we value is just "information" itself!

Our first categorical imperative is then "act in such a way as to create, maintain, and preserve information as much as possible against destructive entropic processes." Our second ethical imperative is "share knowledge/information to the maximum extent." Like love, our own information is not diminished when we share it with others. Our third moral imperative is "educate (share the knowledge of what is right) rather than punish." Knowledge is virtue. Punishment wastes human capital and provokes revenge.
Briefly, we separate "free" and "will" in a two-stage process - first the free generation of alternative possibilities for action (which creates new information), then an adequately determined decision by the will. We call this two-stage view our Cogito model and trace the idea of a two-stage model in the work of two dozen thinkers back to William James in 1884. This model is a synthesis of adequate determinism and limited indeterminism, a coherent and complete compatibilism that reconciles
free will with both determinism and indeterminism. David Hume thought he had reconciled freedom with determinism. We reconcile free will with indeterminism and an "adequate" determinism. Because it makes free will compatible with both a form of determinism (really determination) and with an indeterminism that is limited and controlled by the mind, the leading libertarian philosopher Bob Kane suggested we call this model "Comprehensive Compatibilism." The problem of free will cannot be solved by logic, language, or even by physics. Man is not a machine and the mind is not a computer.
Free will is a property of a biophysical information processing system.
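A toy sketch of this two-stage sequence - free generation of alternatives, then an adequately determined selection by the agent's own motives - with hypothetical actions and weights of my own (an illustration of the structure only, not a model of a brain):

```python
import random

OPTIONS = ["wait", "speak up", "walk away", "ask a question", "apologize", "offer help"]
LINKS = {                                    # hypothetical links between actions and motives
    "offer help": ("kindness", "duty"),
    "apologize": ("kindness",),
    "speak up": ("honesty",),
    "ask a question": ("curiosity", "honesty"),
}
MOTIVES = {"kindness": 2.0, "duty": 1.5, "honesty": 1.0, "curiosity": 0.5}

def generate_alternatives(n: int = 4) -> list[str]:
    """Stage 1, 'free': chance helps generate alternative possibilities for action."""
    return random.sample(OPTIONS, n)

def evaluate(action: str) -> float:
    """Stage 2, 'will': an adequately determined evaluation by motives, desires, and feelings."""
    return sum(MOTIVES[m] for m in LINKS.get(action, ()))

alternatives = generate_alternatives()        # first "free"
decision = max(alternatives, key=evaluate)    # then "will"
print(alternatives, "->", decision)
```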
"questions which are already capable of definite answers are placed in the sciences, while those only to which, at present, no definite answer can be given, remain to form the residue which is called philosophy."

Information philosophy aims to show that problems in philosophy should not be reduced to "Russell's Residue." The language philosophers of the twentieth century thought that they could solve (or at least dis-solve) the classical problems of philosophy. They did not succeed. Information philosophy, by comparison, has now cast a great deal of light on some of those problems. It needs more information philosophers to join us and make more progress.