Information in Biology

Despite many controversies about the role of information in biology over the past several decades, we can now show that the creation and communication of information are not only necessary to understand biology, but that biology is a proper, if tiny, subset of information creation in the material universe, a subset that includes the evolution of human minds and the abstract ideas those minds create or discover.

Material information creation, in the form of planets, stars, and galaxies, went on for perhaps ten billion years before biological "agents" formed. At some time between three and four billion years ago, the first biological agents began to communicate with their component parts and with one another, processing and sharing information.

With the appearance of life, agency, purpose, meaning, and values entered the universe.
This is not a teleological purpose that pre-existed life. It is what Colin Pittendrigh, Jacques Monod, and Ernst Mayr suggested we call teleonomy, a "built-in" purpose. Aristotle called it "entelechy," which means "having a purpose within."

Mayr teaches us that biology is unlike physics and chemistry because it has a history. Our atoms and molecules do not contain information about where they have been in the past, nor do they have any control over where they will be in the future. Many mathematical physicists deny this: if determinism were the case, every particle path would contain that information. But a quantum analysis of statistical physics shows how microscopic path information is destroyed.

Mayr's biology is a history that goes back nearly four billion years. In astronomy we can trace the history of cosmic evolution back 13.7 billion years.

The goal for information philosophy is to write a new story of biological evolution as the growth of information processing, connecting it back into cosmological evolution as the creation of information structures, and illustrating the total dependence of biology on cosmological sources of negative entropy (potential information).

Material information structures formed in the early universe - the elementary particles, atoms and molecules, galaxies, stars, and planets - as the result of gravitation and quantum cooperative phenomena. But not until the emergence of life did information replication and information processing begin. In a deep sense, biology is information processing.

To understand the origin of life is to understand the concept of living information structures - biological agents that some call "interactors."

Biological systems can be viewed as patterns of information through which matter and energy flow.
Information processing is the signaling between interactors, using signs that have a syntax (linear sequences of symbols like words in a language) and a semantics (meaning), but most important, have pragmatic value. Interactors use meaningful information to control the use of matter and energy as they accomplish their purposes and goals. Interactors are powered by streams of matter and energy with low entropy. This flow of negative entropy (potential information) comes proximately from the Sun but ultimately from the expansion of the universe, which must be the starting point for explaining life.

An interactor is a clearly definable agent that interacts with other agents by signaling: encoding the abstract information of a message into a material or energetic carrier that travels from the sender to the receiver. The receiver decodes the message and takes an action that depends on the meaning in the message. Interactors are information structures with internal functioning parts that make up an operational organic whole, so messages may be internal to an interactor or external, between two interactors.
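
To make the sender-carrier-receiver picture concrete, here is a minimal sketch in Python. It is an illustration only, not a biological model: the class names, the shared code table, and the example signals are all invented for the purpose.

```python
# A minimal, hypothetical sketch of "interactors" exchanging a coded message.
# The names (Interactor, Carrier, the example signals) are invented for
# illustration; only "interactor", "sender", and "receiver" come from the text.

class Carrier:
    """A material or energetic carrier holding an encoded message."""
    def __init__(self, symbols):
        self.symbols = symbols          # e.g. a sequence of signal molecules

class Interactor:
    def __init__(self, name, code):
        self.name = name
        self.code = code                                 # meaning -> symbol
        self.decode_table = {v: k for k, v in code.items()}  # symbol -> meaning

    def send(self, meaning):
        """Encode an abstract meaning into a carrier."""
        return Carrier([self.code[meaning]])

    def receive(self, carrier):
        """Decode the carrier and act on the pragmatic meaning."""
        meaning = self.decode_table[carrier.symbols[0]]
        return f"{self.name} acts on message: {meaning}"

# A shared, arbitrary code, analogous to a chemical signalling convention.
code = {"nutrients ahead": "A", "toxin ahead": "B"}

cell_1 = Interactor("cell_1", code)
cell_2 = Interactor("cell_2", code)

carrier = cell_1.send("nutrients ahead")   # encoding by the sender
print(cell_2.receive(carrier))             # decoding and action by the receiver
```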

The messages between interactors - cells, cell components like organelles, or combinations of cells in multicellular organisms and communities of cells - are not (merely) "analogous" to human communication, nor are they "helpful metaphors." Biological communications have been present from the origin of life. Human communication is simply a very recent and very special example of (and homologous to) biological communication, despite the fact that most humans think communication is uniquely human. Without the communication and processing of meaningful information, there would be no life.

To understand biology - and humanity - we must understand how the communication and processing of information allows living information structures to carry out their functions. And, thanks to biological evolution, we have the tool needed to understand biology and ourselves: the extraordinary human mind.

Human minds are the most advanced information processors in the known universe.
Human minds have created everything we know about how abstract immaterial information and concrete information structures (matter and energy with low entropy) have been created in the universe. Humans are co-creators of the universe, specifically the part known as "culture" as opposed to "nature," including all human artifacts and all human knowledge.

Beyond Reductionism? Even Beyond Darwinian Evolution?
Is there something "holistic" happening in biology, something that cannot be explained by the material parts following the fundamental laws of chemistry and physics? The great quantum physicist Erwin Schrödinger thought life should be reducible to physics and chemistry, but that, he said, might require "new laws" of physics. This led some great scientists to turn from physics to biology to find those new laws.

In the nineteenth century, "emergentists" and "vitalists" argued for new properties of complex organisms that are not present in their component parts, or for new "forces" that infuse mere matter with life. The most obvious evidence for something non-material was that life appears to violate the then recently discovered second law of thermodynamics. Many philosophers and scientists thought that life, and especially mind, could not be possible unless proto-life and proto-mind were already present in all physical particles. This is known as "panpsychism."

Panpsychism is still very popular among thinkers at the edges of science and religion. Where René Descartes considered mind (res cogitans) as something ontologically and substantially distinct from matter (res extensa), thus creating the mind-body problem, most of his contemporaries (e.g., Gottfried Wilhelm Leibniz and Baruch Spinoza) and philosophers down to the twentieth century thought mind an essential aspect of matter.

The idea of emergence was implicit in the work of John Stuart Mill and explicit in
the work of "emergentists" like George Henry Lewes, Samuel Alexander, C. Lloyd Morgan, and C. D. Broad. Some wanted to explain the direct emergence of mind from matter, to solve the mind-body problem, but as Alexander put it, there are at least two distinct steps - first life emerges from the physical-chemical, then mind emerges from life.

Charles Sanders Peirce thought the universe must have a property like mind. He thought the laws of nature were evolving "habits" of such a mind. He thought the chance element in Darwinism made it "greedy." Alfred North Whitehead's "Process Philosophy" is panpsychic, very popular today as "Process Theology." The French philosopher Henri Bergson argued that the creative aspect of evolution cannot be explained by random variation and selection.

Vitalists like Bergson and Hans Driesch may not have used the term emergence, but they strongly supported the idea of teleological (purposeful), likely non-physical causes, without which they thought that life and mind could not have emerged from physical matter.

Influenced by his countryman Bergson and the evolutionary philosopher Georg W. F. Hegel, the philosopher and Jesuit priest Pierre Teilhard de Chardin imagined an "Omega Point" toward which the universe and life are evolving, with the mind of man participating in the final evolutionary phase, a "noösphere" or mind sphere similar to what might be called an "infosphere," the locus of our Sum of human knowledge.

What these thinkers all have in common is that they believe that the material basis of living things cannot possibly have a creative power. In general, they are all looking for something more powerful than the neo-Darwinian, modern synthesis of biology, in which evolution is driven by variations that depend on ontological chance. A significant fraction of scientists, philosophers, and theologians today have what William James called "antipathy to chance."

The new sciences of complexity and chaos theory are believed by many to provide a new explanation of the origin of life. Ilya Prigogine correctly described the reductions in local entropy of what he called "dissipative structures" as bringing "order out of chaos." Complexity and chaos are deterministic, dynamical theories that generate epistemological uncertainty, which is enough for some thinkers who dislike ontological chance. Complex phenomena very likely supported the formation of pre-biotic structures, "autocatalytic" combinations of elements that synthesized some of their own parts. But no classical dynamical phenomena can explain the origin of genuinely new information in the universe. That requires quantum chance.

A Benevolent, Providential Universe

Before there was life, the galaxies, stars, and planets had a rich developmental and evolutionary history of their own. Astrophysics tells us that stars radiated energy into space as they dissipated the energy of gravitational collapse (the photons carrying away positive entropy to balance the new spherically symmetric order). The stars paused their collapse when their interiors reached temperatures high enough to initiate thermonuclear reactions, which converted the lightest elements (hydrogen and helium) into heavier elements. When their fuel was exhausted, the stars resumed collapsing, some exploding catastrophically and spreading their newly formed elements out into interstellar space.

Geophysics tells us that the surfaces of planets also go through heating, then cooling, as they radiate away the energy of gravitation. Chemical processes produce ever more complex molecules in the dust and gas clouds of interstellar space and on planetary surfaces. Complex molecules are produced in stars as well, but they usually dissociate quickly at stellar temperatures.

When a planet is bathed by radiation from a nearby star, the radiation field is far from equilibrium. The photons that left the stellar surface at high temperature, high energy density, and positive entropy are now spread out over a huge volume of space. Their spectral temperature is still high, but their energy density corresponds to a much lower temperature. They are far from thermal equilibrium, and they cannot cool down without interacting with matter.

When the stream of radiative energy interacts with the planet surface, it provides the necessary "free" energy (with low entropy) to form even more complex information structures, including many of the component macromolecules that are the chemical basis for life. Some of these "prebiotic" macromolecules are produced in small quantities by the ordinary statistics of chemical reactions. These include some macromolecules that play a central role in life today (polynucleotide chains, for example), but they were not organized to work together in any way, and probably had very short lifetimes even though their environment was far from equilibrium.
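
A rough back-of-the-envelope sketch of this dilution, assuming black-body radiation and round-number temperatures for the Sun and the Earth, shows why the arriving sunlight carries so much potential information (negative entropy):

```python
# Back-of-the-envelope sketch (black-body approximation, assumed round numbers):
# sunlight arrives with the ~5800 K spectrum of the solar surface, and the
# Earth re-radiates the same energy flux at roughly 255 K. For a fixed energy
# flux, the mean photon energy scales with temperature, so the number of
# photons (and, to a good approximation, the entropy) carried away exceeds
# the number received by about T_sun / T_earth.

T_sun   = 5800.0     # effective temperature of solar photons, K
T_earth = 255.0      # effective radiating temperature of the Earth, K

entropy_ratio = T_sun / T_earth   # entropy out / entropy in, for equal energy flux

print(f"entropy carried away / entropy received ~ {entropy_ratio:.0f}")
# ~ 23: each high-grade solar photon is degraded into roughly twenty low-grade
# infrared photons, which is what makes free energy (negative entropy,
# potential information) available to build information structures on Earth.
```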

Life needs the emergence of structures that can replicate macromolecules, multiplying their information. The earliest such replicating information structures probably duplicated a macromolecule on an external template (a catalyst). Replication does not produce new information, only copies of pre-existing information. To be sure, copying errors are new information, and those errors might themselves be replicated, providing the random changes needed in the first step of evolution by variation and natural selection.
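
A toy simulation can illustrate how copying errors supply the variation on which selection then acts. Everything here (the alphabet, the "target" sequence, the error rate, the fitness rule) is an invented illustration, not a model of real chemistry:

```python
# Toy sketch: replication with occasional copying errors, then selection on
# replication efficiency. All parameters are invented; the point is only that
# errors (chance) supply variation, and selection amplifies the useful ones.

import random
random.seed(1)

ALPHABET = "ACGU"
TARGET = "ACGUACGU"          # an arbitrary "well-adapted" sequence

def fitness(seq):
    """More matches to TARGET means more copies in the next generation."""
    return sum(a == b for a, b in zip(seq, TARGET))

def replicate(seq, error_rate=0.02):
    """Copy a sequence, with a small chance of error at each position."""
    return "".join(random.choice(ALPHABET) if random.random() < error_rate else s
                   for s in seq)

population = ["AAAAAAAA"] * 20          # identical, poorly adapted replicators
for generation in range(60):
    # each molecule is copied in proportion to its fitness (selection)
    weights = [fitness(s) + 1 for s in population]
    parents = random.choices(population, weights=weights, k=len(population))
    population = [replicate(p) for p in parents]

best = max(population, key=fitness)
print(best, fitness(best))   # usually much closer to TARGET than the start
```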

But replication, even "self-replication," is not enough to produce the biological world. Billions of years before human beings invented machines, especially our computing machines, complex biological processes evolved that process information and construct new information structures, in ways that resemble how our computer-controlled machines process abstract information and manufacture new material objects, including new "machines."

We are tempted to call these biological processes "biological machines," but we must deprecate the terms "machine," "mechanical," "dynamics," and "dynamical," because they imply processes operating under the laws of classical, deterministic physics and under the constraints of initial and boundary conditions. Such machines are in principle completely predictable, but in practice unpredictable, because of their sensitivity to the initial conditions and boundary conditions (the "constraints").

Complexity and chaos theories are classical and dynamical, continuous and causal. They cannot create novelty.
This unpredictability gives us epistemological randomness. Complexity and chaos make us ignorant about what is happening in the microscopic world. But they cannot provide the ontological chance produced only by events in quantum physics (we also deprecate quantum "mechanics"). Without real chance, all the evolution and development of biological species is implicitly pre-existent in the laws of nature, in principle knowable by a super-intelligent Laplacean demon or an infinite omniscient god.
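
The distinction between epistemological and ontological randomness can be illustrated with the logistic map, a standard textbook example of deterministic chaos. The sketch below is illustrative only: identical initial conditions reproduce the trajectory exactly (no novelty), while an unmeasurably small perturbation diverges rapidly (unpredictability in practice):

```python
# A minimal illustration of deterministic chaos with the logistic map
# x -> r*x*(1-x) at r = 4. The dynamics are fully deterministic: the same
# initial condition reproduces the trajectory exactly (no novelty). But an
# unmeasurably small change in the initial condition diverges rapidly,
# giving unpredictability in practice (epistemic, not ontological, chance).

def logistic_trajectory(x0, steps=50, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3)            # identical input, identical output
c = logistic_trajectory(0.3 + 1e-12)    # tiny, "unknowable" perturbation

print(a == b)                                             # True: exactly reproducible
print(max(abs(x - y) for x, y in zip(a[-10:], c[-10:])))  # large: trajectories decorrelated
```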

In such a world, information is a constant quantity, with a conservation law like that for matter and energy. Many, perhaps most, mathematical physicists subscribe to this view. In the world of classical physics, there is no novelty, "nothing new under the sun." For biology, this is consistent with the idea of intelligent design. Believers in an omnipotent god might allow novel changes over time, but this is logically inconsistent with the omniscience of god, for whom there is only one possible future, the one consistent with god's foreknowledge.

A surprisingly large number of thinkers appear to be happy with the idea of a single possible future.
The recent enthusiasm for complexity and chaos theories, which emphasize the inability to predict the future, in no way gives us the novelty and freedom, the alternative possible futures that are produced by quantum physics. Many scientists and philosophers may be quite content with this view, since it preserves the idea of a controlling, continuous, causal process, even if we cannot know it exactly. The majority of philosophers prefer a free will model that is compatible with determinism. And a significant fraction of scientists hope to deny the collapse of the wave function that generates ontological indeterminacy.

Using the critically important free energy (with low entropy) streaming from the Sun to the Earth (and beyond to the night sky and cosmic microwave background), these extraordinary biological processes manufacture macromolecular "machines" that in turn build macromolecules that cannot be produced spontaneously or reproduced by replication alone, especially the complex proteins (polypeptide chains) that are the moving parts of our biological machines and information processors.

The Origin of Life
The sine qua non of life is a flow of energy with negative entropy. The current major source is the energy/entropy/information streaming from the Sun, but a possible source for the original macromolecules of life is the deep-sea vents, powered by residual cooling energy (or radioactive decay energy) at the bottom of the seas.

In either case, the far-from-equilibrium energy flow is essentially a thermodynamic engine, running between the hot energy source of the Sun and the cool sink of the night sky as the Earth rotates. Energy flows lead to material flows in gases and liquids that produce dynamic systems with semi-permanent visible structures like convection cells.

At some moment, a primitive macromolecule replicated itself. To do this, it created new (duplicate) information, so positive entropy equal to or greater than the new negative entropy must have been carried away from the new information structure (for it to be stable).
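
Landauer's principle lets us make this entropy bookkeeping quantitative: recording one bit of information requires dissipating at least k_B T ln 2 of heat to the surroundings. The numbers below (a bacterium-sized genome at room temperature) are assumed round values for illustration only:

```python
# Illustrative entropy bookkeeping for copying information, using Landauer's
# principle: recording one bit requires dissipating at least k_B * T * ln(2)
# of heat to the surroundings. The genome size and temperature are assumed
# round numbers (roughly an E. coli-sized genome at room temperature).

import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
T   = 300.0                   # ambient temperature, K

bases = 4.6e6                 # assumed genome length in bases
bits  = bases * 2             # 2 bits per base (4 possible letters)

min_heat = bits * k_B * T * math.log(2)   # minimum heat exported, joules

print(f"{bits:.1e} bits copied, minimum heat exported: {min_heat:.1e} J")
# roughly 2.6e-14 J: tiny, but strictly positive; the positive entropy
# carried away must at least balance the new negative entropy in the copy.
```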

Mere replication should not yet be seen as life. And anything like metabolism would just be the flow of the low entropy solar photons that get degraded in the process of providing the available energy needed to form the new molecule. But we might anthropomorphize a bit and say that the apparent purpose of the molecule appears to be replicating itself, increasing its own kind of information structure in the universe.

Now at some point the replication might have been less than perfect (note the critical element of chance here).

Imagine now that the new molecule might be even more efficient than the original molecule at replicating itself (it has greater reproductive success). Note that the new molecule has more information in it than the original. Now we might say that this is the beginning of Darwinian evolution, which appears to have a goal or purpose of building richer information structures.

We now have both primitive inheritance (of the information) and a form of variability. Some of these new molecules might not only be more successful replicators, they might have chemical properties that allow them to resist being destroyed by environmental conditions - the energetic extreme ultraviolet photons, for example, or destructive cosmic rays, which might have been the source of the original variations.

The result might be a runaway exponential explosion of the concentration of those molecules (an important characteristic of living systems).

Replication could lead to populations of the molecule that are well beyond the normal populations that would be expected in chemical equilibrium. Chemists might view this as simply an autocatalytic process, in which the molecule catalyzes its own production. But because it is information replicating itself, it is qualitatively different from mere chemical autocatalysis.
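
The qualitative point is easier to see with the simplest possible rate sketch of autocatalysis, A + X → 2X. The rate constant and concentrations below are invented round numbers, chosen only to show the runaway growth:

```python
# Sketch of autocatalytic growth, A + X -> 2X: a molecule X that catalyzes
# its own production from substrate A grows exponentially while A is
# abundant, then levels off as A is consumed. The rate constant and
# concentrations are invented round numbers, for illustration only.

def simulate(a0=1.0, x0=1e-6, k=5.0, dt=0.01, steps=600):
    a, x = a0, x0
    history = []
    for i in range(steps + 1):
        if i % 100 == 0:
            history.append((round(i * dt, 2), x))
        rate = k * a * x            # autocatalytic rate, proportional to X itself
        a -= rate * dt
        x += rate * dt
    return history

for t, x in simulate():
    print(f"t = {t:4.2f}   [X] = {x:.6f}")
# [X] is negligible at first, explodes around t ~ 2-3, and levels off near
# the initial substrate concentration, far above any ordinary equilibrium.
```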

At the atomic level, it will be quantum cooperative phenomena that pull the constituent atoms into the desired molecular shape. It is the overall shape (form, information) that produces a dynamical constraint well beyond the mere aggregate of individual atoms interacting.

Broadly speaking, the new, more successful species of molecule has "learned" something, storing the new information internally to pass it on to the next generation.

Molecular Machines
Biological communication, the information exchanged in messages between biological entities, is far more important than the particular physical and chemical entities themselves. Those material entities are used up and replaced many times in the life cycle of a whole organism, while the messaging remains constant, not just over the individual life cycle but over that of the whole species.

In fact, most messages, and the specific molecules that embody and encode those messages, have varied only slowly over billions of years.

As a result, the sentences (or statements or “propositions”) in biological languages may have a very limited vocabulary compared to human languages, although the number of words added to human languages in a typical human lifetime is itself remarkably small.

Biological information is far more important than matter and energy for another reason. Beyond biological information as “ways of talking” in a language, we will show that the messages do much more than send signals: they encode the architectural plans for biological machines that have exquisite control over individual molecules, atoms, and their constituent electrons and nuclei.

Far from the materialist idea that fundamental physical elements have “causal control” over living things, we find that biological information processing systems are machines, intelligent robotic machines, that assemble themselves and build their own replacements when they fail, and that use the flow of free energy and material with negative entropy to manipulate their finest parts.

Coming back to the great philosopher of logic and language Ludwig Wittgenstein, who briefly thought of “models” as explanatory tools that can “show” what is difficult or impossible to “say” in a language, we offer dynamic animated models below.

The amazing operations of these machines are so far beyond those of man-made machines that they have led some to question the ability of Darwinian evolution to create them by random trial and error.

But the most complex of these machines have been shown to be composed of dozens of smaller and simpler parts that did and still do much simpler tasks in the cell.

The five biological machines that we choose are

• the ribosome, a massive factory that manufactures thousands of different possible proteins when messenger RNA carries a request for one of them from the nuclear DNA,

• ATP synthase, which packages small amounts of energy into a nucleotide molecule that carries energy to any place in the organism that needs power to perform its function,

• the flagellum, a high-speed motor that moves bacterial cells to sources of matter and energy in their environment,

• the ion pump, which moves sodium and potassium ions to rapidly restore the resting potential of a neuron so that it is ready to fire again within a fraction of a second, letting the mind make its decisions and take actions to move the body,

• the chaperone, an error detection and correction system that protects against noise better than our finest computer memories.


Biology cannot prevent the occurrence of random errors. Indeterministic chance is the original source of the variability in our genes that has led to the incredible diversity of life forms, including humans.

But the nearly perfect operation of our biological machines and the phenomenal fidelity of copying our many genetic codes over billions of years show the stability and “adequate determinism” of biology in the presence of ontological chance, a consequence of the noise-immune digital nature of biological information.
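
The simplest digital error-correcting scheme, triple repetition with majority voting, illustrates this noise immunity. It is an analogy only; real proofreading and chaperone chemistry is far more elaborate, and the error rate below is an invented example value:

```python
# Illustrative analogy only: the simplest digital error-correcting scheme,
# triple repetition with majority voting. The point is just that discrete
# (digital) symbols plus redundancy make copying almost immune to noise.

import random
random.seed(0)

def encode(bits):
    return [b for b in bits for _ in range(3)]          # send each bit 3 times

def corrupt(bits, p=0.05):
    return [b ^ 1 if random.random() < p else b for b in bits]

def decode(bits):
    return [int(sum(bits[i:i+3]) >= 2) for i in range(0, len(bits), 3)]

message = [random.randint(0, 1) for _ in range(10_000)]
received = decode(corrupt(encode(message), p=0.05))

errors = sum(m != r for m, r in zip(message, received))
print(f"raw symbol error rate: 5%  ->  decoded error rate: {errors / len(message):.3%}")
# roughly 0.7%: only p^2-level errors survive (3*p^2*(1-p) + p^3 for p = 0.05)
```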

The New History of Cosmic and Biological Evolution
From an information philosophy perspective, the teleonomic purpose of all life has been to replicate itself and to improve its reproductive success, which is to say to replicate and expand its information.

Richard Dawkins considered the possibility that genes are the driving force in reproduction - the reproduction of their own information. Biological organisms are thus seen as the means by which genes replicate themselves. That may be contentious. But there is no question that, over time, evolution has produced organisms that can both create and replicate more and more complex information processors, with humans the most complex.

Through the admittedly narrow lens of information, the story of evolution begins with the cosmic creation of information structures (which do not communicate), followed by the evolution of biological information processors, our "interactors." These are products of the continuous stream of negative entropy coming from the cosmos, directly from the Sun, but made possible by the expansion of the universe.

Those of us living through the twentieth century were the first to see as close to the beginning of the universe as anyone will ever see. We also learned that the future of the universe is essentially infinite, relative to the lifespan of humans. In the middle of that century, around the time of one of the most destructive wars in world history, we transformed the meaning of "information" from the act of informing one another into an analysis of language and knowledge that breaks them down into "bits."

These "binary digits," "1 or 0" answers to "yes or no" questions, appeared first as the stuff of digital computers and digital communications. But at about the same time, biologists discovered that the hereditary material of all life is similarly encoded.

Alan Turing, John von Neumann, and Claude Shannon created digital computers just a few decades after Albert Einstein's prediction about Brownian motion proved the existence of atoms: matter consists of discrete, discontinuous particles. Within a few years, James Watson and Francis Crick showed that life was based on a digital code. The coded sequence of nucleotides in the DNA molecule at the heart of the cell nucleus is transcribed, rewritten as messenger RNA that travels outside the nucleus with instructions for a ribosome to translate the genetic code into a matching linear chain of amino acids, a protein that folds itself to become a three-dimensional active enzyme.
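
The transcription-and-translation pipeline just described can be sketched in a few lines. The gene below is a toy sequence, and only a handful of codons from the standard genetic code are included, just enough to run the example:

```python
# A minimal sketch of the pipeline described above: template DNA is
# transcribed to messenger RNA, then translated codon-by-codon into an
# amino-acid chain. Only a few entries of the standard genetic code are
# included, enough to translate the toy gene below.

CODON_TABLE = {            # partial standard genetic code (mRNA codons)
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "AAA": "Lys",
    "GAA": "Glu", "UAA": "STOP",
}

def transcribe(dna_template):
    """Template DNA strand -> mRNA (complementary bases, T pairs with A -> U)."""
    pairing = {"A": "U", "T": "A", "G": "C", "C": "G"}
    return "".join(pairing[base] for base in dna_template)

def translate(mrna):
    """Read the mRNA three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE[mrna[i:i+3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return "-".join(protein)

gene = "TACAAACCGTTTCTTATT"       # toy template strand
mrna = transcribe(gene)           # "AUGUUUGGCAAAGAAUAA"
print(mrna)
print(translate(mrna))            # Met-Phe-Gly-Lys-Glu
```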

If digital computer codes had not already been invented, might the biological model have inspired them? Instead of computational models of the mind, we might have appreciated the mind as the ultimate extension of biological information processing!

The Origin and Future of Knowledge

Jumping now to human evolution, we see humanity as a species of multi-molecular, multi-cellular organism that has found a way to not only create but also externalize information, storing it in the environment (culture), where it can be shared with new humans, who continue to add to this external store of knowledge. It is knowledge that has allowed humans to dominate the Earth, for better or for worse.

The sharing of old and newly created information with all other humans has enormous economic and moral implications.

From a cosmologist's perspective, in the macrocosmos the human mind is the universe's means of reflection. In the microcosmos, it is one atom's way of knowing about other atoms.

We call all of human knowledge the Sum, to go along with our free will model, the Cogito, and our basis for objective values, the Ergo.

The Sum of human knowledge is vast, of course, but we propose drafting suggestions for the most important things we know that should be known by everyone in the future.

  • In the nineteenth century, Herbert Spencer asked, "What knowledge is of most worth?" Science was his answer.

  • Richard Feynman proposed that "If, in some cataclysm, all of scientific knowledge were to be destroyed, and only one sentence passed on to the next generations of creatures, what statement would contain the most information in the fewest words? I believe it is the atomic hypothesis (or the atomic fact, or whatever you wish to call it) that all things are made of atoms—little particles that move around in perpetual motion, attracting each other when they are a little distance apart, but repelling upon being squeezed into one another. In that one sentence, you will see, there is an enormous amount of information about the world, if just a little imagination and thinking are applied."

  • Today we can extend Feynman's insight. The universe consists of discrete, discontinuous, and in some sense "digital," particles. There is no "classical" world, only a quantum world. The "classical" world emerges from the quantum world when a large enough number of particles get together. The continuous space (and time) in which we locate the particles is but a mathematical construct that allows us to describe the world. There are no continuous "fields" in which particles of matter (electrons, atoms, etc.) are thought to be singularities. The continuous, causal "forces" like gravity that we postulate are useful fictions. They are only statistical averages over other types of particles (photons, bosons, gravitons) that look continuous when very many such particles are present. At the microscopic level, quantum events are discontinuous and acausal. The analytic integral and differential equations that we assume deterministically govern the motions of material particles are idealizations only accurate for very large bodies.



Quotes
"Life may be defined operationally as an information processing system—a structural hierarchy of functioning units—that has acquired through evolution the ability to store and process the information necessary for its own accurate reproduction. The key word in the definition is information. This definition, like all definitions of life, is relative to the environment. My reference system is the natural environment we find on this planet. However, I do not think that life has ever been defined even operationally in terms of information. This entire book constitutes a first step toward such a definition."

Evidently nature can no longer be seen as matter and energy alone. Nor can all her secrets be unlocked with the keys of chemistry and physics, brilliantly successful as these two branches of science have been in our century. A third component is needed for any explanation of the world that claims to be complete. To the powerful theories of chemistry and physics must be added a late arrival: a theory of information. Nature must be interpreted as matter, energy, and information.

"A central and fundamental concept of this theory is that of "biological information." since the material order and the purposiveness characteristic of living systems are governed completely by information, which in turn has its foundations at the level of biological macromolecules . The question of the origin of life is thus equivalent to the question of the origin of biological information."

"the evolutionary process is driven by an enormous flow ot thermodynamic information passing through the earth's biosphere."

"Information as the central concept in molecular biology
Information, transcription, translation, code, redundancy, synonymous, messenger, editing, and proofreading are all appropriate terms in biology. They take their meaning from information theory (Shannon, 1948) and are not synonyms, metaphors, or analogies.."

