Terrence Deacon

Terrence Deacon is professor of Biological Anthropology, Neuroscience, and Linguistics at the University of California, Berkeley.

In his 1997 book, The Symbolic Species: The Co-evolution of Language and the Brain, he argued that language coevolved with the brain by natural selection, although he now argues that the major source of language acquisition is social transmission, with a trial-and-error process analogous to natural selection occurring while the brain develops.

Deacon's 2011 work Incomplete Nature has a strong triadic structure, perhaps inspired by an important influence from semiotics: the philosopher Charles Sanders Peirce's triad of icon, index, and symbol. Deacon's triad levels represent the material, the ideal, and the pragmatic. The first two levels reflect the ancient philosophical dualism of materialism and idealism, or body and mind, respectively. The major transition from the nonliving to the living - the problem of abiogenesis, and the introduction of telos into the universe - happens in Deacon's third level.

Teleodynamics is Deacon's name for the third level in his dynamics hierarchy. It is built on and incorporates the two lower levels — the first level is physical and material, the second adds an informational and immaterial aspect.

At the bottom level is the natural world, which Deacon characterizes by its subjection to the second law of thermodynamics. When entropy (the Boltzmann kind) reaches its maximum, the equilibrium condition is pure formless disorder. Although there is matter in motion, it is the motion we call heat, and nothing interesting is happening. Since equilibrium has no meaningful differences, Deacon calls this the homeodynamic level, using the root homeo-, meaning "the same."

At the second level, form (showing differences) emerges. Deacon identifies a number of processes that are negentropic, reducing the entropy locally by doing work against and despite the first level's thermodynamics. This requires constraints, says Deacon, like the piston in a heat engine that constrains the expansion of a hot gas to a single direction, allowing the formless heat to produce directed motion.

Atomic constraints such as the quantum-mechanical bonding of water molecules allow snow crystals to self-organize into spectacular forms, producing order from disorder. Deacon dubs this second level morphodynamic. He sees the emerging forms as differences against the background of unformed sameness. His morphodynamic examples include, besides crystals, whirlpools, Bénard convection cells, basalt columns, and soil polygons, all of which apparently violate the first-level tendency toward equilibrium and disorder in the universe. These are processes that information philosophy calls ergodic.

Herbert Feigl and Charles Sanders Peirce may have been the origin of Bateson's famous idea of a "difference that makes a difference."
On Deacon's third level, "a difference that makes a difference" (cf. Gregory Bateson and Donald MacKay) emerges as a purposeful process we can identify as protolife. The quantum physicist Erwin Schrödinger saw the secret of life in an aperiodic crystal, and this is the basis for Deacon's third level. He ponders the role of ATP (adenosine triphosphate) monomers in energy transfer and their role in polymers like RNA and DNA, where the nucleotide arrangements can store information about constraints. He asks whether the order of nucleotides might create adjacent sites that enhance the closeness of certain molecules and thus increase their rate of interaction. This would constitute information in an organism that makes a difference in the external environment, an autocatalytic capability to recruit needed resources. Such a capability might have been a precursor to the genetic code.

Deacon crafts an ingenious model for a minimal "autogenic" system that has a teleonomic (purposeful) character, with properties that might be discovered some day to have existed in forms of protolife. His simplest "autogen" combines an autocatalytic capability with a self-assembly property like that in lipid membranes, which could act to conserve the catalyzing resources inside a protocell.

Autocatalysis and self-assembly are his examples of morphodynamic processes that combine to produce his third level, teleodynamics. Note that Deacon's simplest autogen need not replicate immediately. Like the near-life of a virus, it lacks a metabolic cycle and does not maintain its "species" with regular reproduction. But insofar as it stores information, it has a primitive ability to break into parts that could later produce similar wholes in the right environment. And the teleonomic information might suffer accidental changes that produce a kind of natural selection.
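
The combined logic can be made concrete with a toy simulation. This is not Deacon's model; the rate constants and thresholds below are arbitrary assumptions. It caricatures an autocatalytic step that consumes substrate and throws off shell-forming by-products, and a self-assembly step that encloses the catalysts in a shell, which the environment can later break open to seed a similar whole.

```python
import random

random.seed(0)

# Toy counts; every quantity and rate here is an arbitrary assumption.
substrate, catalysts, shell_parts = 500, 2, 0
SHELL_SIZE = 40          # by-product parts needed to close a containing shell
enclosed = False

for step in range(2000):
    if enclosed:
        # A closed shell conserves its catalysts; occasionally the environment
        # breaks it open, and the released parts can seed a similar new cycle.
        if random.random() < 0.001:
            enclosed = False
            substrate += 300                  # fresh substrate in the new setting
            catalysts = max(1, catalysts // 2)
            shell_parts = 0
        continue
    # Autocatalysis: each catalyst may turn a unit of substrate into
    # another catalyst plus one shell-forming by-product.
    for _ in range(catalysts):
        if substrate > 0 and random.random() < 0.05:
            substrate -= 1
            catalysts += 1
            shell_parts += 1
    # Self-assembly: enough by-product parts enclose the catalysts.
    if shell_parts >= SHELL_SIZE:
        enclosed = True
    if step % 400 == 0:
        print(step, substrate, catalysts, shell_parts, enclosed)
```

The point of the toy is only that neither tendency alone is self-maintaining: autocatalysis alone exhausts its substrate, and self-assembly alone encloses nothing worth conserving; coupled, they preserve the capacity to reconstitute a similar whole.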

Deacon introduces a second triad he calls Shannon-Boltzmann-Darwin (Claude, Ludwig, and Charles). He describes it on his Web site www.teleodynamics.com. I would rearrange the first two stages to match his homeodynamic-morphodynamic-teleodynamic triad. This would put Boltzmann first (matter and energy in motion, but both conserved, merely transformed by morphodynamics). A second Shannon stage then adds information (Deacon sees clearly that information is neither matter nor energy); for example, knowledge in an organism's "mind" about the external constraints that its actions can influence.

This stored information about constraints enables the proto-organism in the third stage to act in the world as an agent that can do useful work, that can evaluate its options, and that can be pragmatic (more shades of Peirce) and normative. Thus Deacon's model introduces value into the universe— good and bad (from the organism's perspective). It also achieves his goal of explaining the emergence of perhaps the most significant aspect of the mind: that it is normative and has goals. This is the ancient telos or purpose.

Appreciating Deacon's argument is easier with a little history. Claude Shannon's information theory produced an expression for the potential information that can be carried in a communication channel. It is the mathematical negative of Boltzmann's formula for entropy.

S = k log W
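
For readers who want the two formulas side by side, here are the standard textbook forms (the logarithm base and the constant k are matters of convention; this comparison is added here, not quoted from Deacon):

```latex
% Boltzmann entropy for W equally probable microstates
S = k \log W

% Shannon entropy of a source whose possible messages have probabilities p_i
H = -\sum_i p_i \log_2 p_i

% With W equally probable messages (each p_i = 1/W) this reduces to
H = \log_2 W
```

With W equiprobable alternatives, both expressions reduce to a logarithm of the number of possibilities.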

Confusingly, John von Neumann suggested that Shannon use the word entropy for his measure of information. Then Leon Brillouin coined the term negentropy to describe far-from-equilibrium conditions in the world epitomized by information. Since Erwin Schrödinger, we have known that life is impossible without the negative-entropy flow of far-from-equilibrium available energy from the sun.

Shannon entropy (which is negentropy) describes the large number of possible messages that could be encoded in a string of characters. Shannon's actual information reduces the uncertainty in the entropy of potential messages. Deacon notes correctly that new information can be transmitted only if these alternative possibilities exist. Without probability (ontological chance) and true alternative possibilities, there would be no information in the message.

"No possibilities = no uncertainty = no information," Deacon says. Without something new, the amount of information in the universe would be fixed. This is deeply true.

Organisms are not machines, and minds are not computers, says Deacon, criticizing cognitive scientists who seek a one-to-one correspondence between conscious thoughts or actions and neuronal events. Machines are assembled from parts, whereas organisms self-assemble, he insightfully observes.

Computers are designed to be totally predictable logical devices that are noise-free, but organisms and the mind could not survive if they worked that way, because the universe continually generates unpredictable new situations. The mind supervenes on astronomical numbers of neuronal events, which likely transmit far more stochastic noise than they do meaningful signals. Deacon thinks that meaningful mental events are probably only statistical regularities, averages over neuronal events, just as macroscopic classical properties are averages over quantum-level events.
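
The idea that meaningful regularities can be averages over noisy micro-events can be illustrated with a toy calculation (a sketch with invented numbers, not a model of neurons): a weak signal buried in large random noise becomes reliable when averaged over many events, with the trial-to-trial fluctuation of the mean shrinking roughly as one over the square root of the number of events.

```python
import random
import statistics

random.seed(1)

SIGNAL = 0.1   # hypothetical weak "meaningful" component of each event
NOISE = 1.0    # standard deviation of the stochastic component

def mean_of_noisy_events(n):
    """Average n events, each consisting of the weak signal plus large noise."""
    return statistics.mean(SIGNAL + random.gauss(0.0, NOISE) for _ in range(n))

for n in (10, 1_000, 100_000):
    trials = [mean_of_noisy_events(n) for _ in range(20)]
    print(f"N={n:>6}: average of averages = {statistics.mean(trials):+.3f}, "
          f"spread across trials = {statistics.stdev(trials):.3f}")
# As N grows, the average converges on the signal (0.1)
# and the spread falls roughly as 1/sqrt(N).
```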

Deacon's interest in the etymology of words is fascinating, but his love of symbols leads him to use neologisms that make his sentences too dense, often obscuring his excellent ideas. He does provide a glossary of his newly coined terms, but these are difficult to keep in mind while reading his text.

For example, Deacon uses homeodynamic for his first level instead of the standard term thermodynamic, which he does use occasionally and which would have been clearer. Then, instead of morphodynamic for the second level, where information structures appear, he might have used negentropic (implying Shannon entropy and information creation). For his third level, teleodynamic is fine, but I'd have chosen the well-known term teleonomic, suggested by Colin Pittendrigh and used by Ernst Mayr and Jacques Monod, whose Nobel co-laureate François Jacob said that "the goal of every cell is to become two cells."

Deacon's triadic levels (compare Peirce and Hegel)
  • Homeodynamic - a dynamic process in which a system is approaching thermodynamic equilibrium - perhaps thermodynamic, which Deacon sometimes uses, would be clearer?
  • Morphodynamic - describes a system spontaneously organizing, lowering its entropy, increasing information structures
  • Teleodynamic - two morphodynamic systems, one self-organizing, the other autocatalytic, which together exhibit an internal purpose - an "end" or "telos," namely to use the flow through them of negative entropy (matter and energy), enabling them to act (pragmatically) to maintain themselves. (One might ask what exactly it is about Deacon's combination of two systems that adds the telos. Both "self"-organizing and "auto"-catalytic systems exhibit what Howard Pattee calls the self/non-self distinction or "epistemic cut.")

Deacon objects to calling his third level teleonomic, a term that was created explicitly to remove the theological "intelligent design" elements of the term teleological.

Deacon defines teleonomic as "teleological in name only" (see glossary below), which is odd considering the term's history in biology: it was introduced by Colin Pittendrigh in 1958, used by Jacques Monod in his 1971 Chance and Necessity, and clarified by Ernst Mayr in his 1974 article "Teleological and Teleonomic: A New Analysis," his 1988 book Toward a New Philosophy of Biology, and his 1992 article "The Idea of Teleology."

What does Deacon add into his teleodynamic that goes beyond teleonomic? He defines his teleodynamic as "exhibiting end-directedness" and then adds the highly specific and technical criteria "consequence-organized features constituted by the co-creation, complementary constraint, and reciprocal synergy of two or more strongly coupled morphodynamic processes."

Deacon's major work is to model computer-based simulations of these combined morphodynamic processes to better understand their properties, so he is entitled to his technical definitions, if they are essential to his dynamical computational models.

His current major goal is to understand how his simple autogen model can combine with information theory to explain the concepts of "reference" and "significance." He variously defines reference as "aboutness" or "re-presentation," the semiotic or semantic relation between a sign-vehicle and its object. He describes significance as the pragmatic dimension of "value," "normativity," "purpose," "interpretation," "function," "usefulness," "end-directed," and "goal-state." "Work is the relevant measure when it comes to assessing the usefulness of information," he says.

Reference is the simple connection between an abstract idea (re-presentation in the mind) and its material (or conceptual) object. In linguistics, reference is the semantic connection between a word and its (dictionary) meaning. For Saussure it is independent of context or Peircean interpretation. Deacon may take a reference as not involving any physical work.

Roman Jakobson added "context" to Claude Shannon's theory of communications, which understandably ignored the "meaning" in a message to study only channel capacity.

By contrast, significance is the pragmatic or functional value of an idea or a sign/symbol when it is interpreted in context by an agent (Peirce's interpretant). The agent must act on a meaningful message (where "meaning" is now not merely the standard reference of the symbol, but what the message means in the context of the future behavior of the agent, e.g., love or hate? fight or flight?). An action normally involves physical work, as Deacon correctly notes. And in the context of his purely dynamical, arguably inanimate, autogen, that is appropriate. But for animals and humans, pragmatic consequences may only generate internal thoughts, ideas, judgments, emotions, feelings, and desires, which in turn generate possibilities for willful actions after careful evaluation and decision.

In any case, Deacon is right to distinguish reference and significance (as semantics and pragmatics) and try to understand them in terms of his two morphodynamic processes. But is he right to say that teleodynamics is in some way more purposeful than a teleonomic process, a process that has its purpose "built-in," what Aristotle called "entelechy" (from the Greek en-telos-echein, in-purpose-have)?

Let's carefully read Deacon's difference (that makes a difference) between teleonomic and teleodynamic proposed for his 2015 workshop. He describes

a long-standing debate in the natural sciences over the role of teleology in scientific explanations. This debate was presumed settled in the middle of the last century with the development of cybernetic models of goal-directed behavior, such as in guidance systems and adaptive computer algorithms. Systems organized in this way are described as teleonomic rather than teleologic, to indicate that no intrinsic representation of an end is responsible for this behavior, only a systemic deviation-minimizing regulatory mechanism. In contrast, we argue that an interpretive process can only be adequately defined with respect to a process that is organized so that the goal-state contributes to the maintenance of the system with the disposition to attain that state, not merely some arbitrary physical state of things. Deacon (2009, 2012) terms this a teleodynamic process.
Deacon's glossary also contains Ernst Mayr's teleomatic, a term Mayr used to single out systems that are purely mechanical and dynamical, obeying physical laws. Can this include a thermostat (Deacon's "systemic deviation-minimizing regulatory mechanism"), with its appearance of goal-directed behavior? Deacon defines teleomatic as "Automatically achieving an end, as when a thermodynamic system develops toward equilibrium or gravity provides the end state for a falling rock."

Let's review Mayr's careful and important distinction between teleological, teleonomic, and teleomatic:

What is teleology, and to what extent is it a valid concept? These have been burning questions since the time of Aristotle. Kant based his explanation of biological phenomena, particularly of the perfection of adaptations, on teleology — the notion that organisms were designed for some purpose...And the numerous autogenetic theories of evolution, such as orthogenesis, nomogenesis, aristogenesis, and the omega principle (Teilhard de Chardin), were all based on a teleological world view. Indeed, as Jacques Monod (1971) rightly stressed, almost all of the most important ideologies of the past and the present are built on a belief in teleology.

It is my belief that the pervasive confusion in this subject has been due to a failure to discriminate among very different processes and phenomena, all labeled "teleological." The most important conclusion of the recent research on teleology is that it is illegitimate to extrapolate from the existence of teleonomic processes (that is, those directed or controlled by the organism's own DNA) and teleomatic processes (those resulting from physical laws) to an existence of cosmic teleology. There is neither a program nor a law that can explain and predict biological evolution in any teleological manner. Nor is there, since 1859, any need for a teleological explanation: The Darwinian mechanism of natural selection with its chance aspects and constraints is fully sufficient.

The study of genetics has shown that seemingly goal-directed processes in a living organism (teleonomic processes) have a strictly material basis, being controlled by a coded genetic program.

We must distinguish the non-physical from the immaterial. Information is physical but immaterial.
Deacon has now given us a specific model for the locus of the telos. He says that the first material particles, the first atoms forming molecules, the first stars, and so on, can be explained without reference to anything non-physical. But since these are formed by what he calls morphodynamic processes, they must also involve some immaterial information generation. They are information structures. Information philosophy shows that without the expansion of the universe and ontological chance arising from quantum uncertainty, no new information could have come into existence from an assumed original state of thermodynamic equilibrium. There would be no galaxies, no stars, no planets, no life, no minds, no creative new thoughts, and in particular, no telos.

Why does Deacon describe nature as incomplete? Because information seems non-physical (it is actually physical, just not material), he says, we lack a scientific understanding of how words and sentences refer to atoms of meaning. The meanings of words and thoughts, the contents of the mind — especially goals and purposes — are "not present," he says. He reifies this absence and says cryptically that "a causal role for absence seems to be absent from the natural sciences." He calls this a "figure/ground reversal" in which he focuses on what is absent rather than present, likening it to the concept of zero, the holes in the "(w)hole." We can agree with Deacon that ideas and information are immaterial, neither matter nor energy, but they need matter to be embodied and energy to be communicated. And when they are embodied, they are obviously present (to my mind) — in particular, as those alternative possibilities (merely potential information) in a Shannon communication, those possibilities that are never actualized.

A review in the journal BioScience of Deacon's Incomplete Nature.

Deacon on Information (from Incomplete Nature)
TWO ENTROPIES (see our Entropy Flows)

To the extent that regularity and constraint provide a necessary background, for deviation and absence to stand out, nature's most basic convergent regularity must provide the ultimate ground for information. This regularity is of course the spontaneous tendency for entropy to increase in physical systems. Although Rudolf Clausius coined the term entropy in 1865, it was Ludwig Boltzmann who in 1866 recognized that this could be described in terms of increasing disorder. We will therefore refer to this conception of thermodynamic entropy as Boltzmann entropy.

This reliably asymmetric habit of nature provides the ultimate background with respect to which an attribute of one thing can exemplify an attribute of something else. The reason is simple: since non-correlation and disorder are so highly likely, any degree of orderliness of things typically means that some external intervention has perturbed them away from this most probable state. In other words, this spontaneous relentless tendency toward messiness provides the ultimate slate for recording outside interference. If things are not in their most probable state, then something external must have done work to divert them from that state.

A second use of the term entropy has become widely applied to the assessment of information, and for related reasons. In the late 1940s, the Bell Lab mathematician Claude Shannon demonstrated that the most relevant measure of the amount of information that can be carried in a given medium of communication (e.g., in a page of print or a radio transmission) is analogous to statistical entropy. According to Shannon's analysis, the quantity of information conveyed at any point is the improbability of receiving a given transmitted signal, determined with respect to the probabilities of all possible signals that could have been sent. Because this measure of signal options is mathematically analogous to the measure of physical options in thermodynamic entropy, Shannon also called this measure the "entropy" of the signal source. I will refer to this as Shannon entropy to distinguish it from thermodynamic entropy (though we will later see that they are more intimately related than just by analogy).

Consider, for example, a coded communication sent as a finite string of alphanumeric characters. If each possible character can appear with equal probability at every point in the transmission, there is maximum uncertainty about what to expect. This means that each character received reduces this uncertainty, and an entire message reduces the uncertainty with respect to the probability that any possible combination of characters of that length could have been sent. The amount of the uncertainty reduced by receiving a signal is Shannon's measure of the maximum amount of information that can be conveyed by that signal.

In other words, the measure of information conveyed involves comparison of a received signal with respect to possible signals that could have been sent. If there are more possible character types to choose from, or more possible characters in the string, there will be more uncertainty about which will be present where, and thus each will potentially carry more information. Similarly, if there are fewer possible characters, fewer characters comprising a given message, or if the probabilities of characters appearing are not equal, then each will be capable of conveying proportionately less information. Shannon's notion of entropy can be made quite precise for analysis of electronic transmission of signals and yet can also be generalized to cover quite mundane and diverse notions of possible variety. Shannon entropy is thus a measure of how much information these media can possibly carry. Because it is a logical, not a physical, measure, it is widely realizable. It applies as well to a page of text as to the distribution of objects in a room, or the positions that mercury can occupy in a thermometer. Since each object can assume any of a number of alternative positions, each possible configuration of the collection of objects is a potential sign.
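
To restate the arithmetic of this passage with the standard formula (an illustration added here, not Deacon's own example): with K equally probable characters, each position carries log2(K) bits, so an N-character message can convey at most N x log2(K) bits, and unequal character frequencies lower the per-character entropy.

```python
import math
from collections import Counter

def entropy_bits(probabilities):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# 26 equiprobable letters: maximum uncertainty per character position.
K, N = 26, 10
per_char = entropy_bits([1 / K] * K)
print(f"uniform alphabet: {per_char:.2f} bits/char, "
      f"{N * per_char:.1f} bits for a {N}-character message")

# Non-uniform character frequencies carry less entropy per character.
text = "in the beginning was the word"
counts = Counter(text)
total = sum(counts.values())
print(f"this short text:  {entropy_bits([c / total for c in counts.values()]):.2f} bits/char")
```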

What is "absent" for Deacon are all the unchosen alternative possibilities, without which no new information is created. Compare Stuart Kauffman's "adjacent possibles."
Shannon's analysis of information capacity provides another example of the critical role of absence. According to this way of measuring information, it is not intrinsic to the received communication itself; rather, it is a function of its relationship to something absent — the vast ensemble of other possible communications that could have been sent, but weren't. Without reference to this absent background of possible alternatives, the amount of potential information of a message cannot be measured. In other words, the background of unchosen signals is a critical determinant of what makes the received signals capable of conveying information. No alternatives = no uncertainty = no information.[Our emphasis] Thus Shannon measured the information received in terms of the uncertainty that it removed with respect to what could have been sent.

The analogy to thermodynamic entropy breaks down, however, because Shannon's concept is a logical (or structural) property, not a dynamical property. For example, Shannon entropy does not generally increase spontaneously in most communication systems, so there is no equivalent to the second law of thermodynamics when it comes to the entropy of information. The arrangement of units in a message doesn't spontaneously "tend" to change toward equiprobability. And yet something analogous to this effect becomes relevant in the case of real physically embodied messages conveyed by real mechanisms (such as a radio transmission or a computer network). In the real world of signal transmission, no medium is free from the effects of physical irregularities and functional degradation, an unreliability resulting from the physical effects of the second law.

So both notions of entropy are relevant to the concept of information, though in different ways. The Shannon entropy of a signal is the probability of receiving a given signal from among those possible; and the Boltzmann entropy of the signal is the probability that a given signal may have been corrupted.

A transmission affected by thermodynamic perturbations that make it less than perfectly reliable will introduce an additional level of uncertainty to contend with, but one that decreases information capacity. An increase in the Boltzmann entropy of the physical medium that constitutes the signal carrier corresponds to a decrease in the correlation between sent and received signals. Although this does not decrease the signal entropy, it reduces the amount of uncertainty that can be removed by a given signal, and thus reduces the information capacity.

This identifies two contributors to the entropy of a signal — one associated with the probability of a given signal being sent and the other associated with a given signal being corrupted. This complementary relationship is a hint that the physical and informational uses of the concept of entropy are more than merely analogous. By exploring the relationship between Shannon entropy and Boltzmann entropy, we can shed light on the reason why change in Shannon entropy is critical to information. But the connection is subtle, and its relation to the way that a signal conveys its information is even subtler.
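
The interplay Deacon describes can be made quantitative with the textbook binary symmetric channel (a standard example, not his): physical noise that flips each bit with probability p leaves the entropy of the received signal at one bit, yet reduces the uncertainty a received bit can remove about what was sent, from 1 bit down to 1 - H(p).

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Binary symmetric channel with equiprobable input bits: the received signal's
# entropy stays at 1 bit, but the information actually conveyed per bit
# (the channel capacity) is 1 - H(p) and shrinks as corruption grows.
for flip_probability in (0.0, 0.01, 0.1, 0.5):
    capacity = 1.0 - binary_entropy(flip_probability)
    print(f"flip probability {flip_probability:4.2f}: {capacity:.3f} bits conveyed per bit sent")
```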

Deacon adds something significant to his analysis of the two entropies and the connection to three levels of semiotics - syntax, semantics, and pragmatics (which he associates with Shannon, Boltzmann, and Darwin). He identifies three general rules about the nature of information and its relationship to the material-energetic processes on which it is dependent:
1) Information potential: Information is dependent on the physical features of a communication channel or (more generally) a sign medium and so the capacity of that channel or medium to assume different states (its maximum possible Shannon entropy) determines the maximum amount of information it can convey.

2) Physical basis of information: The Shannon entropy of a communication channel or sign medium is a function of the variety of states it can assume along with the degree of their causal independence from one another. This in turn can be described in terms of Boltzmann entropy.

3) Information as absence: The maximum potential information that a signal or sign can convey must be measured with respect to signals or signs that were not produced. It can only be defined and quantified with respect to the probability of these unrealized possibilities. Even in noisy conditions where an unreliable medium does not allow complete reduction of uncertainty from the maximum Shannon entropy, any degree of reduction provides a measurable level of information.

He says that the basis for the interdependence of Shannon and Boltzmann entropy can be stated in simple form as follows: a reduction of either Shannon or Boltzmann entropy does not tend to occur spontaneously, so when it does occur it is evidence of the intervention of an external influence. (Shannon-Boltzmann-Darwin: redefining information, Part 1, p.17) He summarizes the relation between the entropies:
The analysis so far has exposed a common feature of both the logic of information theory (Shannon) and the logic of thermodynamic theory (Boltzmann). This not only helps explain the analogical use of the entropy concept in each but also explains why it is necessary to link these approaches into a common theory to begin to define the referential function of information. Both these formal commonalities and the basis for their unification into a theory for the referential ground of information depend on a focus on a relationship to absence. In the case of classic information theory, the improbability of receiving a given sign or signal with respect to the background expectation of its receipt compared to other options defines the measure of potential information. In the case of classic thermodynamics, the improbability of being in some far-from-equilibrium state is a measure of its potential to do work, and also a measure of the work that was necessarily performed to shift it into this state.
Deacon on Free Will (from Incomplete Nature)
THE LOCUS OF AGENCY

Perhaps the most enigmatic feature of self is its role as agent: as the locus and initiator of non-spontaneous physical changes in the world around it. This is often confused with the age-old problem of explaining the possibility of free will in a deterministic world. However, it is different in a number of important respects. Self as agent is indeed what philosophers struggling with the so-called free will paradox should be focused on, rather than freedom from determinate constraint. Determinate causality is in fact a necessary condition for the self to become the locus of physical work. An agent is a locus of work that is able to change things in ways more concordant with internally generated ends and contrary to extrinsic tendencies.

Approaching the self-dynamics of mental agency using this same framework, we need to look to the closure of the teleodynamic constraint generation process for the locus of the capacity to do self-initiated work. For the simplest autogenic process, this closure is constituted by a complex synergy between morphodynamic processes that makes possible both the generation of constraints and also their maintenance and replication. The teleodynamics that distinguishes the agency of organisms from mere physical work is a product of this closed reciprocity of form- (i.e., constraint-) generating processes. Specific forms of work are made possible by the imposition of specific forms of constraint, and the way this channels spontaneous change, via the expenditure of energy. So this defining dynamic of organisms amounts to the incessant generation of the capacity to perform specific forms of work to alter the surrounding milieu in ways that are determined by this locus of teleodynamics, irrespective of extrinsic causal tendencies. This persistent capacity to generate and maintain self-perpetuating constraints is therefore at the same time the creation of a locus of the capacity to do self-promoting work.

Free Will from The Symbolic Species
Such Stuff as Dreams Are Made On

Thirty spokes share the wheel's hub,
but it is the hole in the center that provides its usefulness.
—Lao Tsu, from the Tao Te Ching
Ends

As a species, we seem to be preoccupied with ends, in all senses of the word. We organize our actions around imagined extrapolations of the consequences they will produce. We struggle in vain to comprehend the implications of our own impending cessation of life. And we weave marvelously elaborate and beautifully obscure stories to fill our need to find purpose in the fabric of the universe. This fills no obvious adaptive need. Our evolution never included selection favoring anything like this intense and desperate drive. And yet it is so powerful as to be able to overcome some of the most irresistible predispositions that evolution has provided. If we are language savants compared to other species, then a preoccupation with ends is the special exaggerated compulsion that complements our unique gift.

Symbolic analysis is the basis for a remarkable new level of self-determination that human beings alone have stumbled upon. The ability to use virtual reference to build up elaborate internal models of possible futures, and to hold these complex visions in mind with the force of the mnemonic glue of symbolic inference and descriptive shorthands, gives us unprecedented capacity to generate independent adaptive behaviors. Remarkable abstraction from indexically bound experiences is further enhanced by the ability of symbolic reference to pick out tiny fragments of real world processes and arrange them as buoys to chart an inferential course that predicts physical and social events. The price we pay for this is that our symbolically mediated actions can often be in conflict with motivations to act that arise from more concrete and immediate biological sources. Arguments in support of the classic notion of free will frequently cite this capacity to use reason (that is, symbolic inference and model building) to overcome desire and compulsion. One might respond that calling some actions "free" and others not oversimplifies what is really only a matter of the degree of the strengths of competing compulsions to act, some compulsions arising from autonomic and hormonal sources and others from our imagined satisfaction at reaching a symbolized goal. But there is an important sense in which these competing compulsions are not equal.

Those that arise from purely physiological sources, or physiological sources mediated by conditioned associations, could be called bottom-up processes for producing action. They are much more tied to mechanism and thus exhibit few degrees of freedom and limited spontaneous variation. They are comparatively predictable, though any organismic process inevitably exhibits tangled paths of causality.

Although deterministic chaos and complexity theories do not generate "unconstrained and compulsion-free" alternative possibilities, Deacon seems to recognize the need for a "vast variety of alternatives" for action.
But symbolically mediated compulsions to act are far more chaotic, in the technical sense of that word, far more susceptible to the influence of tiny initial differences in starting assumptions or ways of dividing up experiences and qualities symbolically. This is because symbolically mediated models of things — whether theories, stories, or just rationally argued predictions — exhibit complicated nonlinearity and recursive structure as well as nearly infinite flexibility and capacity for novelty due to their combinatorial nature. It is not so much that our actions arise from a totally unconstrained and compulsion-free center of intentions, but that the potential starting point, the intended purpose we have modeled, can be drawn from such a vast variety of alternatives with little initial difference in motive power.

Final causality, according to Aristotle, is exhibited when processes are driven not by antecedent physical conditions but by ends. In some ways this is like time reversed. In hindsight it is easy to infer that certain past conditions were necessitated by the way things turned out. Deductive inference is a lot like this sort of reflective inversion of temporal and physical order. The consequence is already implicitly included in the premises. In symbolic thinking, this results in what might be called a sort of symbolic compulsion. Certain statements compel certain others. Aristotle reserved another term for such compulsion — formal causality — but I think there is an important way that this links to the other, classic conception of cause in terms of the ways that symbols work. Little of our reasoning is so precise as to be called deductive, and yet the way that certain beliefs compel others can have nearly this force. Ideologies, religions, and just good explanations or stories thus exert a sort of inferential compulsion on us that is hard to resist because of their mutually reinforcing deductive and inductive links. Our end-directed behaviors are in this way often derived from such "compulsions" as are implicit in the form that underlies the flow of inferences. So one might say that thinking in symbols is a means whereby formal causes can determine final causes. The abstract nature of this source makes for a top-down causality, even if implemented on a bottom-up biological machine. Though the evolution of brains has been about systems for modeling and predicting events in the world, the evolution of symbolic abilities has not just amplified this ability far beyond that in any other species, it has also introduced an insidiously inverted modeling tendency. The symbolic capacity seems to have brought with it a predisposition to project itself into what it models. The savant, instead of seeing a field of wildflowers, sees 247 flowers.

The human brain is the biological information processor par excellence, quite unlike the digital computer.
Similarly, we don't just see a world of physical processes, accidents, reproducing organisms, and biological information processors churning out complex plans, desires, and needs. Instead, we see the handiwork of an infinite wisdom, the working out of a divine plan, the children of a creator, and a conflict between those on the side of good and those on the side of evil. We carry a nagging doubt about anything really being accidental. Co-incidence isn't just coincidence, it's a sign, and bad luck and disease don't just happen, perhaps a sorcerer has wished harm on the village. Wherever we look, we expect to find purpose. All things can be seen as signs and symbols of an all-knowing consciousness at work, or the marks of mythical events that occur in a dreamtime, behind the scenes of the universe. We are not just applying symbolic interpretations to human words and events; all the universe has become a symbol.

This is the evidence that we have become symbolic savants in the deeper sense of that metaphor. We are not just a species that uses symbols. The symbolic universe has ensnared us in an inescapable web. Like a "mind virus," the symbolic adaptation has infected us, and now by virtue of the irresistible urge it has instilled in us to turn everything we encounter and everyone we meet into symbols, we have become the means by which it unceremoniously propagates itself throughout the world.

It is clear that we feel more comfortable in a world that is meaningful, living a life that has meaning. The alternative is somehow too frightening. But why? Why should the ability to acquire symbolic abilities and conceive of things symbolically also bring with it a powerful urge to see it in every conceivable context? It could be seen as part of the predisposition to acquire symbols in the first place, part of the overdesign of the mind to ensure that symbols get discovered. But I think it may be a more mundane feature of cognitive and sensorimotor biases in general. The autistic savant is in this way no different from the kitten that sees every small mobile object as a representative prey toy, or the baby who interacts with every holdable object as a thing to be put into the mouth — for reasons that probably flow ineluctably from the Darwinian-competitive structure of neural information processing. Brains are spontaneously active biological computers in which activity patterns incessantly compete for wider expression throughout each network. Under these conditions, the dominant operation simply runs on its own and assimilates whatever is available. In us, this appears to be the expression of what I have called front-heavy cognition, driven by an overactive, busybody prefrontal cortex. It gets expressed as a need to recode our experiences, to see everything as a representation, to expect there to be a deeper hidden logic. Even when we don't believe in it, we find ourselves captivated by the lure of numerology, astrology, or the global intrigue of conspiracy theories. This is the characteristic expression of a uniquely human cognitive style; the mark of a thoroughly symbolic species.

One of the essentially universal attributes of human culture is what might be called the mystical or religious inclination. There is no culture I know of that lacks a rich mythical, mystical, and religious tradition. And there is no culture that doesn't devote much of this intense interpretive enterprise to struggling with the very personal mystery of mortality. Knowledge of death, of the inconceivable possibility that the experiences of life will end, is a datum that only symbolic representation can impart. Other species may experience loss, and the pain of separation, and the difficulty of abandoning a dead companion; yet without the ability to represent this abstract counterfactual (at least for the moment) relationship, there can be no emotional connection to one's own future death. But this news, which all children eventually discover as they develop their symbolic abilities, provides an unbidden opportunity to turn the naturally evolved social instinct of loss and separation in on itself to create a foreboding sense of fear, sorrow, and impending loss with respect to our own lives, as if looking back from an impossible future. No feature of the limbic system has evolved to handle this ubiquitous virtual sense of loss. Indeed, I wonder if this isn't one of the most maladaptive of the serendipitous consequences of the evolution of symbolic abilities. What great efforts we exert trying to forget our future fate by submerging the constant angst with innumerable distractions, or trying to convince ourselves that the end isn't really what it seems by weaving marvelous alternative interpretations of what will happen in "the undiscovered country" on the other side of death.

In many ways this is the source both of what is most noble and most pathological in human behaviors. Supported by these interpretations, reason can recruit the strength to face the threat of emptiness in the service of shared values and aspirations. But the dark side of religious belief and powerful ideology is that they so often provide twisted justifications for arbitrarily sparing or destroying lives. Their symbolic power can trap us in a web of oppression, as we try through ritual action and obsessive devotion to a cause to maintain a psychic safety net that protects us from our fears of purposelessness. The interaction of symbolic cultural evolution and unprepared biology has created some of the most influential and virulent systems of symbols the world has ever known. Few if any societies have ever escaped the grip of powerful beliefs that cloak the impenetrable mystery of human life and death in a cocoon of symbolism and meaning. The history of the twentieth century, like all those recorded before it, is sadly written in the blood that irreconcilable symbol systems have spilt between them. Perhaps this is because the savantlike compulsion to see symbols in everything reaches its most irresistible expression when it comes to the symbolization of our own lives' end. We inevitably imagine ourselves as symbols, as the tokens of a deeper discourse of the world. But symbols are subject to being rendered meaningless by contradiction, and this makes alternative models of the world direct threats to existence.

Almost certainly this is one of the other defining features of the human mentality: an ever present virtual experience of our own loss. And yet we know so little about what it is that we fear to lose. Perhaps if we understood this symbolic compulsion, and the consciousness it brings with it, we might find this emptiness at the center a bit less disturbing.

Deacon's Glossary

A vital tool needed to understand the book Incomplete Nature.

Absential: The paradoxical intrinsic property of existing with respect to something missing, separate, and possibly nonexistent. Although this property is irrelevant when it comes to inanimate things, it is a defining property of life and mind; elsewhere (Deacon 2005) described as a constitutive absence.

Attractor: An attractor is a "region" within the range of possible states that a dynamical system is most likely to be found within. The behavior of a dynamical system is commonly modeled as a complex "trajectory of states leading to states" within a phase space (typically depicted as a complex curve in a multidimensional graph). The term is used here to describe one or more of the quasi-stable regions of dynamics that a dynamical system will asymmetrically tend toward. Dynamical attractors include the state of equilibrium of a thermodynamic system, the self-organized global regularity converged upon by a morphodynamic process, or the metabolic maintenance and developmental trajectory of an organism (a teleodynamic system). An attractor does not "attract" in the sense of a field of force; rather it is the expression of an asymmetric statistical tendency.

Autocatalysis: A set of chemical reactions can be said to be "collectively autocatalytic" if a number of those reactions produce, as reaction products, catalysts for enough of the other reactions that the entire set of chemical reactions is self-sustaining, given an input of energy and substrate molecules. This has the effect of producing a runaway increase in the molecules of the autocatalytic set at the expense of other molecular forms, until all substrates are exhausted.
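
A minimal numerical caricature of this runaway-and-saturate behavior (a sketch with made-up rate constants, not a model of any real chemistry): if the autocatalyst grows at a rate proportional to both its own amount and the remaining substrate, growth is explosive at first and halts once the substrate is exhausted.

```python
# Euler integration of dA/dt = k*A*S and dS/dt = -k*A*S:
# autocatalyst A converts substrate S into more A.
k, dt = 0.01, 0.1        # assumed rate constant and time step
A, S = 1.0, 100.0        # assumed initial amounts

for step in range(1, 201):
    rate = k * A * S
    A += rate * dt
    S -= rate * dt
    if step % 20 == 0:
        print(f"t = {step * dt:5.1f}   autocatalyst = {A:7.2f}   substrate = {S:7.2f}")
# The output traces a logistic-like curve: slow start, runaway growth,
# then a plateau as the substrate runs out.
```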

Autocell: A minimal molecular teleodynamic system (termed an autogen in this book), consisting of mutually reinforcing autocatalytic and molecular self-assembly processes, first described in Deacon 2006a.

Autogen: A self-generating system at the phase transition between morphodynamics and teleodynamics; any form of self-generating, self-repairing, self-replicating system that is constituted by reciprocal morphodynamic processes

Autogenic: Adjective describing any process involving reciprocally reinforcing morphodynamic processes that thereby has the potential to self-reconstitute and/or reproduce

Autogenesis: The combination of self-generation, self-repair, and self-replication capacities that is made possible by teleodynamic organization; the process by which reciprocally reinforcing morphodynamic processes become a self-generating autogen.

Boltzmann entropy: A term used in this work to indicate the traditional entropy of thermodynamic processes. It is distinguished from "entropy" as defined by Claude Shannon for use in information theory.

Casimir effect: When two metallic plates are placed facing each other a small distance apart in a vacuum, an extremely tiny attractive force can be measured between them. Quantum field theory interprets this as the effect of fluctuating electromagnetic waves that are present even in empty space

Chaos theory: A field of study in applied mathematics that studies the behavior of dynamical systems that tend to be highly sensitive to initial conditions; a popular phrase for this sensitivity is the "butterfly effect." Although such systems can be completely deterministic, they become increasingly unpredictable over time. This is often described as deterministic chaos. Though unpredictable in detail, such systems may nevertheless exhibit considerable constraint in their trajectories of change. These constrained trajectories are often described as attractors
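
The sensitivity described here is easy to demonstrate with the logistic map, a standard textbook example of deterministic chaos (an illustration added here, not from Deacon): two trajectories that start a billionth apart become completely uncorrelated within a few dozen iterations, even though the rule is fully deterministic and both trajectories stay within the same bounded interval.

```python
def logistic(x, r=4.0):
    """One step of the logistic map x -> r * x * (1 - x)."""
    return r * x * (1 - x)

x1, x2 = 0.2, 0.2 + 1e-9   # two nearly identical initial conditions

for n in range(1, 51):
    x1, x2 = logistic(x1), logistic(x2)
    if n % 10 == 0:
        print(f"n = {n:2d}   x1 = {x1:.6f}   x2 = {x2:.6f}   |difference| = {abs(x1 - x2):.2e}")
# Completely deterministic, yet by n ~ 40 the two trajectories are unrelated,
# while both remain confined to the interval [0, 1].
```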

Complexity theory: A field of study in applied mathematics concerned with systems of high-dimensionality in structure or dynamics, such as those generated by non-linear processes and recursive algorithms, and including systems exhibiting deterministic chaos. The intention is to find ways to model physical and biological systems that have otherwise been difficult to analyze and model

Constitutive absence: A particular and precise missing something that is a critical defining attribute of "ententional" phenomena, such as functions, thoughts, adaptations, purposes, and subjective experiences.

Constraint: The state of being restricted or confined within prescribed bounds. Constraints are what is not there but could have been. The concept of constraint is, in effect, a complementary concept to order, habit, and organization, because something that is ordered or organized is restricted in its range and/or dimensions of variation, and consequently tends to exhibit redundant features or regularities. A dynamical system is constrained to the extent that it is restricted in its degrees of freedom to change and exhibits attractor tendencies. Constraints can originate intrinsically or extrinsically to the system that is thereby constrained.

Contragrade: Changes in the state of a system that must be extrinsically forced because they run counter to orthograde (aka spontaneous) tendencies

Cybernetics: A discipline that studies circular causal systems, where part of the effect of a chain of causal events returns to influence causal processes further back up the chain. Typically, a cybernetic system moves from action, to sensing, to comparison with a desired goal, and again to action

Eliminative materialism: The assumption that all reference to ententional phenomena can and must be eliminated from our scientific theories and replaced by accounts of material mechanisms

Emergence: A term used to designate an apparently discontinuous transition from one mode of causal properties to another of a higher rank, typically associated with an increase in scale in which lower-order component interactions contribute to higher-order interactions. The term has a long and diverse history, but throughout this history it has been used to describe the way that living and mental processes depend upon chemical and physical processes, yet exhibit collective properties not exhibited by non-living and non-mental processes, and in many cases appear to violate the ubiquitous tendencies exhibited by these component interactions.

Emergent dynamics: A theory developed in this book which explains how homeodynamic (e.g., thermodynamic) processes can give rise to morphodynamic (e.g., self-organizing) processes, which can give rise to teleodynamic (e.g., living and mental) processes. Intended to legitimize scientific uses of ententional (intentional, purposeful, normative) concepts by demonstrating the way that processes at a higher level in this hierarchy emerge from, and are grounded in, simpler physical processes, but exhibit reversals of the otherwise ubiquitous tendencies of these lower-level processes

Entelechy: A term Aristotle coined for a non-perceptible principle in organisms leading to full actualization of what was merely potential. It is responsible for the growth of the embryo into an adult of its species, and for the maintenance of the organism's species-specific activities as an adult

Ententional: A generic adjective coined in this book for describing all phenomena that are intrinsically incomplete in the sense of being in relationship to, constituted by, or organized to achieve something non-intrinsic. This includes function, information, meaning, reference, representation, agency, purpose, sentience, and value

Epiphenomenal: Something is epiphenomenal if it is causally irrelevant and therefore just a redescription of more fundamental physical phenomena that are responsible for all of the causal powers mistakenly attributed to the epiphenomenal feature

Functionalism: The idea that the organization of a process can have real causal efficacy in the world, independent of the specific material components that constitute it. Thus a computer algorithm can exhibit the same global causal consequences despite being run on quite different computer architectures

Fusion: A conception of emergence proposed by the philosopher Paul Humphreys, which argues that lower-level components and dynamics merge in indecomposable ways in the emergence of higher-order phenomena. It is especially relevant to the transition from quantum to classical processes. A related concept is discussed in terms of the reciprocal co-creation of the biomolecules that compose an organism's body

Golem: In Jewish folklore a golem is an animated, anthropomorphic being, created entirely from inanimate matter but lacking a soul

Homeodynamics: Any dynamic process that spontaneously reduces a system's constraints to their minimum and thus more evenly distributes system properties across space and time. The second law of thermodynamics describes the paradigm case
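
A toy sketch of this spontaneous evening-out tendency (essentially the Ehrenfest urn model, my choice of illustration, not anything from the text): particles that all begin in one compartment are moved one at a time at random, and the initial asymmetry, a constraint on where the particles are, decays toward an even distribution.

import random

# Toy "second law" sketch: N particles start all in compartment A.
# At each step one particle is picked at random and switched to the
# other compartment. The initial asymmetry spontaneously decays
# toward an even 50/50 split.

random.seed(0)
N = 1000
in_A = [True] * N          # start with every particle in compartment A

for step in range(20001):
    i = random.randrange(N)
    in_A[i] = not in_A[i]  # move one randomly chosen particle
    if step % 5000 == 0:
        count_A = sum(in_A)
        print(f"step {step}: {count_A} in A, {N - count_A} in B")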

Homunculus: Any tiny or cryptic humanlike form or creature, something slightly less than human, though exhibiting certain human attributes. In recent scientific literature, "homunculus" has also come to mean the misuse of teleological assumptions: the unacknowledged gap-fillers that stand behind, outside, or within apparently teleological processes, such as many features of life and mind, and pretend to be explanations of their function

Intentional: In common usage, an adjective describing an act that is performed on purpose. Technically, in twentieth-century philosophy of mind, it is a term deriving from the medieval Scholastics, reintroduced by the German philosopher Brentano, to designate a characteristic common to all sensations, ideas, thoughts, and desires: the fact that they are "about" something other than themselves

Mereology: Literally, the "study of partness"; in practice, the study of compositionality relationships and their related hierarchic properties

Morphodynamics: Dynamical organization exhibiting the tendency to become spontaneously more organized and orderly over time due to constant perturbation, but without the extrinsic imposition of influences that specifically impose that regularity

Multiple realizability: When the same property can be produced by diverse means; independence of certain phenomena from any of their specific constitutive material details (see also Functionalism)
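
A purely illustrative sketch of the idea (the two sorting procedures are my example, not the glossary's): the same higher-level property, a sorted sequence, is realized by algorithmically quite different lower-level processes.

# The same higher-level property (a sorted sequence) realized by two
# different lower-level procedures; an illustration of multiple
# realizability only.

def insertion_sort(items):
    out = []
    for x in items:
        i = 0
        while i < len(out) and out[i] < x:
            i += 1
        out.insert(i, x)
    return out

def merge_sort(items):
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged = []
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right

data = [5, 3, 8, 1, 9, 2]
# Different realizations, identical realized property:
print(insertion_sort(data) == merge_sort(data) == sorted(data))  # True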

Nominalism: The assumption that generalizations are merely conveniences of thought, abstracted from observation, and otherwise epiphenomenal in the world of physical cause and effect; thus a denial of the efficacy of types, classes, species, ideal forms and general properties over and above that of the individuals they describe

Orthograde: Changes in the state of a system that are consistent with the spontaneous, "natural" tendency to change, without external interference

Panpsychism: The assumption that a vestige of mental phenomenology is present in every physical event, and therefore suffused throughout the cosmos. Although panpsychism is not as influential today, and effectively plays no role in modern cognitive neuroscience, it still attracts a wide following, mostly because of a serendipitous compatibility with certain interpretations of quantum physics

Phase space: In mathematics and physics, a phase space is a space in which all possible states of a system are represented. Each possible state of the system corresponds to one unique point in the phase space. For mechanical systems, a phase space usually consists of all possible values of position and momentum
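
A small sketch of a phase-space description (illustrative only; the oscillator and its parameter values are my own choice): for a frictionless mass on a spring, each instantaneous state is the pair (position, momentum), and successive states trace a closed curve of constant energy in phase space.

import math

# Phase-space sketch for a frictionless harmonic oscillator (mass m on a
# spring of stiffness k). Each state of the system is one point (x, p);
# the conserved energy E = p**2/(2*m) + k*x**2/2 confines the trajectory
# to an ellipse in phase space. Parameter values are arbitrary.

m, k = 1.0, 4.0
omega = math.sqrt(k / m)
x0, p0 = 1.0, 0.0                       # initial position and momentum

for t in [0.0, 0.5, 1.0, 1.5, 2.0]:
    x = x0 * math.cos(omega * t) + (p0 / (m * omega)) * math.sin(omega * t)
    p = p0 * math.cos(omega * t) - m * omega * x0 * math.sin(omega * t)
    energy = p**2 / (2 * m) + k * x**2 / 2
    print(f"t={t:.1f}  state=({x:+.3f}, {p:+.3f})  E={energy:.3f}")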

Preformationism: Narrowly, the assumption that the human physique was preformed from conception. More broadly, as used here, the assumption that ententional phenomena were preformed in antecedent phenomena: that, for example, language is preformed in a universal grammar module, information is preformed in DNA, or that consciousness is preformed in the mind of God

Protected states: Insulation between levels of dynamics, in effect, micro differences that don't make a macro difference because of statistical smoothing and attractor dynamics. Introduced by the physicist Robert Laughlin to describe the causal insulation of physical processes at different levels of scale

Protocell: Any of a number of theoretical, membrane-bound multimolecular units conceived by molecular biologists as experimental or theoretical simplest possible living units, usually consisting of replicating polynucleotides within a lipid "bubble," used as possible exemplars of the precursors of life

Realism: The assumption that general properties, laws, and physical dispositions to change are fundamental facts about reality, irrespective of subjective experience, and are causally efficacious

Self-organization (Self-simplification): W. Ross Ashby (1957) defined a self-organizing system as one that spontaneously reduces its statistical entropy, but not necessarily its thermodynamic entropy, by reducing the number of its potential states. Ashby equated self-organization with self-simplification. In parallel, Ilya Prigogine explored how such phenomena can be generated by constantly changing physical and chemical conditions, thereby continually perturbing them away from equilibrium. This work augmented the notion of self-organization by demonstrating that it is a property common to many far-from-equilibrium processes; systems that Prigogine described as dissipative structures
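
A toy sketch of Ashby's statistical sense of self-simplification (my illustration, using an arbitrary contracting map with a point attractor, not anything from Ashby or Prigogine): many distinct initial states converge onto fewer and fewer states, so the statistical entropy of the ensemble shrinks over time.

from collections import Counter
from math import log2

# Toy Ashby-style self-simplification: a deterministic map with a point
# attractor pulls many distinct initial states onto fewer and fewer
# states, so the statistical (state-counting) entropy of the ensemble
# drops over time. Illustrative only.

def step(x):
    return x // 2            # simple contracting map on the integers

def ensemble_entropy(states):
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * log2(c / n) for c in counts.values())

states = list(range(64))     # 64 distinct initial states
for t in range(8):
    print(f"t={t}: {len(set(states))} distinct states, "
          f"entropy={ensemble_entropy(states):.2f} bits")
    states = [step(x) for x in states]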

Shannon entropy: A measure of the variety of possible signal configurations of a communication medium, determined as proportional to the logarithm of the number of possible states of the medium. This is an entirely general quantity that can be applied to almost any phenomenon. The designation "entropy," though initially due only to the mathematical parallel with thermodynamic entropy, is now generally thought to describe the same thing in informational terms

Shannon information: A way of measuring the information-carrying capacity of a medium in terms of the uncertainty that a received signal removes
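
For concreteness (these are the standard Shannon formulas; the function name and example numbers below are mine): the entropy of a source whose states occur with probabilities p_i is H = -Σ p_i log2 p_i bits, which for N equally likely states reduces to log2 N, and the information carried by a received signal is the amount of this uncertainty it removes.

from math import log2

# Shannon entropy H = -sum(p * log2(p)) in bits; for N equally likely
# signal states this is simply log2(N). Standard formulas, used here
# only to make the two glossary entries above concrete.

def shannon_entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# A medium with 8 equally likely configurations carries log2(8) = 3 bits.
print(shannon_entropy([1/8] * 8))          # 3.0

# A signal that rules out all but 2 of the 8 states leaves 1 bit of
# remaining uncertainty, so the signal conveyed 3 - 1 = 2 bits.
print(shannon_entropy([1/8] * 8) - shannon_entropy([1/2] * 2))   # 2.0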

Strong emergentism: The argument that emergent transitions involve a fundamental discontinuity of physical laws — cf. Weak emergentism

Supervenience: The relationship that emergent properties have to the base properties that give rise to them

Teleodynamics: A form of dynamical organization exhibiting end-directedness and consequence-organized features that is constituted by the co-creation, complementary constraint, and reciprocal synergy of two or more strongly coupled morphodynamic processes

Teleogen: A non-autonomous autogenically organized system that is a component within a larger autogenic system, such as a somatic cell within a multicellular organism or an endosymbiotic organism within an organism. Although such subordinate or lower-order nearly autogenic subsystems are not fully reciprocally closed in their dynamics, they nevertheless exhibit end-directed tendencies and normative relationships with respect to extrinsic factors

Teleogenic: A systemic property (or individuated dynamical system) constituted by a higher-order form of teleodynamic process, specifically where that teleodynamic process includes a self-referential loop of causality such that the causal properties of the individuated teleodynamic unit are re-presented in some form in the generation of teleodynamic adaptive processes

Teleomatic: Automatically achieving an end, as when a thermodynamic system develops toward equilibrium or gravity provides the end state for a falling rock

Teleonomy (Teleonomic): Teleological in name only. A terminological distinction intended to mark a middle ground between mere mechanism and purpose: behavior predictably oriented toward a particular target state, even in systems with no explicit representation of that state or intention to achieve it. [This term was invented by Colin Pittendrigh to distinguish it from teleology, and was used by Jacques Monod and Ernst Mayr. It is closely related to Aristotle's term "entelechy."]

Teleological (Teleology): Purposive, or end-directed (the study of such relationships). Philosophically related to Aristotle's concept of a "final cause"

Top-down causality: The notion that higher-order emergent phenomena can alter phenomena that they supervene upon (i.e., the components and interactions that collectively have given rise to the emergent phenomena). Usually proposed as a countervailing causal claim to the reductionist assumption that macro events and properties are entirely determined by the micro events and properties of the components that compose them. For example, some have argued that whole-brain functions, which are the product of billions of neural interactions, can alter the way individual neurons behave, and thus generate causal consequences at the neuronal level. As a temporally understood causal relationship, this is not problematic; but understood synchronically, it appears to lead to vicious regress. Another way of understanding top-down causality, due to Roger Sperry (see chapter 5), is as global constraint. Thus atoms in a wheel are constrained to move only with respect to neighboring atoms; but if the whole wheel rolls, all the atoms are caused to follow cycloid paths of movement
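
The rolling-wheel example can be made concrete with a short sketch (the radii below are illustrative values of my choosing): a particle at distance r from the axle of a wheel of radius R rolling without slipping traces the curtate cycloid x = Rθ - r sin θ, y = R - r cos θ, a path imposed on it by the motion of the whole.

import math

# Path of a single "atom" at radius r inside a wheel of radius R that
# rolls without slipping along the x-axis: a (curtate) cycloid. The
# rolling of the whole wheel constrains every atom to such a path.

R = 1.0      # wheel radius
r = 0.5      # distance of the atom from the axle

for theta in [0.0, math.pi / 2, math.pi, 3 * math.pi / 2, 2 * math.pi]:
    x = R * theta - r * math.sin(theta)   # horizontal position of the atom
    y = R - r * math.cos(theta)           # vertical position of the atom
    print(f"theta={theta:.2f}  atom at ({x:.2f}, {y:.2f})")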

Tychism: The metaphysical assumption that, at base, change is spontaneous and singular, and thus intrinsically uncorrelated

Vitalism: A theory in natural philosophy claiming that physical and chemical processes alone are insufficient to explain living organisms. An additional non-perceptible factor is necessary, which Hans Driesch (1929) called entelechy, to honor Aristotle, and Henri Bergson (1907) called élan vital. For Driesch, in its earliest stage an embryo is not manifold in an extensive sense, but there is present in it an entelechy which is "an intensive manifoldness"

Weak emergentism: The argument that although in emergent transitions there may be a superficially radical reorganization, the properties of the higher and lower levels form a continuum, with no new laws of causality emerging. Often associated with epistemological emergentism because it is attributed to incomplete knowledge of the critical causality


Selected Papers
Reference and Significance of Information (2013)

Prolegomenon for a formal theory of referential information (2014)

Complexity and Dynamical Depth (2012)

Complexity and Dynamical Depth (2014)

Shannon - Boltzmann - Darwin: Redefining information (Part I)

Shannon - Boltzmann - Darwin: Redefining information (Part II)

What Is Missing from Theories of Information

Exploring Constraint: Simulating Self-Organization and Autogenesis in the Autogenic Automaton

The transition from constraint to regulation at the origin of life

Papers at Deacon's 2015 Biosemiotics Workshop
Steps to a Science of Semiotics

His Slides

Prolegomenon for a Formal Theory

Information Philosopher review of Incomplete Nature for BioScience (2012)

Information Philosopher Presentation on Biosemiotics

For Scholars
inexist, To exist within.
inexistent, [LL. inexistens. See IN- in; EXISTENT.] Inherent; innate; indwelling.
inexistent, adj. [in- not + existent.] Not having being; not existing.

