Norbert Wiener

Norbert Wiener created the modern field of control and communication systems, utilizing concepts like negative feedback. His seminal 1948 book Cybernetics both defined and named the new field.

In the book, Wiener helped define the new quantitative concept of information coming out of the work of John von Neumann, Alan Turing, and Claude Shannon on computers and communications.

As Leo Szilard had done twenty years earlier, Wiener emphasized the information gained in a choice between two equiprobable alternatives, each such choice producing one "bit" (binary digit) of information.

One and all, time series [of experimental data] and the apparatus to deal with them, whether in the computing laboratory or in the telephone circuit, have to deal with the recording, preservation, transmission, and use of information. What is this information, and how is it measured? One of the simplest, most unitary forms of information is the recording of a choice between two equally probable simple alternatives, one or the other of which is bound to happen — a choice, for example, between heads and tails in the tossing of a coin. We shall call a single choice of this sort a decision. If then we ask for the amount of information in the perfectly precise measurement of a quantity known to lie between A and B, which may with uniform a priori probability lie anywhere in this range...

We may conceive this in the following way: we know a priori that a variable lies between 0 and 1, and a posteriori that it lies on the interval (a, b) inside (0, 1). Then the amount of information we have from our a posteriori knowledge is

-log2 (measure of (a, b) / measure of (0, 1))
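Wiener's measure-ratio definition is easy to compute directly. The sketch below is a minimal illustration, not code from Wiener; the interval endpoints are chosen only as examples:

```python
import math

def information_bits(a: float, b: float, lo: float = 0.0, hi: float = 1.0) -> float:
    """Information gained on learning that a variable known a priori to lie
    in (lo, hi) in fact lies in the sub-interval (a, b):
    -log2( measure of (a, b) / measure of (lo, hi) )."""
    return -math.log2((b - a) / (hi - lo))

# A single "decision" between two equiprobable halves yields exactly 1 bit,
# just like Wiener's coin toss.
print(information_bits(0.0, 0.5))   # 1.0

# Narrowing (0, 1) to a quarter of its length yields 2 bits.
print(information_bits(0.25, 0.5))  # 2.0
```

Note that the minus sign makes the quantity positive, since the a posteriori interval is always the smaller one; this is exactly the sense in which Wiener's information is the negative of an entropy.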

Wiener's definition of information as the negative of entropy led Leon Brillouin to coin the term negentropy.
The quantity we here define as amount of information is the negative of the quantity usually defined as entropy in similar situations. The definition here given is not the one given by R. A. Fisher for statistical problems, although it is a statistical definition; and can be used to replace Fisher's definition in the technique of statistics.

Wiener compared the information processing in the computers of his day to the human mind and found them both wasteful of energy. And he argued that information is neither matter nor energy.
As a final remark, let me point out that a large computing machine, whether in the form of mechanical or electric apparatus or in the form of the brain itself, uses up a considerable amount of power, all of which is wasted and dissipated in heat. The blood leaving the brain is a fraction of a degree warmer than that entering it. No other computing machine approaches the economy of energy of the brain. In a large apparatus like the Eniac or Edvac, the filaments of the tubes consume a quantity of energy which may well be measured in kilowatts, and unless adequate ventilating and cooling apparatus is provided, the system will suffer from what is the mechanical equivalent of pyrexia, until the constants of the machine are radically changed by the heat, and its performance breaks down. Nevertheless, the energy spent per individual operation is almost vanishingly small, and does not even begin to form an adequate measure of the performance of the apparatus. The mechanical brain does not secrete thought "as the liver does bile," as the earlier materialists claimed, nor does it put it out in the form of energy, as the muscle puts out its activity. Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.

Information philosophy agrees that "information is neither matter nor energy," but it needs matter for its embodiment and energy for its communication.

In his book "The Human Use of Human Beings," the Cybernetics founder saw the Devil himself increasing entropy everywhere. But he also saw the flow of negative entropy from the Sun that is the source of all life and mind on the Earth.

The Maxwell demon

We are immersed in a life in which the world as a whole obeys the second law of thermodynamics: confusion increases and order decreases. Yet, as we have seen, the second law of thermodynamics, while it may be a valid statement about the whole of a closed system, is definitely not valid concerning a non-isolated part of it.
Processes that decrease the entropy locally we call "ergodic."
There are local and temporary islands of decreasing entropy in a world in which the entropy as a whole tends to increase, and the existence of these islands enables some of us to assert the existence of progress. What can we say about the general direction of the battle between progress and increasing entropy in the world immediately about us?
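Wiener's point that the second law binds only a closed system as a whole, not its parts, can be illustrated with a toy entropy bookkeeping for heat flowing irreversibly from a hot body to a cold one. This is a standard thermodynamics sketch, not an example from Wiener's text; the heat and temperatures are illustrative:

```python
def entropy_changes(q: float, t_hot: float, t_cold: float):
    """Entropy bookkeeping for heat q (joules) flowing irreversibly from a
    reservoir at t_hot to a reservoir at t_cold (kelvin, t_hot > t_cold)."""
    ds_hot = -q / t_hot    # a local "island": this subsystem's entropy decreases
    ds_cold = q / t_cold   # a larger increase in the colder body
    return ds_hot, ds_cold, ds_hot + ds_cold

ds_hot, ds_cold, ds_total = entropy_changes(q=1000.0, t_hot=500.0, t_cold=300.0)
print(ds_hot)    # -2.0 J/K: the hot body, taken alone, violates no law by losing entropy
print(ds_total)  # positive: the whole still obeys the second law
```

The hot reservoir is the "non-isolated part" whose entropy falls; only the sum over the whole closed system must increase.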

In physics, the idea of progress opposes that of entropy, although there is no absolute contradiction between the two. In the forms of physics directly dependent on the work of Newton, the information which contributes to progress and is directed against the increase of entropy may be carried by extremely small quantities of energy, or perhaps even by no energy at all. This view has been altered in the present century by the innovation in physics known as quantum theory.

Quantum theory has led, for our purposes, to a new association of energy and information. A crude form of this association occurs in the theories of line noise in a telephone circuit or an amplifier. Such background noise may be shown to be unavoidable, as it depends on the discrete character of the electrons which carry the current; and yet it has a definite power of destroying information. The circuit therefore demands a certain amount of communication power in order that the message may not be swamped by its own energy. More fundamental than this example is the fact that light itself has an atomic structure, and that light of a given frequency is radiated in lumps which are known as light quanta, which have a determined energy dependent on that frequency.

Thus there can be no radiation of less energy than a single light quantum. The transfer of information cannot take place without a certain expenditure of energy, so that there is no sharp boundary between energetic coupling and informational coupling. Nevertheless, for most practical purposes, a light quantum is a very small thing; and the amount of energy transfer which is necessary for an effective informational coupling is quite small. It follows that in considering such a local process as the growth of a tree or of a human being, which depends directly or indirectly on radiation from the sun, an enormous local decrease in entropy may be associated with quite a moderate energy transfer. This is one of the fundamental facts of biology; and in particular of the theory of photosynthesis, or of the chemical process by which a plant is enabled to use the sun's rays to form starch, and other complicated chemicals necessary for life, out of the water and the carbon dioxide of the air.
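Wiener's claim that a light quantum is "a very small thing" can be checked numerically. The sketch below (wavelength and temperature chosen only for illustration) computes the energy E = hν = hc/λ of one visible photon and compares it with kT·ln 2, the minimum thermodynamic cost per bit associated with Szilard's and Brillouin's analyses:

```python
import math

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
K_B = 1.381e-23  # Boltzmann constant, J/K

def photon_energy(wavelength_m: float) -> float:
    """Energy of one light quantum: E = h * nu = h * c / lambda."""
    return H * C / wavelength_m

e_photon = photon_energy(550e-9)     # one green photon, ~3.6e-19 J
e_bit = K_B * 300.0 * math.log(2)    # kT ln 2 at room temperature, ~2.9e-21 J

print(f"one photon:       {e_photon:.2e} J")
print(f"kT ln 2 at 300 K: {e_bit:.2e} J")
print(f"ratio:            {e_photon / e_bit:.0f}")
```

A single photon carries a hundred-odd times the minimal per-bit cost, yet both figures are minuscule on any macroscopic scale, which is why an "enormous local decrease in entropy" can ride on "quite a moderate energy transfer."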

Thus the question of whether to interpret the second law of thermodynamics pessimistically or not depends on the importance we give to the universe at large, on the one hand, and to the islands of locally decreasing entropy which we find in it, on the other. Remember that we ourselves constitute such an island of decreasing entropy, and that we live among other such islands. The result is that the normal prospective difference between the near and the remote leads us to give far greater importance to the regions of decreasing entropy and increasing order than to the universe at large. For example, it may very well be that life is a rare phenomenon in the universe; confined perhaps to the solar system, or even, if we consider life on any level comparable to that in which we are principally interested, to the earth alone. Nevertheless, we live on this earth, and the possible absence of life elsewhere in the universe is of no great concern to us, and certainly of no concern proportionate to the overwhelming size of the remainder of the universe.

What I say about the need for faith in science is equally true for a purely causative world and for one in which probability rules. No amount of purely objective and disconnected observation can show that probability is a valid notion. To put the same statement in other language, the laws of induction in logic cannot be established inductively. Inductive logic, the logic of Bacon, is rather something on which we can act than something which we can prove, and to act on it is a supreme assertion of faith. It is in this connection that I must say that Einstein's dictum concerning the directness of God is itself a statement of faith. Science is a way of life which can only flourish when men are free to have faith. A faith which we follow upon orders imposed from outside is no faith, and a community which puts its dependence upon such a pseudo-faith is ultimately bound to ruin itself because of the paralysis which the lack of a healthily growing science imposes upon it.
On Free Will
The succession of names Maxwell-Boltzmann-Gibbs represents a progressive reduction of thermodynamics to statistical mechanics: that is, a reduction of the phenomena concerning heat and temperature to phenomena in which a Newtonian mechanics is applied to a situation in which we deal not with a single dynamical system but with a statistical distribution of dynamical systems; and in which our conclusions concern not all such systems but an overwhelming majority of them. About the year 1900, it became apparent that there was something seriously wrong with thermodynamics, particularly where it concerned radiation. The ether showed much less power to absorb radiations of high frequency—as shown by the law of Planck—than any existing mechanization of radiation theory had allowed. Planck gave a quasi-atomic theory of radiation—the quantum theory—which accounted satisfactorily enough for these phenomena, but which was at odds with the whole remainder of physics; and Niels Bohr followed this up with a similarly ad hoc theory of the atom. Thus Newton and Planck-Bohr formed, respectively, the thesis and antithesis of a Hegelian antinomy. The synthesis is the statistical theory discovered by Heisenberg in 1925, in which the statistical Newtonian dynamics of Gibbs is replaced by a statistical theory very similar to that of Newton and Gibbs for large-scale phenomena, but in which the complete collection of data for the present and the past is not sufficient to predict the future more than statistically. It is thus not too much to say that not only the Newtonian astronomy but even the Newtonian physics has become a picture of the average results of a statistical situation, and hence an account of an evolutionary process. This transition from a Newtonian, reversible time to a Gibbsian, irreversible time has had its philosophical echoes.
Bergson emphasized the difference between the reversible time of physics, in which nothing new happens, and the irreversible time of evolution and biology, in which there is always something new. The realization that the Newtonian physics was not the proper frame for biology was perhaps the central point in the old controversy between vitalism and mechanism; although this was complicated by the desire to conserve in some form or other at least the shadows of the soul and of God against the inroads of materialism. In the end, as we have seen, the vitalist proved too much. Instead of building a wall between the claims of life and those of physics, the wall has been erected to surround so wide a compass that both matter and life find themselves inside it.
Newtonian mechanics is deterministic; Gibbsian statistical mechanics is statistical, but still deterministic underneath, merely unpredictable in practice.
Ananke (necessity) and Tyche (chance) correspond to the two horns of the standard argument against free will.
It is true that the matter of the newer physics is not the matter of Newton, but it is something quite as remote from the anthropomorphizing desires of the vitalists. The chance of the quantum theoretician is not the ethical freedom of the Augustinian, and Tyche is as relentless a mistress as Ananke.
References

Wiener, N. (1950). "Progress and Entropy," chapter 2 of The Human Use of Human Beings.
