Knowledge
Metaphysics is the study of what there is, what exists, and how we know that it exists. The ancients described it as the problem of Being. We cannot know what there is without knowing how we can know anything.

Knowing how we know is the sub-discipline of metaphysics called epistemology. The study of what there is is called ontology.

Knowing how we know is a fundamentally circular problem when it is described in human language, normally as a set of logical propositions. And knowing something about what exists adds another complex circle, if the knowing being must itself be one of those things that exists.

These circular definitions and inferences need not be vicious circles. They may simply be a coherent set of ideas that we use to describe ourselves and the external world. If the descriptions are logically valid and/or verifiable empirically, we think we are approaching the "truth" about things and acquiring knowledge.

How then do we describe knowledge itself - as an existing thing in our existing minds and in the existing external world? Information philosophy does so by basing everything on the abstract but quantitative notion of information.

All information in the universe is created by a single two-step process. We call it the cosmic creation process.

In the first step, something new and different is created at random. If it was determined by the past, it would not be new information. New information is a local reduction in the entropy.

To satisfy the second law of thermodynamics, positive entropy must travel away from the new information structure, to the sink of the expanding universe. This is the second step. If it fails, the new information structure is destroyed, returning to its prior equilibrium state.

Information (or negative entropy) lies in the arrangement of matter. Boltzmann defined entropy as Boltzmann's constant k times the natural logarithm of the number W of microscopic arrangements of matter consistent with the macroscopic properties (the thermodynamic state): S = k ln W.
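
As a small numerical illustration of this formula (the microstate counts, and the use of Python, are illustrative assumptions, not part of the original text), entropy can be computed directly from a count of microstates:

    import math

    k_B = 1.380649e-23   # Boltzmann's constant in joules per kelvin

    def boltzmann_entropy(W):
        """Entropy S = k ln W for a system with W equally probable microstates."""
        return k_B * math.log(W)

    # Doubling the number of accessible arrangements adds exactly k ln 2 of entropy,
    # the same quantity associated below with a single bit of information.
    print(boltzmann_entropy(2) - boltzmann_entropy(1))   # ~9.57e-24 J/K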

For the first nine billion years or so, information structures were created by known physical forces like the combination of elementary particles into subatomic particles, then into atoms. Eventually gravitation condensed these randomly distributed atoms into astrophysical objects like the planets, stars, and galaxies.

Then, on our planet, some complex molecules accidentally began to replicate themselves. Random accidents in replication began the process of biological evolution. Some very complicated replicators began to share information with their neighbors, perhaps via atoms or small molecules secreted into the environment. This was the beginning of communication between information structures.

This is the beginning of knowledge.

Information is stored or encoded in physical and biological structures. Structures in the world build themselves, following natural laws, including physical and biological laws. Structures in the mind are partly built by biological processes and partly built by intelligence, which is free, creative, and unpredictable.

For information philosophy, knowledge is information created and stored in minds and in human artifacts like stories, books, and internetworked computers.

Information is neither matter nor energy, though it requires matter for its embodiment and energy for its communication.

Knowledge is actionable information that forms the basis for the thoughts and actions of the higher animals and humans.

Knowledge includes all the cultural information created by human societies. We call it the Sum. It includes the theories and experiments of scientists, who collaborate to establish our knowledge of the external world. Scientific knowledge comes the closest of any knowledge to being independent of any human mind, though it is still dependent on an open interdependent community of fundamentally subjective inquirers.

To the extent of the correspondence, the isomorphism, the one-to-one mapping, between information structures (and processes) in the world and representative structures and functions in the mind, information philosophy claims that we as individuals have quantifiable personal or subjective knowledge of the world.

To the extent of the agreement (again a correspondence or isomorphism) between information in the minds of an open community of inquirers seeking the best explanations for phenomena, information philosophy further claims that we have quantifiable inter-subjective knowledge of other minds and of an external world. Although science depends on their inter-subjective agreement, this is as close as we come to "objective" knowledge, to knowledge of objects, the Kantian "things in themselves." Empiricists like John Locke thought "primary" qualities of objects are inaccessible. He believed our senses are only able to receive "secondary" qualities. Information philosophy makes this a distinction without a difference.

Analytic language philosophers have a much narrower definition of knowledge. They identify it with language, logic, and human beliefs. For them, epistemology has been reduced to the "truth" of statements and propositions that can be logically analyzed and validated.

Epistemologists say persons have knowledge only 1) if a statement is true, 2) if they believe that the statement is true, and 3) if their belief is "justified," where the justification may be that the belief was the consequence of a "reliable" cognitive process, or that the belief was "caused" by the facts in the world that the belief is about.

They trace their three-part conditions for knowledge back to Plato's Theaetetus and Aristotle's Posterior Analytics. Plato did talk about opinions, which could be true or false. The true or "right" opinions could be further supported by giving an "account" of the reasons why an opinion is "true" and not "false." But, as in many Platonic dialogues, there was no resolution or agreement in the Theaetetus that these three elements could indeed produce knowledge. The Greek word Plato used for knowledge was episteme, which translates more nearly as "know how" than the "know that" associated with knowledge of the "facts" in propositions.

Our English word for knowledge comes from the Indo-European root gno-, as in the later Greek gnosis. In Greek it meant a mark or token that was familiar and immediately recognizable, associated with an act of cognition or cognizance. It gives us the word ken (our close relatives are "kin"), the German cognate kennen, and the French connaissance.

Bertrand Russell distinguished "knowledge by acquaintance" as immediate (viz. non-mediated) direct awareness of a particular thing. He contrasted such basic knowledge with knowledge of concepts, ideas, or "universals," which can be used to describe many particular things. He called this "knowledge by description." He included the sense data of "red, here, now" in immediate knowledge, knowledge we are less likely to doubt and that serves as a logical foundation.

All this works well for one idea of knowledge, but unfortunately for analytic language philosophy, the English language is philosophically impoverished, lacking another word for knowledge that is found in all other European languages, one based on words whose root means "to have seen."

Justified True Belief
Nevertheless, the modern field of epistemology has generally defined knowledge in three parts as "justified true belief," specifically the truth of beliefs about statements or propositions. For example,
S knows that P if and only if

(i) S believes that P,
(ii) P is true, and
(iii) S is justified in believing that P.
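
Read as a bare schema, the definition is simply the conjunction of three conditions. A toy sketch in code (the subject, the example propositions, and the three helper collections are hypothetical illustrations, not any epistemologist's actual apparatus) only restates that conjunction in executable form:

    # Toy model of knowledge as justified true belief (JTB); all data are invented.
    beliefs = {"S": {"2 + 2 = 4", "it will rain tomorrow"}}    # what S believes
    truths = {"2 + 2 = 4"}                                     # propositions taken as true
    justified = {"S": {"2 + 2 = 4"}}                           # beliefs S can give good reasons for

    def knows(subject, proposition):
        """S knows that P iff S believes P, P is true, and S's belief in P is justified."""
        return (proposition in beliefs.get(subject, set())           # (i) belief
                and proposition in truths                             # (ii) truth
                and proposition in justified.get(subject, set()))     # (iii) justification

    print(knows("S", "2 + 2 = 4"))              # True
    print(knows("S", "it will rain tomorrow"))  # False: believed, but neither true nor justified here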

In the long history of the problem of knowledge, all three of these knowledge or belief "conditions" have proven very difficult for epistemologists. Among the reasons...

(i) A belief is an internal mental state beyond the full comprehension of expert external observers. Even the subject herself has limited immediate access to all she knows or believes. On deeper reflection, or after consulting external sources of knowledge, she might "change her mind."

(ii) The truth about any fact in the world is vulnerable to skeptical or sophistical attack. The concept of truth should be limited to uses within logical and mathematical systems of thought. Real world "truths" are always fallible and revisable in the light of new knowledge.

(iii) The notion of justifying a belief by providing reasons is vague, circular, or leads to an infinite regress. What reasons can be given that do not themselves require further justifying reasons? In view of (i) and (ii), what value is there in a "justification" that is fallible, or, worse, false?

(iv) Epistemologists have primarily studied personal or subjective beliefs. Fearful of competition from empirical science and its method for establishing knowledge, they emphasize that justification must be based on reasons internally accessible to the subject.

(v) The emphasis on logic has led some epistemologists to claim that knowledge is closed under (strict or material) implication. This assumes that the process of ordinary knowing is informed by logic, in particular that

(Closure) If S knows that P, and P implies Q, then S knows that Q.
We can only say that S is in a position to deduce Q if she is trained in logic.

It is no surprise that epistemologists have failed in every effort to put knowledge on a sound basis, let alone establish knowledge with apodeictic certainty, as Plato and Aristotle expected and René Descartes thought he had established beyond any reasonable doubt.

Perhaps overreacting to the threat from science as a demonstrably more successful method for establishing knowledge, epistemologists have hoped to differentiate and preserve their own philosophical approach. Some have held on to the goal of logical positivism (e.g., Russell, early Wittgenstein, and the Vienna Circle) that philosophical analysis would provide an a priori normative ground for merely empirical scientific knowledge.

Logical positivist arguments for the non-inferential self-validation of logical atomic perceptions like "red, here, now" have perhaps misled some epistemologists to think that personal perceptions can directly justify some "foundationalist" beliefs.

The philosophical method of linguistic analysis (inspired by the later Wittgenstein) has not achieved much more. It is unlikely that knowledge of any kind reduces simply to the careful conceptual analysis of sentences, statements, and propositions.

Information philosophy looks deeper than the surface ambiguities of language.


Information philosophy distinguishes at least three kinds of knowledge, each requiring its own special epistemological analysis:

  • Subjective or personal knowledge, including introspection and intuition, as well as communications with and perceptions of other persons ("other minds").
  • Communal or social knowledge of cultural creations, including fiction, myths, conventions, laws, history, etc.
  • Knowledge of a mind-independent physical external world.

This last kind of knowledge is based on the "scientific method," roughly defined as a combination of

  • Systematic observations of the external world.
  • Arbitrary, even random, hypotheses (theories) that might explain the observations.
  • Logical, rational deductions from the hypotheses that make (usually quantitative) predictions about further observations.
  • Experiments (measurements) that can be reproduced by other scientists in an open-minded community of inquirers to confirm (verify) or deny (falsify) those predictions, and thus, the theories.
  • A combination of the theories to reduce their number. Theories that grow to explain greater and greater numbers of predictions are considered closer to the "truth" about reality, and are often described as "laws of nature".

The totality of scientific knowledge gives us our most reliable "information" about the world. How exactly do we acquire and maintain this knowledge?
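
One way to picture the confirm-or-falsify step in the list above is as a loop that discards hypotheses whose predictions repeatedly fail against measurement. The sketch below is only an invented illustration: the "true law," the competing candidate laws, the noise level, and the tolerance are hypothetical choices, not anything claimed in the original text.

    import random

    def nature(x):
        """The hypothetical 'true law' the inquirers are trying to find."""
        return 9.8 * x

    # Competing hypotheses (theories) proposed by the community.
    hypotheses = {
        "v = 9.8 t": lambda x: 9.8 * x,
        "v = 5.0 t": lambda x: 5.0 * x,
        "v = t**2":  lambda x: x ** 2,
    }

    # Reproducible, noisy measurements; a hypothesis survives only while its
    # predictions stay within the experimental tolerance.
    surviving = dict(hypotheses)
    for trial in range(100):
        x = random.uniform(0.0, 10.0)
        measured = nature(x) + random.gauss(0.0, 0.1)   # observation with noise
        surviving = {name: h for name, h in surviving.items()
                     if abs(h(x) - measured) < 1.0}      # falsify failed predictions

    print(sorted(surviving))   # almost always only "v = 9.8 t" remains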

When information is stored in any structure, whether in the world, in human artifacts like books and the Internet, or in human minds, two fundamental physical processes occur. These are the two steps of the cosmic creation process.

The first is the collapse of a quantum mechanical wave function, which is needed to create even a single "bit" of new information in an experimental measurement.

The second is a local decrease in entropy corresponding to the increase in information. Without this, the new bit would be erased and the system would return to equilibrium. Positive entropy, greater than the local increase in information (negative entropy), must be transferred away from the location of the new information to satisfy the second law of thermodynamics.

Leo Szilard calculated the mean value of the quantity of entropy produced by a 1-bit measurement as

S = k ln 2,

where k is Boltzmann's constant and ln is the natural logarithm. The factor of 2 reflects the binary, two-alternative decision. The amount of entropy generated by the measurement may, of course, always be greater than this fundamental minimum, but not smaller, or the second law would be violated.
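
In SI units this minimum is a tiny quantity. The sketch below computes it, along with the corresponding minimum energy dissipation at an assumed room temperature (the kT ln 2 of Landauer's principle, mentioned here only as a closely related standard result, not as a claim of the original text):

    import math

    k_B = 1.380649e-23    # Boltzmann's constant, J/K
    T_room = 300.0        # an assumed room temperature, K

    entropy_per_bit = k_B * math.log(2)         # Szilard's minimum, ~9.57e-24 J/K
    energy_per_bit = entropy_per_bit * T_room   # ~2.87e-21 J per bit at 300 K

    print(f"entropy per bit: {entropy_per_bit:.3e} J/K")
    print(f"minimum dissipation at {T_room:.0f} K: {energy_per_bit:.3e} J")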

These quantum-level processes are susceptible to noise. Stored information may contain errors. When information is retrieved, it is again susceptible to noise, which may garble the information content. In information science, noise is generally the enemy of information. But some noise is the friend of freedom, since it is the source of novelty, of creativity and invention, and of variation in the biological gene pool.

Biological systems have maintained and increased their invariant information content over billions of generations. Humans increase our knowledge of the external world, despite logical, mathematical, and physical uncertainty or indeterminacy. Both do it in the face of random noise, bringing order (or cosmos) out of chaos. Both do it with sophisticated error detection and correction schemes that limit the effects of chance.
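
A very small example of such a scheme is a threefold repetition code with majority voting. The bit stream, the noise level, and the number of copies below are hypothetical, and real biological proofreading is far more elaborate, but the sketch shows how redundancy limits the effect of random noise:

    import random

    def transmit(bit, error_rate=0.1):
        """Flip the bit with some probability, modeling random noise."""
        return bit ^ 1 if random.random() < error_rate else bit

    def send_with_repetition(bit, copies=3, error_rate=0.1):
        """Send several noisy copies and recover the bit by majority vote."""
        received = [transmit(bit, error_rate) for _ in range(copies)]
        return 1 if sum(received) > copies // 2 else 0

    message = [random.randint(0, 1) for _ in range(10_000)]
    raw_errors = sum(bit != transmit(bit) for bit in message)
    corrected_errors = sum(bit != send_with_repetition(bit) for bit in message)
    print(raw_errors, corrected_errors)   # the corrected error count is far smaller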

The scheme we use to correct human knowledge is science, a combination of freely invented theories and adequately determined experiments.
