Mikhail Volkenstein
(1912-1992)
Mikhail Vladimirovich Volkenstein was a leading Russian biophysicist. He was a specialist in the physical chemistry of macromolecules and polymers, and he founded the Leningrad school of polymer science.
His early work was on molecular spectroscopy, especially the vibrational band spectra of molecules (as was the early work of your Information Philosopher).
In the 1970s, Volkenstein started to explore the question of the origin of life, setting out to investigate the complex connections between entropy, information, and life.
The Soviet Union allowed him to travel to Berlin to attend conferences on irreversible processes and self-organization. There he met Ilya Prigogine, who won the 1977 Nobel Prize in Chemistry for his investigations into the irreversibility of processes in complex physical systems far from equilibrium.
Prigogine developed non-equilibrium thermodynamics, particularly the theory of what he called "dissipative structures." These are physical or chemical systems in "far from equilibrium" conditions that appear to develop "order out of chaos" and look to be "self-organizing." Like biological systems, dissipative structures have matter and energy of low entropy flowing through them; it is primarily the energy and negative entropy that are being "dissipated." Prigogine's work on the mathematical non-linearity of non-equilibrium thermodynamics is the basis of much of "chaos theory" and "complexity theory."
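Prigogine's entropy bookkeeping for such open systems can be sketched in standard textbook notation (not quoted from Volkenstein's book). The entropy change of an open system splits into an internal production term and an exchange term with the surroundings:

$$ dS = d_i S + d_e S, \qquad d_i S \ge 0, $$

where the second law constrains only the internal production \(d_i S\). A dissipative structure can hold its own entropy steady or even decrease it (\(dS \le 0\)) provided it exports enough entropy to its surroundings, \(d_e S < 0\) with \(|d_e S| \ge d_i S\). This is the precise sense in which its order is bought by "dissipation."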
Volkenstein greatly admired the work of Manfred Eigen, who developed the idea of "hypercycles." A hypercycle resembles the well-known citric acid cycle, in which each step is catalyzed by an enzyme; because the cycle as a whole catalyzes its own reproduction, it can be described as "autocatalytic." Autocatalysis is a popular idea among many theorists on the origin of life.
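In the standard Eigen-Schuster formulation (a common textbook form, not a formula from Volkenstein's book), each member \(x_i\) of an \(n\)-membered hypercycle replicates at a rate boosted by its cyclic predecessor:

$$ \dot{x}_i = x_i \,(k_i x_{i-1} - \Phi), \qquad \Phi = \sum_{j=1}^{n} k_j x_j x_{j-1}, \qquad x_0 \equiv x_n, $$

where the mean flux \(\Phi\) holds the total concentration \(\sum_i x_i = 1\) fixed. The closed loop of mutual catalysis is what makes the cycle autocatalytic as a whole.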
Surprisingly, many scientists, even specialists in thermodynamics and statistical mechanics, are ambivalent about the concept of entropy and its significance. Not so surprisingly, generalists in physics, chemistry, and biology are also unclear about the role of entropy, and especially about the connection between entropy and so-called "negative entropy," which is intimately connected to information.
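The connection can be stated compactly with two standard identities, Boltzmann's and Shannon's (textbook formulas, not quotations from Volkenstein):

$$ S = k_B \ln W, \qquad H = -\sum_i p_i \log_2 p_i. $$

Boltzmann's entropy \(S\) counts the microstates \(W\) compatible with a macrostate; Shannon's \(H\) measures the information, in bits, needed to single one microstate out. On this reading, acquiring one bit of information corresponds to an entropy decrease of \(k_B \ln 2\), roughly \(10^{-23}\) J/K, which is why information is often identified with "negative entropy."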
Volkenstein's 1986 book, Entropy and Information, would have been better entitled Entropy, Information, and Life; it might then have received more attention from biologists. But even this book cannot explain the deep and complicated connections between the creation of information in the universe and physical entropy, let alone its dependence on quantum mechanics.
Many philosophers of science are suspicious of
interpretations of quantum mechanics that imply the universe is
indeterministic, that
chance is real. A large majority of philosophers believe that human
free will is
compatible with physical
determinism. Like
Albert Einstein and many other great physicists, they hope that underlying "hidden variables" will be discovered that will restore classical deterministic laws of nature.
In the preface to his book, Volkenstein was very optimistic that the role of entropy in biology could be made clear, calling the idea "not all that complicated."
"This is just... entropy," he said, thinking that this explained everything, and he repeated the strange word a few times.
Karel Capek, “Krakatit” 1
This “strange word” denotes one of the most basic quantities of the physics of heat phenomena, that is, of thermodynamics. Although the concept of entropy did indeed originate in thermodynamics, it later became clear that it was a more universal concept, of fundamental significance for chemistry and biology, as well as physics.
Although the concept of energy is usually considered more important and easier to grasp, it turns out, as we shall see, that the idea of entropy is just as substantial—and moreover not all that complicated. We can compute or measure the quantity of energy contained in this sheet of paper, and the same is true of its entropy. Furthermore, entropy has remarkable properties. Our galaxy, the solar system, and the biosphere all take their being from entropy, as a result of its transference to the surrounding medium. There is a surprising connection between entropy and information, that is, the total intelligence communicated by a message. All of this is expounded in the present book, thereby conveying information to the reader and decreasing his entropy; but it is up to the reader to decide how valuable this information might be.
The second half of the 20th century is notable for the creation and development of complex areas of science of the greatest importance not only for the natural sciences and technology, but also for the humanities. Such are cybernetics, information theory, and synergetics. Although these terms did not exist fifty years ago 2, they now turn up constantly. In all three of these disciplines the concepts of entropy and information are absolutely indispensable, so that without them it is not possible to grasp the true essence of modern science. The final chapters of the
book contain brief, and of necessity incomplete, expositions of synergetics and information theory. The aim of the present account is to bring these new disciplines to the reader’s attention, and introduce him or her to the circle of related ideas.
1. Karel Capek (1890-1938), Czech playwright and novelist, inventor of the word “robot” in its present sense in his play R.U.R.
2. This was written in 1986; information theory and cybernetics originated in the 1940s, around the time of World War II.
Entropy and Information, pp. 1-2
References
Volkenstein, M. V., Entropy and Information (1986; English translation, Basel: Birkhäuser, 2009)