[Fis] Is Shannon's Entropy of General Applicability?

From: Michael Devereux <[email protected]>
Date: Tue 20 Apr 2004 - 08:20:20 CEST

Dear Michel and colleagues,

Thanks for your comments. I think I may understand, as Loet wrote, that
it doesn't make much sense to discuss exchanges of words, like truth or
power, in terms of the thermodynamics of these exchanges. And certainly
not, if one considers thermodynamics, just as its name implies, to be
all about heat flow.
The idea that impressed me most while studying thermodynamics as a
student was disorder: its relationship to the possibly unique direction
of evolution of the natural world, and the characterization of the term
entropy as a measure of disorder. That characterization is usual in
elementary physics textbooks on thermodynamics. Of course, one also
reads that entropy may be determined, in certain situations, by
Clausius' formulation, Delta S = Delta Q / T.
There does seem to be a marked distinction between the use of the
entropy concept to engineer heat engines, or refrigerators, or even,
say, to describe the evolution of a star; and its use to predict the
correspondence between the pattern of dots and dashes sent by a
telegraph operator and the noisy, deteriorated pattern received at the
other end of the telegraph wire. Unless, that is, one considers all
these phenomena within the context of disorder or uncertainty.
I agree with you, Michel, about some of your remarks, and disagree about
others. I don't agree that "Shannon's formula is derived from
probability theory, outside any physical field of application...."
Shannon was considering an application to discrete communication systems
and the correlation (largely positive) between the binary signal of dots
and dashes sent by a telegraph operator and those dots and dashes
received at the other end of the line. (Shannon, Bell Sys. Tech. J. 27,
3, 1948, p. 382 ff.) He derived his famous mathematical formula for that
particular physical application.
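
To make that setting concrete, here is a minimal Python sketch (my own
illustration, not anything from Shannon's paper; the flip probability of
0.1 is an arbitrary choice) of a noisy binary channel, showing that the
received sequence stays strongly, though not perfectly, correlated with
the one sent:

    import random

    def transmit(bits, flip_prob=0.1, seed=2004):
        # Binary symmetric channel: each bit is flipped with probability flip_prob.
        rng = random.Random(seed)
        return [b ^ (rng.random() < flip_prob) for b in bits]

    source = random.Random(1)
    sent = [source.randint(0, 1) for _ in range(10000)]
    received = transmit(sent)
    agreement = sum(s == r for s, r in zip(sent, received)) / len(sent)
    print(f"fraction of symbols received intact: {agreement:.3f}")  # roughly 0.9
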
You also wrote that "Shannon's entropy does not generalize
thermodynamical entropy. They may be related in a limited number of
situations. One is of mathematical nature, the other is of physical
nature." By Shannon's entropy we mean the uncertainty of some system
which can be described (with some degree of accuracy) by the formula
H = -Sum_i p_i log(p_i).
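
As a small numerical illustration (my own sketch; the example
distributions are arbitrary), H can be computed directly for any
discrete distribution, and it behaves exactly as an uncertainty measure
should:

    import math

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)), in bits; zero-probability outcomes contribute nothing.
        # The "+ 0.0" normalizes an IEEE -0.0 result to 0.0.
        return -sum(p * math.log2(p) for p in probs if p > 0) + 0.0

    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin, maximal uncertainty
    print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable
    print(shannon_entropy([1.0]))       # 0.0 bits: a certain outcome, no uncertainty
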
I'm sure you're aware that this mathematical formula, H, can be traced
back, historically, at least as far as Boltzmann's renowned H theorem.
That dates to the nineteenth century, decades before Shannon was born.
Unquestionably, I would say, Boltzmann was using H to explain, and to
generalize, thermodynamic entropy. There was, after all, no information
theory then. We chemists, engineers and physicists claimed ownership of
the H formulation long before anyone else.
Of course science incorporates mathematics, as well as observation, into
its methods of inquiry. H is a mathematical formula, of a mathematical
nature, as you say, and it was used by Boltzmann to describe the entropy
(a state function) of physical systems, such as heat engines and other
tangible material devices.
Shannon, himself, acknowledged that the formula, when he derived it for
discrete communication systems, was identical with that entropy formula
already in use by the physical sciences: "The form of H will be
recognized as that of entropy as defined in certain formulations of
statistical mechanics... H is then, for example, the H in Boltzmann's
famous H theorem." (Shannon, Bell Sys. Tech. J. 27, 3, 1948, p. 393)
I'd like to reiterate the entirely general scope of the H formulation to
ALL natural phenomena. We often call it the von Neumann-Shannon formula
because it had already been derived by von Neumann in 1932, sixteen
years before Shannon's publication (Mathematische Grundlagen der
Quantenmechanik).
Von Neumann's formula describes the connection between the mathematical
model of quantum mechanics and what humans observe of the physical
world. That's all of the physical world, as far as one can determine,
because quantum mechanics seems to describe everything. And the
quantum-mechanical explanation is entirely probabilistic; that's all
there is, as Feynman said. So, von Neumann's formula correlates the
probable result for an ensemble of measurements with the wave function
that models a physical system.
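
For the curious, a brief sketch of how that formula is evaluated (my own
illustration; the two density matrices below are arbitrary examples):
von Neumann's entropy, S = -Tr(rho ln rho), is just the H formula
applied to the eigenvalues of the density matrix rho.

    import numpy as np

    def von_neumann_entropy(rho):
        # S(rho) = -Tr(rho ln rho), evaluated from the eigenvalues of rho.
        eigvals = np.linalg.eigvalsh(rho)    # rho is Hermitian
        eigvals = eigvals[eigvals > 1e-12]   # 0 * ln(0) -> 0 by convention
        return float(-np.sum(eigvals * np.log(eigvals))) + 0.0

    pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # a pure state: entropy 0
    mixed = np.array([[0.5, 0.0], [0.0, 0.5]])  # maximally mixed qubit: entropy ln 2
    print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
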
I agree with you, Michel, that the original thermodynamic definition of
entropy was introduced, historically, in terms of heat, and not
probability. But the founders of thermodynamics, including Boltzmann and
Gibbs, recognized, almost immediately, that the more general physical
principle was probabilistic. Liouville's theorem, as applied to
Boltzmann's phase-space conception of entropy, is a theorem of
probabilities.
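
For reference, the statement itself, in the usual phase-space notation
with rho(q, p, t) the probability density: Liouville's theorem says rho
is conserved along the system's trajectories,

    d(rho)/dt = partial(rho)/partial(t)
                + Sum_i [ (dq_i/dt) partial(rho)/partial(q_i)
                        + (dp_i/dt) partial(rho)/partial(p_i) ] = 0,

which is a statement about how probability flows in phase space, not
about heat.
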
You ask, Michel, "In science, is there any example of calculation
without input data? So, the precision of the model is judged from the
departure of predicted from observed, the predicted also depending on
the precision of the input data, which are measured data?" I know of
no example where observed values can be calculated without input
parameters from observational data, or perhaps, a guess at those input
parameters.
But, not all mathematical models incorporated into established
scientific theory, or scientific law, are judged empirically. I realize
that science is unique, among modes of inquiry, in its dependence on
observation. But, it also depends upon mathematics, whose truths, I take
it, are what Immanuel Kant called apodictic. For example, 1 + 1 = 2, not
because it says so in an arithmetic primer, but, rather, because our
mind declares those concepts must be true. No experiment need be performed.
We know that Newton's law, F = m a, describes nature quite precisely
only because it has been observed to do so. Likewise for Einstein's
famous equation, E = m c^2, and nearly all the other established laws of
science. But not all. For example, we call it a law of hydrodynamics
(fluid dynamics) that flux is conserved, also known as the equation of
continuity. This law states that, over some closed surface, the net rate of
flow of a fluid in or out of that surface must equal the rate at which
fluid is created or destroyed within the volume enclosed by that
surface. The equation of continuity is pure reason, and no observations
need confirm it.
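
Written out, with rho the fluid density, v the velocity field, and q the
rate of creation of fluid per unit volume (q = 0 where no fluid is
created or destroyed), the law reads

    partial(rho)/partial(t) + div(rho v) = q.

The integral form over a closed surface follows from the differential
form by the divergence theorem alone, which is the sense in which the
law is bookkeeping rather than experiment.
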
Another example is Gauss' law for electrodynamics. We say that if a
static electric field varies as the inverse square of distance from its
source, then the electric field flux (the integral of the electric field
vector dotted into the surface area element) over any closed surface is
equal to the enclosed electric charge divided by the physical constant
epsilon_0. Gauss' law requires no experimental confirmation. It's exact.
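
In symbols, with epsilon_0 the permittivity of free space and Q the
total charge enclosed by any closed surface S:

    Integral over S of (E . dA) = Q / epsilon_0.

Given the inverse-square field of a point charge, this result follows
from the divergence theorem, with no further experiment required.
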
I understand that these laws are just mathematics borrowed by science.
But that's how science works. Science's methods are not exclusively
observational.
If you think I'm hinting that the Second Law of Thermodynamics (which
describes how the entropy of a physical system can change) may depend on
the mathematics of probability, you're right.
Thanks to everyone for the comments.
Cordially,
Michael Devereux dbar_x@cybermesa.com
