[Fis] Physical Information is Shannon Entropy, Part V.

From: Michael Devereux <[email protected]>
Date: Mon 21 Jun 2004 - 00:34:38 CEST

Dear Loet, Aleks and colleagues,

Science is a very limited method of trying to understand anything. We
scientists deal exclusively with those physical things that are
observable and measurable. We only wish, Loet, that we could
scientifically evaluate the meaning conveyed by different forms of
communication. And, I think the most important issues we humans would
like to understand are entirely beyond the scope of science.

But science is uniquely precise and definitive about its laws and
conclusions because its subject matter and methods are so carefully
limited. One reason science is progressive, and need not continually
reevaluate Newton's Laws or Einstein's Special Theory (within the range
of their applicability), for example, is the exact, observable
definitions we give for those properties we investigate.

Notice that if it's not physical, observable entropy, then that thing is
not a subject for science. I don't know any of the physicists, Aleks,
whom you say wish to distinguish Clausius' empirical entropy, Boltzmann's
model entropy, and the Shannon logical entropy "which doesn't concern
itself with the empirical entropy." If these are not all the same
tangible, observable entropy, then physicists don't include it in our
theories and measurements.

When scientists talk about energy, for example, they mean precisely that
thing which is measured so carefully by the Bureau of Standards.
Likewise for the physical entropy that we scientists evaluate, which is
the subject of the Second Law of Thermodynamics. I think it's true,
Aleks, that we scientists would like to keep the term entropy (and
energy, and mass, and force, etc.) just for ourselves, so that it only
means something physical and measurable. But that won't happen, and
hasn't, in fact, happened. However, as scientists, we must always
remember the distinction.

You write, Aleks, "that information, ultimately is physical. But not
Shannon information, as it is understood by the grand majority of
practitioners." I certainly agree that information is physical, and so a
fit subject for science. And it may be perfectly correct that most
researchers mean something non-physical when they say Shannon
information. My continued argument is that we scientists mean only
observable, physical information when we're analyzing information
scientifically (or we've forgotten the distinction). And that Shannon
obviously meant physical information also, since that's exactly what he
wrote.

Aleks, if "nobody is questioning the connection between S and H when we
speak about physical models", then the question must be whether
Shannon's entropy, H, depicts a physical model. If information is
physical, as you say, then obviously, Shannon was describing a physical
thing, and Shannon's entropy, H, is the physical entropy, S.
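
To make that identification concrete (this is the standard
statistical-mechanics correspondence, not a quotation from Shannon): for
a system described by probabilities p_i over its microstates,

    S = -k Sum_i p_i ln(p_i)      and      H = -Sum_i p_i log2(p_i),

so S = (k ln 2) H, where k is Boltzmann's constant and the factor ln 2
merely converts bits into the thermodynamic units of S.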

Here's why it matters to scientists whether the Shannon information
described by Shannon (and by Shannon and Weaver) is physical
information. It's not just a choice of words to describe physical
information, but we also recognize that Shannon's subject, his
derivation, and his equation all described a physical system. So, we may
properly use his model, his methods and his equation to describe our own
physical systems.

And, that equation is, indeed, appropriate in scientific work.
Scientists have routinely and continuously used Shannon's formula, and
von Neumann's formula and Boltzmann's formula before that, to describe
physical systems. What may be called a probabilistic equation describes
a physical system, or else Boltzmann, von Neumann, and every scientist
in between were mistaken in so applying it. (They weren't.)
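
Since the argument leans on those three formulas, here they are side by
side (standard textbook forms, not quoted from the original papers):

    Boltzmann:      S = k ln(W)
    Gibbs/Shannon:  S = -k Sum_i p_i ln(p_i)   (Shannon's H = -Sum_i p_i log(p_i))
    von Neumann:    S = -k Tr( rho ln(rho) )

Boltzmann's expression is the special case of the Gibbs/Shannon sum with
all W microstates equally probable, and von Neumann's is the quantum
version, with the density matrix rho in place of the probabilities p_i.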

I think Jaynes's greatest contribution to a real understanding of the
connection between information and physical entropy is his Principle of
Maximum Entropy (reviewed by Grandy in his resource letter). In the
article you cite on image reconstruction, Jaynes defines H to be "the
maximum entropy per dot, H = log(W)/N ... (which) goes asymptotically
into the Shannon entropy -Sum_i (N_i/N) log(N_i/N)." It's reasonable, in
this context, to call H the maximum entropy because it becomes Shannon's
entropy in the limit. Boltzmann's H is not the maximum entropy, in
general, but rather the system entropy, which becomes maximum at
equilibrium. (That's what Boltzmann's H theorem tells us.) Thus, the
Sum p log(p) expression you mention is not an approximation to
thermodynamic entropy, but is actually thermodynamic entropy itself.
That is Boltzmann's H (and Shannon's H), and both are approximated (in
the limit) by the symbol H chosen by Jaynes here.
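
For anyone who wants to check Jaynes's asymptotic statement, the step is
just Stirling's approximation (my sketch of the computation, not
Jaynes's wording). With W = N!/(N_1! N_2! ... N_k!) ways of distributing
the N dots over k cells,

    log(W) = log(N!) - Sum_i log(N_i!)
           ~ [N log(N) - N] - Sum_i [N_i log(N_i) - N_i]    (Stirling, large N)
           = -Sum_i N_i log(N_i/N),

so log(W)/N goes over into -Sum_i (N_i/N) log(N_i/N), which is exactly
the Shannon form quoted above.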

The accepted distinction between physical entropy and probabilistic
entropy would solve the problem, I believe, Loet, if we also recognize
that Shannon's equation (sometimes called probabilistic) is entirely
appropriate for physical entropy as well as for the other analyses to
which non-scientists usefully apply it. (These other analyses, I assume,
Aleks, are the non-physical probabilistic models to which you refer.)

When we scientists use Shannon information in our research, Aleks, it
means, as Shannon and Weaver wrote, "the quantity which uniquely meets
the natural requirements that one sets up for 'information' (and) turns
out to be exactly that which is known in thermodynamics as entropy."
Though I can't point to an example now, I'm sure, of course, that some
scientist, at some time, has forgotten (or ignored) Shannon's
publication, and has misapplied the term "Shannon entropy".

You write, Aleks, that you've "seen no indication that Shannon would
mean what" I've said he means. Have you read the book by Shannon and
Weaver? They were scientists writing about a technical subject for an
audience of other sophisticated researchers. Authors, in such cases, try
to be precise and clear about what they mean. And, the publication, of
course, is designed to provide a permanent record of the author's ideas.

The English language has not changed so much in fifty-five years that
one can reasonably misinterpret the lucid statement "the quantity which
uniquely meets the natural requirements that one sets up for
'information' turns out to be exactly that which is known in
thermodynamics as entropy." And, I continue to mean that Shannon's
information is the same thing as thermodynamic entropy. One doesn't have
to be a "theologist or literary critic" to recognize that Shannon and
Weaver meant the same thing.

My thanks to the FIS forum for this opportunity to defend these
important ideas.

Cordially,

Michael Devereux
