[Fis] Shannon Entropy Redux

From: Michael Devereux <[email protected]>
Date: Thu 10 Jun 2004 - 08:34:09 CEST

Dear Loet, Aleks and colleagues,

It seems to me, Loet, that you are reinforcing my contention that the
Shannon information is only one of many meanings of the term
"information" in common usage among researchers. You recently wrote that
we ought to "distinguish sharply between Shannon-type information which
remains content-free (bits of information) and meaningful information
which is the result of an interaction between the Shannon-type
information and the system which provides the information with meaning."
I might even accuse you of a hint of reductionism in this argument,
which I think is the correct one.

I believe I understand better now what you mean by content-free
information. I've continued to maintain that Shannon had something
specific in mind by the information he modeled mathematically. One might
interpret the Shannon information as carrying the context, or "content,"
of uncertainty in the probability distribution of a physical system, the
same thing Boltzmann described. (Really, the information erases the
uncertainty, or entropy, in that distribution, as Jaynes stipulated.)
But I prefer your idea of content-free information as just the physical
information bits. (I wish, Aleks, that exclusively physical information
were my idea. It was Shannon's idea way back in the late 1940s.)
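To make that erasure point concrete, here is a minimal sketch of my own
(not Shannon's or Jaynes's notation): count the information gained from
a measurement as the drop in Shannon entropy from the prior distribution
to the posterior one.

    from math import log2

    def H(probs):
        """Shannon entropy in bits, skipping zero-probability terms."""
        return -sum(p * log2(p) for p in probs if p > 0)

    prior = [0.25, 0.25, 0.25, 0.25]  # four equally likely states: 2 bits of uncertainty
    posterior = [1.0, 0.0, 0.0, 0.0]  # a measurement pins the state down exactly

    info_gained = H(prior) - H(posterior)
    print(info_gained)  # 2.0 bits: the measurement erased all the uncertainty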

So, it's not just the mathematical form of Shannon's equation (von
Neumann's equation, Boltzmann's equation) which specifies what we mean
by Shannon entropy. An equation of that form might profitably be applied
in innumerable unrelated situations, and, in fact, seems to have been.
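For comparison, the three expressions share a single form (standard
modern notation, not quoted from the original papers):

    H = -K \sum_i p_i \log p_i               (Shannon)
    S = -k_B \sum_i p_i \ln p_i              (Boltzmann/Gibbs)
    S = -k_B \, \mathrm{Tr}(\rho \ln \rho)   (von Neumann)

where the p_i are state probabilities and \rho is the density matrix.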

By Shannon entropy we must mean what Shannon meant. (I take it, Loet,
you wouldn't want someone to name his suppositions Leydesdorff's theory
unless it actually agreed with your own work. The words we choose really
do matter, because they carry meaning and content, as you've explained.
If we call it Shannon entropy, then it ought to be exactly what Shannon
meant by his entropy.)

So, Shannon entropy is the uncertainty in the distribution of a physical
system, just what von Neumann, and Boltzmann before him, were also
modeling with that same equation. That is distinct from incorrectly
calling something Shannon entropy merely because an equation of that
form is used to analyze some system. If the uncertainty in the
distribution doesn't increase monotonically, for example, it cannot be
Shannon entropy that's being described.

Shannon derived the form of his equation uniquely from the three
postulates he lists in his article. We all understand, I suppose, that
the uncertainty in the distribution of a physical system (I emphasize
here, a PHYSICAL system) must increase monotonically with increasing N,
the number of equally likely states. That postulate is not just some
arbitrary mathematical axiom. Instead, Shannon understood that it
stipulates one of the essential traits of a PHYSICAL system with many
degrees of freedom. And, Shannon entropy is not a measure of just any
probabilistic model, thermodynamic or not, Aleks. Shannon's postulate of
monotonic increase depicts the probability distribution of a physical
system.
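A quick numerical check of that postulate, in a short sketch of my own
(base-2 logarithms, so the unit is bits): for N equally likely states
the entropy is log2 N, which grows monotonically with N.

    from math import log2

    def H(probs):
        """Shannon entropy in bits, skipping zero-probability terms."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # Shannon's monotonicity postulate: with p_i = 1/N for every i,
    # H = log2(N), which can only grow as N grows.
    for N in (2, 4, 8, 16):
        print(N, H([1.0 / N] * N))  # prints 1.0, 2.0, 3.0, 4.0 bits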

We know that Boltzmann also recognized exactly that characteristic. If
the phase space of some physical system increases, the entropy of that
system will increase too. It must do so as the number of degrees of
freedom provided by the expanding phase space grows. So Boltzmann taught
us.
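In Boltzmann's own terms (the standard relation, with W the number of
accessible microstates):

    S = k_B \ln W

so any growth in the accessible phase-space volume W forces the entropy
S to grow along with it.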

I suppose, Loet, that if one were able to reduce meaningful information
to its underlying Shannon information, as impractical and formidable as
that task might be, that Shannon information would be found to decrease
monotonically.

And, I understand your skepticism, Loet, about my identification of
Shannon entropy with Boltzmann�s entropy, since the form of the two
equations can differ by a constant. The author of your citation at
www.panspermia.org makes the same argument, Aleks. This is the constant
Shannon labeled lambda, such that his equation uniquely describes the
probability distribution of a physical system (electrical signals in a
wire), to within that constant.

Shannon was dismissive of the importance of this constant. "The
constant K (or lambda) merely amounts to a choice of a unit of measure,"
he wrote. Recall that the Shannon equation depicts the identical
physical property described by von Neumann and Boltzmann, so long as it
has the form of the Shannon equation to within a constant. That's why I
said Shannon entropy is identical with Boltzmann's entropy, Loet. Not
because the equations are identical in form, Aleks, but because the
physical property they model is exactly the same thing.

For such a description of a physical phenomenon, one may arbitrarily
select units of measure. That's what Boltzmann did with his constant k,
also. Boltzmann's entropy equation portrays a "pure" probability
distribution, too, just like Shannon's equation, until the constant k is
appended.
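One way to put the units point, under the identification I am arguing
for here (my paraphrase, not a quotation from Shannon): measuring the
same uncertainty in bits or in joules per kelvin differs only by a
constant factor,

    S = (k_B \ln 2) \, H_{bits}

so one bit of uncertainty corresponds to k_B ln 2, about 9.57 x 10^-24
J/K, of thermodynamic entropy.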

I realize that some of us here think of information in terms of the
mathematical form of the Shannon equation, and thermodynamic entropy as
a function of the heat and temperature of a physical object. So, I'd
like to recommend again, if it doesn't appear too self-serving,
Szilard's model engine. Seventy-five years ago Szilard made the direct
connection between information and the Clausius entropy of a heat
engine. His engine can only convert heat to work if the appropriate
measurement produces operative information. I suppose that's why I've
been so sure, all along, that Shannon entropy means thermodynamic
entropy.
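For a feel for the numbers in the standard single-molecule analysis of
the engine (a back-of-the-envelope sketch of my own; the 300 K figure is
just an assumed room temperature): the one bit gained by the measurement
lets the engine extract at most k_B T ln 2 of work per cycle.

    from math import log

    k_B = 1.380649e-23  # Boltzmann's constant, J/K
    T = 300.0           # assumed room temperature, K

    # Maximum work per cycle from one bit of measurement information.
    W_max = k_B * T * log(2)
    print(f"{W_max:.2e} J")  # ~2.87e-21 J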

The Szilard engine is conceptually extremely simple. And I've found it
extraordinarily effective for understanding the connection between
measurement, information, and thermodynamics. Some quantum mechanical
formalism is required for a proper analysis of the engine, which may
explain why Szilard's own analysis was mistaken back in 1929. But for
those interested in the powerful concepts explicated by his model, the
quantum mechanics can safely be ignored.

I appreciate this forum and the interesting comments that have prompted
me to refine my ideas and understanding.

Cordially,

Michael Devereux

_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis