[Fis] Physical Entropy

From: Michael Devereux <[email protected]>
Date: Tue 11 May 2004 - 07:42:15 CEST

Dear Michel, Loet, Guy, Bob, Stan, and colleagues,
Thanks for all the valuable comments. I expect I've not been
sufficiently precise about the equivalency I maintain exists between
thermodynamic entropy and Shannon's (informational) entropy. I mean by
Shannon's entropy just what Shannon meant in his original publication
(Bell Sys. Tech. J. 27, 3, 379-423, 1948), but nothing more than that.
He derived his formula, H = - Sum p log(p), as a measure of
"information, choice and uncertainty" (p. 393) in a system of electrical
pulses carrying information along a communications line such as a
telegraph cable. I�ve emphasized before that this is an application to a
physical (tangible, observable, real) system. And I find that Shannon
did actually name H entropy: "We shall call H = - Sum p log(p) the
entropy of the set of probabilities p1,...,pN" (p. 393).
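
Just to make the formula concrete, here is a small Python sketch of H
for an invented distribution of pulse levels (the probabilities are my
own illustration, not anything from Shannon's paper):

    import math

    def shannon_entropy(probs, base=2):
        """H = -sum p log(p); with log base 2 the result is in bits."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # Hypothetical distribution over four pulse levels on a line
    pulse_probs = [0.5, 0.25, 0.125, 0.125]
    print(shannon_entropy(pulse_probs))  # 1.75 bits per pulse
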
I realize that the mathematical formula, H, is not itself the property
of any physical system, but it represents such a property: the amount of
information carried by electrical pulses along a communications line.
Shannon wrote, "We wish to consider certain general properties involving
communications systems. To do this it is first necessary to represent
the various elements involved as mathematical entities, suitably
idealized from their physical counterparts" (p. 381). I understand this
to be the standard procedure for science. And I'll show, below, that the
information quantified by Shannon�s formula is the property of a
physical system that actually can be measured in the laboratory.
I agree with you, Michel, that "nobody would confuse the system with one
of its mathematical models." I suppose we all understand the distinction
between a physical object and its mathematical representation. I think
mathematics, in spite of its intimate connection to science, resides in
the realm of pure ideas, and not where measured, observable results may
be compared in the laboratory. I guess we could label that a part of
metaphysics, Guy, along with such things as morality, theology,
philosophy and such. I expect you'd agree with me that the validity of a
mathematical theorem isn't decided by a scientific measurement. Nor is
the value of a poem. Or the difference between good and bad.
So, I'm sure, Loet, that physics can't ultimately describe everything. I
expect you were thinking much the same thing, but suggesting that
physics does try to explain all of the natural world.
I believe I understand what you're saying, Michel, that "I consider that
the system and its model are two different entities. It is why, in some
practical situations dealing with entropy, I consider that the
informational entropy is able to modelize the physical entropy, but is not
the physical entropy."
I agree, Michel, that the mathematical formula H, models a property of a
physical system, and is not, itself, that property. But, I'm convinced
the same can be said for every mathematical description used by science;
every formula, including, for example, Clausius' version of thermodynamic
entropy, Delta Q / T. Clausius' formula is a mathematical idealization
that can be substantiated, for measured heat and temperature in the
laboratory, to within some observed precision.
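
A small numerical illustration of Clausius' formula (the numbers are
invented, chosen only to be round): a reversible transfer of 1000
joules of heat at 300 kelvin gives Delta S = 1000/300, about 3.3 J/K.
Dividing by k ln 2 (Boltzmann's constant times ln 2) re-expresses that
same quantity in bits; that conversion is my own addition here, not
Clausius', but it is the standard one:

    import math

    # Illustrative numbers, not a measurement of any particular system
    heat_joules = 1000.0
    temperature_kelvin = 300.0
    k_boltzmann = 1.380649e-23  # J/K

    delta_s = heat_joules / temperature_kelvin    # Clausius: Delta Q / T
    bits = delta_s / (k_boltzmann * math.log(2))  # same entropy in bits
    print(delta_s, bits)  # ~3.33 J/K, ~3.5e23 bits
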
The question, then, I hope all will agree, is not whether scientists use
mathematical models to idealize both thermodynamic entropy and the
amount of information that can be transmitted through a telegraph cable,
but whether the Shannon formula H describes (models, idealizes, etc.)
the identical physical property that is described by the formulas for
thermodynamic entropy.
For purposes of my argument, I mean by Shannon's entropy, exclusively,
the information carried by some physical system described by a
probability distribution. My reading of his paper indicates that this is
what Shannon meant also.
I accept, Bob, that the H formulation may also depict a "statistical
entropy" which might either increase or decrease with time. But I'm sure
that the information carried by a bounded, isolated, physical system,
and described by the formula H, never increases. That's one of Shannon's
conclusions for electrical signals on a telegraph cable, also. If the
cable is noisy, some information is lost between the sender and
receiver, but there's never more information delivered than was
originally sent.
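
One can see that in a toy calculation. The sketch below is my own
illustration (a binary source sent through a symmetric noisy channel
that flips each bit with probability eps, both numbers invented); the
mutual information received never exceeds the entropy H of what was
sent:

    import math

    def H(probs):
        """Shannon entropy in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def received_information(p1, eps):
        """I(X;Y) for a binary source with P(X=1)=p1 sent through a
        binary symmetric channel that flips each bit with prob. eps."""
        q1 = p1 * (1 - eps) + (1 - p1) * eps        # output distribution
        return H([q1, 1 - q1]) - H([eps, 1 - eps])  # H(Y) - H(Y|X)

    p1, eps = 0.3, 0.1
    print(H([p1, 1 - p1]))                 # ~0.88 bits sent per symbol
    print(received_information(p1, eps))   # ~0.46 bits received, <= sent
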
I also accept, Loet, that your decomposition algorithm appropriately and
usefully employs the H formula. But, I think it's applied to a different
system of reference, as you've written, and not to the amount of
information described by the probability distribution of a physical
system. As you say, the systems of reference determine the appropriate
level of theorizing.
We believe that thermodynamic entropy is physical, so I�ve emphasized
Landauer's argument that all information is physical also. Notice that
Shannon's derivation of H confirms Landauer's conclusion, explicitly
treating information as a physical property of the distribution of
electrical signals in a cable.
The bandwidth of an electrical cable can be determined from the measured
physical characteristics of the cable. From the cable�s geometry,
conductivity, capacitance, inductance, and such, one can calculate the
bandwidth, which determines the maximum rate of information transmission
through the cable. If one attempts to send information at a higher rate
than this maximum (information being calculated from Shannon's H formula
for the given distribution of electrical pulses sent), noise in the
cable diminishes the maximum rate of information received to this
bandwidth value. Thus, Shannon's information is, indeed, measurable.
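
If it helps, here is the standard capacity formula (the Shannon-Hartley
result for a band-limited channel with additive noise); the bandwidth
and signal-to-noise figures below are invented, just to show how
measured cable properties turn into a maximum information rate:

    import math

    def channel_capacity(bandwidth_hz, signal_to_noise):
        """Shannon-Hartley: maximum reliable rate in bits per second."""
        return bandwidth_hz * math.log2(1 + signal_to_noise)

    # Hypothetical cable: 3 kHz of bandwidth, S/N of 1000 (30 dB)
    print(channel_capacity(3000.0, 1000.0))  # ~29900 bits per second
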
And Jaynes (Phys. Rev. 106, 620-630, 1957) has shown that the
thermodynamic entropy of a physical system, described by some
probability distribution, is just that amount of information needed to
remove the uncertainty from that distribution. Of course, information is
calculated from Shannon's formula, and thermodynamic entropy from one of
the classical formulas of that discipline. Notably, Gibbs (Elementary
Principles in Statistical Mechanics, Ox Bow, Woodbridge, CT, USA, 1981,
first published in 1902) reached essentially the same result when he
minimized his "average index of probability of phase" subject to
constraints on average total energy.
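
To show that connection in miniature, here is a sketch of the
maximum-entropy construction for an invented three-level system (the
energy levels and the constrained average energy are mine, purely for
illustration): maximizing H at fixed average energy gives the
Boltzmann/Gibbs distribution, p_i proportional to exp(-beta E_i).

    import math

    def boltzmann(energies, beta):
        """Distribution maximizing H at a fixed average energy."""
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)
        return [w / z for w in weights]

    def solve_beta(energies, mean_energy, lo=-50.0, hi=50.0):
        """Bisect for the beta that reproduces the constrained average."""
        for _ in range(200):
            mid = (lo + hi) / 2.0
            avg = sum(p * e for p, e in
                      zip(boltzmann(energies, mid), energies))
            if avg > mean_energy:
                lo = mid   # average energy falls as beta rises
            else:
                hi = mid
        return (lo + hi) / 2.0

    def entropy_bits(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    energies = [0.0, 1.0, 2.0]         # hypothetical levels
    beta = solve_beta(energies, 0.6)   # constrain the average energy
    p = boltzmann(energies, beta)
    print(p, entropy_bits(p))          # Gibbs distribution and its H
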
So, that's my argument, and I'm sticking to it. At least until someone
can point to the errors in it. Thanks so much for all the stimulating ideas.
Cordially,
Michael Devereux
