[Fis] Only One Entropy

From: Michael Devereux <[email protected]>
Date: Mon 03 May 2004 - 07:53:01 CEST

Dear Michel and Colleagues,
I was very interested, several weeks ago, in calculating the entropy
associated with a certain distribution of energies among ideal gas
molecules. Some of those energies are degenerate, and it seemed, at that
time, that I must treat degenerate states as a single physical state,
with a single probability for all of them. You seem to be saying the
same, Michel. You wrote: "For these latter, logical symmetries
(degeneracy of states), and spatial symmetry (chambers) lead to decrease
the combinatorial enumeration of the microstates, and thus there is an
effect on the calculation of thermodynamical entropy."
But research at our library, when I did my calculation, indicated
otherwise. According to V. J. Emery, author of the article "Quantum
Statistical Mechanics" in the Encyclopedia of Physics: "The central
concept in dealing with incomplete information is the microscopic
definition of entropy or uncertainty introduced by von Neumann and by
Shannon: S = -k Sum_alpha [p_alpha log(p_alpha)]. ... Notice that in all
of these summations over alpha, every member of a set of degenerate
levels is to be included." (page 994) I followed Emery's rule in my
calculation. Does anyone dispute his prescription? What about you, Loet?
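To make Emery's prescription concrete, here is a small numerical sketch (the two-level system and its probabilities are invented for illustration, not taken from Emery). It compares counting every degenerate state separately, as Emery requires, against lumping a degenerate set into one state with one probability:

```python
import math

def shannon_entropy(probs, k=1.0):
    """S = -k * sum(p * log p), one term per state in the sum."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical system: one non-degenerate level with total probability
# 0.4, and one 3-fold degenerate level with total probability 0.6.
# Emery's rule: each degenerate member enters the sum separately,
# so the 0.6 splits equally among the 3 degenerate states.
per_state = [0.4] + [0.6 / 3] * 3
S_emery = shannon_entropy(per_state)

# The alternative (one probability for the whole degenerate set):
S_grouped = shannon_entropy([0.4, 0.6])

# The two prescriptions differ by k * p * ln(g) = 0.6 * ln 3,
# so the counting convention really does change the entropy.
print(S_emery, S_grouped, S_emery - S_grouped)
```

The difference k * p * ln(g) is exactly the term at stake in the degeneracy discussion: it vanishes only when every level is non-degenerate (g = 1).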
I'd also like to reiterate, and, hopefully, reinforce my previous
arguments that there is no difference between informational entropy and
the entropy used by scientists to describe any other physical system.
Though you've said thermodynamic entropy differs from informational
entropy, Michel, a review of the literature seems to indicate this is no
longer a distinction supported by scientific experts in this field.
I'll cite below some references to recent scientific work that, I think,
make the argument decisively. I've not been able to find any
contemporary research in the literature which disputes the idea that
informational entropy is actually physical (thermodynamic) entropy. Have
I failed to find those results somewhere?
First, I'd point to the really concise, copiously annotated, recent and
decisive account given of the relationship between entropy and
information written by W. T. Grandy, Jr. in Am. J. Phys. 65, 6, June
1997, p. 466. This is a Resource Letter designed to provide scientists
a very succinct, and comprehensive description of the history,
development, and scientific results about a particular subject. Grandy
lists more than 160 articles and books reaching as far back as
Boltzmann's seminal paper of 1877. It's a great source for the entire
history of the development of information theory and its relationship to
entropy.
Grandy says the "principal rigorous connection of information theory to
physics" (p. 468) is based on what's known as the principle of maximum
entropy (PME), developed by Edwin Jaynes shortly after Shannon did his
famous work. Grandy writes that "one can now safely relate the
theoretical (maximum) entropy to the fundamental entropy of Clausius.
Quantum mechanically, one employs the density matrix, rho, and von
Neumann's form of the entropy..." (p. 469). Grandy doesn't equivocate
about the equality of all these forms of entropy, does he?
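For readers less familiar with the quantum form Grandy refers to, von Neumann's entropy is S = -k Tr(rho ln rho), computed from the eigenvalues of the density matrix rho. A minimal sketch (the example matrix is my own illustration), showing that for a diagonal rho it reduces exactly to the Shannon/Gibbs sum:

```python
import numpy as np

def von_neumann_entropy(rho, k=1.0):
    """S = -k Tr(rho ln rho), evaluated via rho's eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerically-zero eigenvalues
    return -k * float(np.sum(evals * np.log(evals)))

# For a diagonal density matrix the eigenvalues are just the
# occupation probabilities, so this is the Shannon form exactly:
rho = np.diag([0.5, 0.25, 0.25])
S = von_neumann_entropy(rho)
print(S)  # -(0.5 ln 0.5 + 2 * 0.25 ln 0.25), about 1.0397 (k = 1)
```

This is the formal bridge between the quoted forms: the same functional of probabilities, whether those probabilities come from a classical distribution or from the spectrum of rho.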
I was first convinced that information is tangible, so that
informational entropy is actually physical entropy, when I read one of
Rolf Landauer's papers from 1991. I expect that many of us know Landauer
as the chief scientist at the IBM Watson laboratory in New York for more
than twenty years, and as the source of Landauer's Principle, which may
describe the entropy cost of erasing one bit of information.
He wrote a short and relatively simple article titled "Information Is
Physical" (Phys. Today, May 1991, p. 23) that argues, strangely enough,
that information is actually physical. I still accept his discussion and
conclusion. Does anyone dispute Landauer's arguments? If not, I believe
we must accept that the entropy of information is actually the entropy
of a physical system: the same entropy we calculate for the working
fluid in an engine, for example.
Shu-Kun, are you willing to weigh in with your view on this topic? I've
been reading some of your articles on the similarity principle, and
you've written that entropy in thermodynamics is a special kind of
dynamic entropy. Do you believe there is a difference between
thermodynamic and informational entropy?
Thanks, again, for this opportunity to express my understanding of
these ideas, and to hear others' views.
Cordially,
Michael Devereux

_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
Received on Mon May 3 07:55:42 2004

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET