RE: [Fis] Only One Entropy

From: Loet Leydesdorff <[email protected]>
Date: Mon 03 May 2004 - 08:39:19 CEST

Dear Michael,
 
Thus, you provoke me to burn up one of my two options on Monday!
 
> According to V. J. Emery, author of the article "Quantum Statistical
> Mechanics" in the Encyclopedia of Physics, "The central concept in
> dealing with incomplete information is the microscopic definition of
> entropy or uncertainty introduced by von Neumann and by Shannon:
> S = -k Sum_alpha [p_alpha log(p_alpha)]. ... Notice that in all of
> these summations over alpha, every member of a set of degenerate
> levels is to be included." (page 994) I followed Emery's rule in my
> calculation. Does anyone dispute his prescription? What about you,
> Loet? I'd also like to reiterate, and, hopefully, reinforce my
> previous arguments that there is no difference between informational
> entropy and the entropy used by scientists to describe any other
> physical system.

I agree that S = k * H. (Actually, I gave the derivation in a previous
email.)
 
S is the thermodynamic entropy and H the probabilistic (Shannon)
entropy. But I don't follow your conclusion that there is consequently
only one entropy. k is a constant, but it is not dimensionless: its
dimensionality (joule/kelvin) is what provides physical meaning to the
statistical apparatus of H. H is a generalized measure of dividedness,
while S is a physical quantity.
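To make the dimensional point concrete, here is a minimal sketch in
Python (my illustration, not part of the original exchange; the
distribution is arbitrary, and H is taken in nats so that S = k * H
holds with the natural logarithm):

import math

K_B = 1.380649e-23  # Boltzmann constant, in joule/kelvin

def shannon_entropy(p):
    """Dimensionless probabilistic entropy H = -sum_i p_i ln(p_i), in nats."""
    return -sum(p_i * math.log(p_i) for p_i in p if p_i > 0)

p = [0.5, 0.25, 0.25]   # hypothetical probability distribution
H = shannon_entropy(p)  # a pure number
S = K_B * H             # only the multiplication by k gives units of J/K

print(f"H = {H:.4f} (dimensionless); S = k*H = {S:.3e} J/K")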
 
For example, in statistical decomposition analysis one can use H as a
measure for the segregation in schools, classes, neighbourhoods, etc.
The decomposition algorithm can be derived from the "microscopic"
formula as:
 
H = H(o) + Sigma(g) P(g) H(g)
 
H(o) is the between-group uncertainty, while H(g) is the uncertainty
which prevails within each of the compartments; P(g) weights these
within-group uncertainties. The relative weight of the between-group
uncertainty H(o) can teach us something about the dynamics of the
system: is it an aggregate of lower-level groupings, or are interaction
terms important? This mathematical apparatus, applied to the segregation
of social systems, no longer has a physical interpretation in terms of S,
because this H cannot meaningfully be multiplied by k. The system of
reference is different.
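As an illustration of this decomposition (my own sketch, with
hypothetical counts of a two-category attribute of pupils across three
schools; all names and numbers are invented for the example):

import math

def H(probabilities):
    """Probabilistic (Shannon) entropy in bits; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical counts: rows are groups g (schools), columns are the two categories.
counts = {
    "school_A": [90, 10],
    "school_B": [30, 70],
    "school_C": [50, 50],
}

total = sum(sum(row) for row in counts.values())
pooled = [sum(col) / total for col in zip(*counts.values())]  # overall category shares

H_total = H(pooled)                                      # total uncertainty H
H_within = sum(
    (sum(row) / total) * H([n / sum(row) for n in row])  # P(g) * H(g)
    for row in counts.values()
)
H_between = H_total - H_within                           # H(o), the between-group term

print(f"H = {H_total:.3f} bits; within = {H_within:.3f}; between H(o) = {H_between:.3f}")
print(f"relative weight of H(o): {H_between / H_total:.1%}")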
 
Each system of reference provides us with another substance that is
distributed. In physics, momenta and energies are distributed and
communicated (either conservatively or dissipatively). In other systems
something else is distributed. The specification of this hypothesis
spans a specific system of communication. For example, when atoms are
distributed and redistributed a chemistry is generated. When molecules
are distributed and redistributed one deals with a biology. My interest
is particularly in systems which distribute and redistribute symbolic
media of communication (meaning). These systems cannot be easily
observed, but we can measure the probabilistic entropy when such systems
operate.
 
For example, money can be considered a symbolically generalized medium
of communication. The flow of money in an economy can be measured using
entropy measures. In this case, one can also observe the communication,
but in other cases the (geometrical) observation is sometimes more
difficult than the (algorithmic) measurement. For example, on the
Internet one can measure the flows of communication in terms of their
expected information content.
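A minimal sketch of such an algorithmic measurement (my illustration;
the site names and traffic counts are invented). Note that the
resulting H is again dimensionless and admits no multiplication by k:

import math
from collections import Counter

# Hypothetical observed flows (e.g., messages or hyperlinks to a few destinations).
flows = Counter({"site_A": 120, "site_B": 45, "site_C": 30, "site_D": 5})
total = sum(flows.values())

# Expected information content of the flow distribution, in bits.
H_flows = -sum((n / total) * math.log2(n / total) for n in flows.values())
print(f"expected information content of the flows: {H_flows:.3f} bits")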
 
In summary, I seem not to agree with Emery, despite this author's status
in physics. The argument is not convincing.
 
With kind regards,
 
 
Loet
  _____

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20-525 6598; fax: +31-20-525 3681
loet@leydesdorff.net; http://www.leydesdorff.net/

 
The Challenge of Scientometrics: http://www.upublish.com/books/leydesdorff-sci.htm
The Self-Organization of the Knowledge-Based Society: http://www.upublish.com/books/leydesdorff.htm
 
> Though you've said thermodynamic entropy differs from informational
> entropy, Michel, a review of the literature seems to indicate this is
> no longer a distinction supported by scientific experts in this field.
> I'll cite below some references to recent scientific work that, I
> think, make the argument decisively. I've not been able to find any
> contemporary research from the literature which disputes the idea that
> informational entropy is actually physical (thermodynamic) entropy.
> Have I failed to find those results somewhere?
> First, I'd point to the really concise, copiously annotated, recent,
> and decisive account of the relationship between entropy and
> information written by W. T. Grandy, Jr. in Am. J. Phys. 65, 6, June
> 1997, p. 466. This is a Resource Letter designed to provide scientists
> a very succinct and comprehensive description of the history,
> development, and scientific results about a particular subject. Grandy
> lists more than 160 articles and books reaching as far back as
> Boltzmann's seminal paper of 1877. It's a great source for the entire
> history of the development of information theory and its relationship
> to entropy.
> Grandy says the principal rigorous connection of information theory to
> physics (p. 468) is based on what's known as the principle of maximum
> entropy (PME), developed by Edwin Jaynes shortly after Shannon did his
> famous work. Grandy writes that one can now safely "relate the
> theoretical (maximum) entropy to the fundamental entropy of Clausius.
> Quantum mechanically, one employs the density matrix, rho, and von
> Neumann's form of the entropy..." (p. 469). Grandy doesn't equivocate
> about the equality of all these forms of entropy, does he?
> I was first convinced that information is tangible, so that
> informational entropy is actually physical entropy, when I read one of
> Rolf Landauer's papers from 1991. I expect that many of us know
> Landauer as the chief scientist at the IBM Watson laboratory in New
> York for more than twenty years, and source of Landauer's Principle,
> which may describe the entropy cost of erasing one information bit.
> He wrote a short and relatively simple article titled "Information Is
> Physical" (Phys. Today, May 1991, p. 23) that argues, strangely
> enough, that information is actually physical. I still accept his
> discussion and conclusion. Does anyone dispute Landauer's arguments?
> If not, I believe we must accept that the entropy of information is
> actually the entropy of a physical system. The same entropy we
> calculate for the operating fluid in an engine, for example.
> Shu-Kun, are you willing to weigh in with your view on this topic?
> I've been reading some of your articles on the similarity principle,
> and you've written that entropy in thermodynamics is a special kind of
> dynamic entropy. Do you believe there is a difference between
> thermodynamic and informational entropy?
> Thanks, again, for this opportunity to express my understanding about
> these ideas, and to hear others' views.
> Cordially,
> Michael Devereux