RE: [Fis] Shannon said Information is Physical Entropy

From: Loet Leydesdorff <loet@leydesdorff.net>
Date: Thu 17 Jun 2004 - 09:55:13 CEST

Dear Michael and colleagues,
 
The terminology "Shannon entropy" itself may be part of the confusion.
It is perhaps better to name this "probabilistic entropy" or
Shannon-type information. However it is named, H remains the expected
information content of a distribution.
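 
For concreteness (an illustration of my own, not from Shannon's text):
for the distribution p = (1/2, 1/4, 1/4) one obtains

  H = - Sum_i p_i log2 p_i
    = - (1/2 log2 1/2 + 1/4 log2 1/4 + 1/4 log2 1/4)
    = 1/2 + 1/2 + 1/2 = 1.5 bits

A minimal sketch in Python (the function name is mine) makes the same
computation explicit:

  import math

  def probabilistic_entropy(p):
      # expected information content H of a discrete distribution, in bits
      return -sum(p_i * math.log2(p_i) for p_i in p if p_i > 0)

  print(probabilistic_entropy([0.5, 0.25, 0.25]))  # 1.5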
 
The Shannon formulas acquire meaning when one specifies
(theoretically) what substance is distributed (and redistributed when
the information is communicated). The mathematical theory of
communication can thus be paired with a substantive theory: what is
communicated when the system operates? One can expect analogies and
differences among entropic systems because the different types of
systems share the mathematical formalisms (including the Second Law),
but they differ in terms of the substantive theories. Thus, special
theories of communication can be developed at each systems level. It
seems to me that this is an extremely rich configuration.
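 
For example (again my own illustration): the shared formalism already
contains a Second Law analogue. For a uniform distribution over n
states,

  H = - Sum_i (1/n) log2 (1/n) = log2 n

which increases monotonically with n; each special theory must then
specify what it means, substantively, for the number of possible
states of the system to grow.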
 
I myself am most interested in systems that communicate meaning in
addition to information. This can be modeled as a system that
communicates through two channels at each moment in time. The two
channels are not hardwired (as they are in physical transmissions),
and one therefore expects interaction terms and feedbacks (e.g., in
human language). I have learned from this discussion with you that one
should not call the mathematical apparatus provided by the theory of
communication "Shannon entropy" because of the historical
connotation--but I never did so. I am happy with the term
"probabilistic entropy". But we should not confuse the historical
origins of these concepts with their epistemological status.
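 
To make the two-channel case concrete: the interaction between the two
channels can be measured as the transmission (mutual information)
T = H(x) + H(y) - H(x,y), which vanishes only when the channels
operate independently. A minimal sketch in Python, assuming a discrete
joint distribution (the table values are invented for the
illustration):

  import math

  def H(probs):
      # probabilistic entropy in bits; empty cells carry no information
      return -sum(p * math.log2(p) for p in probs if p > 0)

  # illustrative joint distribution over (channel 1, channel 2)
  joint = {("a", "x"): 0.4, ("a", "y"): 0.1,
           ("b", "x"): 0.1, ("b", "y"): 0.4}

  p1, p2 = {}, {}  # marginal distributions of the two channels
  for (x, y), p in joint.items():
      p1[x] = p1.get(x, 0.0) + p
      p2[y] = p2.get(y, 0.0) + p

  # transmission (mutual information) between the two channels
  T = H(p1.values()) + H(p2.values()) - H(joint.values())
  print(T)  # approx. 0.278 bits for this table

With independent channels T would be zero; the interaction terms and
feedbacks show up precisely as a non-vanishing T.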
 
Why does the distinction between thermodynamic (physical) and
probabilistic entropy not solve the issue?
 
With kind regards,
 
 
Loet
  _____

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
loet@leydesdorff.net; http://www.leydesdorff.net/

 
The Challenge of Scientometrics
<http://www.upublish.com/books/leydesdorff-sci.htm>;
The Self-Organization of the Knowledge-Based Society
<http://www.upublish.com/books/leydesdorff.htm>

> -----Original Message-----
> From: fis-bounces@listas.unizar.es
> [mailto:fis-bounces@listas.unizar.es] On Behalf Of Michael Devereux
> Sent: Thursday, June 17, 2004 6:26 AM
> To: FIS Mailing List
> Subject: [Fis] Shannon said Information is Physical Entropy
>
>
> Dear Werner, Loet and colleagues,
>
> I hope I don't seem merely argumentative. It's not my intention to
> keep saying the same thing over and over until no one else is willing
> to rebut my assertions. The ideas we're discussing are most
> significant to me because, among other things, they play a crucial
> role in calculations I'm completing on the entropy cost of
> information processing in Szilard's engine. I hope to be able to
> determine, definitively, the entropy cost of manipulating individual
> information bits.
>
> So, I'd like to sustain those claims I continue to believe are
> scientifically valid and truly important. I'm not looking for
> agreement with all my colleagues, but, rather, the good science of
> agreement with physical reality. And, I welcome the informed and
> considered criticism of everyone who disagrees.
>
> In that light, I think the most important question is whether Shannon
> entropy (as Shannon derived and understood it, not simply as that
> term may be used in each research investigation) is the same thing as
> physical (thermodynamic) entropy. I understand that many in this
> forum would answer this question with a no. Werner, you've written
> that Shannon entropy and Boltzmann's physical entropy are "not
> completely separated, but related."
>
> Shannon co-wrote a book with Warren Weaver a year after publication
> of his famous equation derivation. It's called "The Mathematical
> Theory of Communication" (University of Illinois Press, Urbana,
> 1949). Weaver writes that "The quantity which uniquely meets the
> natural requirements that one sets up for 'information' turns out to
> be exactly that which is known in thermodynamics as entropy....That
> information be measured by entropy, is, after all, natural when we
> remember that information, in communication theory, is associated
> with the amount of freedom of choice we have in constructing
> messages." (pp. 12-13). Weaver tells us that this information is
> modeled by Shannon's famous equation, H = - Sum p log (p) (page 14).
>
> I previously cited Grandy's resource letter (Am. J. Phys. 65, 6,
> 1997, p. 466). He tells us that the "rigorous connection of
> information theory to physics" is due to Jaynes (Phys. Rev. 106,
> 1957, p. 620), who showed that S (physical entropy) "measures the
> amount of information about the microstate conveyed by data on
> macroscopic thermodynamic variables," where S is the "experimental
> entropy of Clausius. Quantum mechanically one employs the density
> matrix rho and von Neumann's form of the entropy" (p. 469).
>
> I've continued to maintain in this forum Landauer's contention that
> information is physical (Landauer, Phys. Today, May 1991; Landauer,
> Phys. Lett. A 217, 1996, p. 188. Sorry, Aleks, I had thought these
> publications were decades earlier.) Other authors have made the same
> argument. And, if it's true that information is physical, then
> Shannon, obviously, was deriving an equation for something physical.
> I've previously quoted Shannon here in this forum: "We wish to
> consider certain general problems involving communication systems. To
> do this it is first necessary to represent the various elements
> involved as mathematical entities, suitably idealized from their
> physical counterparts."
>
> I know it's a repeat of a repeat, but I think the distinction between
> the thing being described (the thing itself, whether the energy of a
> physical particle or, say, the idiosyncratic behavior of some species
> of bird, or Shannon entropy) and the mathematical model of that thing
> is an essential distinction to maintain. As Michel has said, we all
> seem to be able to recognize that difference. Would you disagree,
> Werner? You write that Shannon entropy is truly pure mathematics,
> that "it is a mathematical expression which can be applied to many
> systems having probability distributions." (I emphasize the word
> "entropy", as opposed to "equation" or "mathematical expression".)
> Perhaps you don't permit a difference between Shannon's equation and
> Shannon entropy, Werner.
>
> As an analogy, energy is a measurable property of tangible objects,
> and is not the same thing as the mathematical formula which describes
> kinetic energy or electrical energy, etc. (We physicists cringe when
> "New-Age Astrologers", for example, predict the heightened energy of
> personal relationships that must result from the conjunction of Venus
> and Mars in Virgo. We don't permit the word energy with that meaning
> in science. I believe that restriction is entirely worthwhile within
> our own scientific disciplines because it fosters precise
> understanding of the concept, which can then be modeled
> mathematically.)
>
> I understand, Loet, that "Shannon's formula is a mathematical
> expression that is formally similar to the Boltzmann equation." But,
> as Shannon and Weaver, and Jaynes, and Landauer have shown us,
> Shannon entropy is the same physical thing as thermodynamic entropy.
> So, with Shannon's entropy, anyway, I don't agree that "the reference
> to a physical system is possible, but not necessary."
>
> And, if Shannon entropy is actually physical entropy, then it must
> obey the Second Law of Thermodynamics. If it's not increasing
> monotonically with increasing N, it's not really Shannon entropy, and
> it won't satisfy the monotonically increasing postulate that Shannon
> imposed in his derivation.
> Is it perhaps dogmatic, Loet, to insist that Shannon entropy is what
> Shannon said it was? I believe restriction to that meaning avoids
> confusion in our research. To my mind, "Shannon entropy" is a
> technical term, just like energy, or cell mitosis. And we promote
> understanding, rather than confusion, by calling something Shannon
> entropy only if it means what Shannon meant. That was my point in
> arguing that a Devereux theory, or a Leydesdorff theory, ought to be
> what you or I have stated our own theory to be. My feeling is that we
> ought to label it with a different name if it isn't actually the
> author's meaning we intend by that name.
>
> You've written, Werner, that "applying Shannon's formula to physical
> systems not necessarily gives thermodynamic entropy". I think that
> can be a confusing problem if one labels the property of that
> physical system Shannon entropy. Simply employing Shannon's formula
> on the system does not guarantee that the property being modeled is
> Shannon entropy. That's one of the arguments I was trying to defend
> in recent postings. You said "there are applications to physical
> systems....which lead to some entropy but not to the measurable
> thermodynamic entropy." And I suggest, as scientists, we never call
> that thing Shannon entropy.
>
> I'm grateful for the comments which prompt me to reconsider, and,
> hopefully, further understand these important concepts.
>
> Cordially,
>
> Michael Devereux
>
>
>
> _______________________________________________
> fis mailing list
> fis@listas.unizar.es
> http://webmail.unizar.es/mailman/listinfo/fis
>