RE: [Fis] Shannon said Information is Physical Entropy

From: <[email protected]>
Date: Thu 17 Jun 2004 - 10:08:58 CEST

Michael,

I'll be brief.

1. I have seen no indication that Shannon meant what you say he meant. I am
neither a theologian nor a literary critic, so I am not qualified to debate
what someone once meant.

2. Jaynes was fully aware that the [-p log p] expression is an
*approximation* to thermodynamic entropy, first through Stirling's
approximation, and second through Boltzmann's S <-> H connection. See the
`Monkeys, Kangaroos and N' paper, available online at
http://bayes.wustl.edu/etj/node1.html
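
For readers who want the Stirling step spelled out (a standard derivation,
not Jaynes's wording): with N objects distributed over bins with counts
n_i = N p_i, the multinomial multiplicity W gives

```latex
W = \frac{N!}{\prod_i n_i!}, \qquad
\ln W \approx (N \ln N - N) - \sum_i \left( n_i \ln n_i - n_i \right)
       = -N \sum_i p_i \ln p_i = N\,H(p),
```

using Stirling's ln n! ~ n ln n - n. So [-p log p] is only the leading term
of the log-multiplicity per object; Boltzmann's S = k ln W then gives
S ~ k N H.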

3. Nobody is questioning the connection between S and H when we speak about
physical models. However, it seems that the consensus is that there can be
probabilistic models that are not physical.

4. I agree that information, ultimately, is physical. But *not* Shannon
information as it is understood by the great majority of practitioners.

What to do? Physicists often express the desire to distinguish Clausius
"empirical" entropy, Boltzmann "model" entropy, which attempts to approximate
the empirical entropy with a probabilistic model, and Shannon "logical"
entropy, which is based on a probabilistic model that does not concern itself
with the empirical entropy. Physicists also want to keep "entropy" solely for
themselves. Perhaps we should distinguish the three kinds of information
too?

In these terms, Loet and I are employing "logical" information, while you are
working with "model" information. But all of us are using probabilistic
models.

As for concluding thoughts:

I see symmetry as a particular way of measuring uncertainty or variation,
one that can be viewed as an alternative to the multiplicity of ways
(Boltzmann's model) or to the [-p log p] expression (Shannon's model).
Effectively, the greater the symmetry, the greater the entropy. For that
reason I recommended the more general concept of utility functions at the
beginning of the discussion. It remains to be seen what the relationship is
between these measures of uncertainty and the teleology of the second law,
but the presence of several possibilities implies that both Boltzmann's H
and symmetry could be seen as competing or complementary models of Clausius
entropy.
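
To make the approximation of point 2 concrete, here is a small numerical
sketch (my own illustration, not part of the argument above): it compares the
exact log-multiplicity ln W with its Stirling approximation N*H(p), using
hypothetical counts [30, 50, 20].

```python
import math

def log_multiplicity(counts):
    """Exact ln W for the multinomial multiplicity W = N! / prod(n_i!)."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)

def shannon_term(counts):
    """N * H(p) in nats, with p_i = n_i / N: the Stirling approximation to ln W."""
    n = sum(counts)
    return -sum(c * math.log(c / n) for c in counts if c > 0)

counts = [30, 50, 20]  # hypothetical bin counts, N = 100
exact = log_multiplicity(counts)
approx = shannon_term(counts)
# The gap between the two is the Stirling correction, which grows only
# logarithmically in N, so the relative error vanishes as N -> infinity.
print(exact, approx)
```

Even at N = 100 the two values differ by a few nats, which is exactly the
sense in which [-p log p] is an *approximation* to the model entropy rather
than an identity.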

Aleks

--
mag. Aleks Jakulin
http://www.ailab.si/aleks/
Artificial Intelligence Laboratory, 
Faculty of Computer and Information Science, University of Ljubljana.
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
Received on Thu Jun 17 10:12:41 2004