RE: [Fis] Is it Shannon entropy?

From: <[email protected]>
Date: Mon 07 Jun 2004 - 12:08:16 CEST

Michael quoted me writing:
> Aleks wrote "we cannot infer ever-increasing entropy for
> Shannon's probabilistic model of communication, because this
> model has nothing to do with thermodynamical models." I
> quoted Shannon previously in this forum, Aleks. Shannon wrote
> that "Quantities of the form H = -Sum p_i log p_i play a central
> role in information theory as measures of information, choice
> and uncertainty. The form of H will be recognized as that of
> entropy as defined in certain formulations of statistical
> mechanics....
> H is then, for example, the H in Boltzmann's famous H
> theorem." (p. 393) Perhaps, we must all accept Shannon's own
> words as the authoritative and definitive resolution of this question.

I agree with your conclusion, but I note the precise words: "Quantities of
the *form*...the *form* of H...". The form, yes, but not the content. The
content is the underlying probabilistic model.

So Boltzmann's entropy may be Shannon's entropy, both in form and in
content, but Shannon's entropy shares with Boltzmann's entropy merely the
form. Shannon's entropy is a measure of *any* probabilistic model,
thermodynamic or not. I liked the readable elaboration on the difference at
http://www.panspermia.org/seconlaw.htm which distinguishes logical
entropy from thermodynamic entropy.
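To make the point concrete, here is a minimal sketch (the function name and
checks are my own, not from any source discussed here) of the quantity in
question: Shannon's H applies to any discrete probability distribution,
whether it describes gas molecules, coin flips, or letters in a text.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H = -sum(p * log p) of a discrete distribution.

    Terms with p == 0 contribute nothing, since p*log(p) -> 0 as p -> 0.
    With base=2 the result is in bits.
    """
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin: one bit of uncertainty, with no reference to any
# thermodynamic system at all.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

Nothing in the computation refers to physical states; the thermodynamic
reading only appears when the p's are probabilities of microstates.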

However, I do agree with your own thesis, Michael, that all information is
ultimately physical (given the present evidence, of course; I am not a
dogmatic believer in any model). Our computers computing logical entropy
back and forth are physical devices, and the mental concept of abstract
entropy only exists as an arrangement of electrons. If we represent it with
a number, that number too will be an arrangement of electrons. So even if
Shannon's entropy may seem like a more general abstract concept, it is
itself, as a concrete element of the world, subject to thermodynamic
entropy; no matter what model we decide to use for thermodynamic entropy,
Boltzmann's or some other.

Best regards,
                Aleks

--
mag. Aleks Jakulin
http://www.ailab.si/aleks/
Artificial Intelligence Laboratory, 
Faculty of Computer and Information Science, University of Ljubljana.
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis