RE: [Fis] Re: miscellanea / temperature / symmetry

From: Stanley N. Salthe <[email protected]>
Date: Sat 24 Apr 2004 - 22:47:55 CEST

Reacting to Loet, who said:

>Probabilistic entropy can be considered as a measure of the uncertainty in
>a distribution. Which label we attach to this uncertainty depends on the
>theoretical perspective that we use. For example, from an evolutionary
>perspective we can call it variation as opposed to selection; from a
>dynamic perspective, change as opposed to stability. Selection takes place
>at each moment in time; stabilization can only be evaluated over a time
>axis.
>
>Shannon's H --I am almost sorry for using it-- was defined for the
>measurement of the entropy at a specific moment in time. One can further
>derive from it a measure, sometimes indicated as I (e.g., Theil 1972),
>which measures the "dissipation":
>
>I = Sigma_i q(i) log2( q(i) / p(i) )
>
>In this formula q(i) represents the a posteriori probability
>distribution and p(i) the a priori one. I then measures the change
>in terms of bits of information. It can be shown that I >= 0. This
>accords with the second law for thermodynamic entropy, of which it
>can be considered a probabilistic (formal) equivalent.
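
(For concreteness, a minimal Python sketch of the measure as defined
above; the distributions are made-up examples, and the code simply
illustrates numerically that I >= 0:)

    import math

    def theil_I(q, p):
        # Theil's I: bits of information in the message that changes
        # the a priori distribution p into the a posteriori q.
        # Assumes normalized distributions with p(i) > 0 wherever q(i) > 0.
        return sum(qi * math.log2(qi / pi)
                   for qi, pi in zip(q, p) if qi > 0)

    p = [0.25, 0.25, 0.25, 0.25]  # a priori: uniform (made-up)
    q = [0.40, 0.30, 0.20, 0.10]  # a posteriori (made-up)

    print(theil_I(q, p))  # ~0.154 bits, consistent with I >= 0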

      SS: Is "/p(i)" really '/ Sigma p(i)'? Then this formula would be a
kind of 'efficiency'! In any case, phenomenologically, I must always
increase in expanding or growing systems, or in the environment of a
system learning about its environment. Isn't there some constraint on the
application of this formula?
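
(A quick numerical contrast of the two readings Stan distinguishes --
dividing by p(i) term by term versus dividing by Sigma p(i) -- again with
made-up, normalized distributions; a sketch of the arithmetic, not a claim
about Theil's intent:)

    import math

    q = [0.40, 0.30, 0.20, 0.10]  # a posteriori (made-up)
    p = [0.25, 0.25, 0.25, 0.25]  # a priori (made-up)

    # Reading 1: denominator is p(i), term by term, as in the formula above.
    I_pointwise = sum(qi * math.log2(qi / pi) for qi, pi in zip(q, p))

    # Reading 2: denominator is Sigma p(i); a normalized p sums to 1, so
    # this collapses to Sigma q(i) log2 q(i) = -H(q), which is <= 0.
    I_summed = sum(qi * math.log2(qi / sum(p)) for qi in q)

    print(I_pointwise)  # ~0.154, >= 0
    print(I_summed)     # ~-1.846, i.e. minus the Shannon entropy of q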

STAN
