Re: [Fis] Is it Shannon entropy?

From: Prof.Dr.Werner Ebeling <[email protected]>
Date: Tue 08 Jun 2004 - 12:12:36 CEST

Dear FISers,
just a little comment from the point of view of a physicist who has been working
with physical entropy for about 40 years. This comment is meant
to avoid further confusion.
Physical entropy and Shannon entropy (or information) are closely related,
but they are identical only in special cases, namely when the space (the
probabilities) to which Shannon's formula is applied is the phase space of
physics (or the state space in quantum mechanics). The factor N is irrelevant;
it just says whether we prefer to define the entropy per particle or the
entropy of the system of N particles. The factor k_B merely fixes the unit.
However, it is a very important constant of nature, connecting two branches
of science: probabilities for microscopic states and macroscopic measurable
quantities.
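
For concreteness, a sketch of the connection in the usual Gibbs/Boltzmann form
(p_i denotes the probability of microstate i; reading H as a per-particle
entropy in the second line is an assumption made only for illustration):

    S = - k_B * sum_i p_i * ln p_i  =  k_B * H    (H in nats, for the whole system)
    S = k_B * N * H                               (H per particle, N identical independent particles)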

What is essential is the specification of the space (the probabilities). Physical
entropy is the uncertainty in identifying the physical state in state space
(phase space, or the quantum eigenstates).
Shannon's entropy concept is in fact more general; it can be applied to
any set of probabilities. Applied to the appropriate physical
probabilities, it gives a measurable quantity, namely physical entropy.
There is no mystery at all. Physical entropy can be measured like mass
or energy.
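
A minimal numerical sketch of this point (the biased coin and the two-level
system below are invented examples for illustration; only the value of k_B is
a physical constant):

    import math

    k_B = 1.380649e-23  # Boltzmann constant in J/K -- fixes the unit

    def shannon_entropy(probs, base=math.e):
        """Shannon's H for any set of probabilities (dimensionless)."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # Shannon's formula applies to any probabilities, e.g. a biased coin:
    H_coin = shannon_entropy([0.9, 0.1], base=2)   # about 0.47 bits, no physics here

    # Applied to the probabilities of physical microstates (a hypothetical
    # two-level system with equal occupation), the same formula multiplied
    # by k_B gives the physical entropy, a measurable quantity in J/K:
    S = k_B * shannon_entropy([0.5, 0.5])          # = k_B * ln 2
    print(H_coin, S)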

Werner Ebeling

On Monday 07 Jun 2004 8:20 am, you wrote:
> > But, I would maintain that if Shannon's own work
> > concludes that some system characteristic isn't actually Shannon
> > entropy, then it's not. Or, if it satisfies Shannon's
> > criteria, then it must be Clausius' and Boltzmann's entropy,
> > as well as Shannon's.
>
> Dear Michael,
>
> I fail to follow that Clausius' and Boltzmann's entropy (S) and
> Shannon's H are identical if the relation between the two is:
>
> S = k * N * H
>
> The two would be identical only if k * N = 1.
>
> However, I understand the difference between S and H if N is considered
> as a measure of size which transforms the probability density
> function into a mass density function (as Aleks would express it), and k
> is the Boltzmann constant which takes care of the dimensionalization.
> These two factors thus provide the formula for S with a physical
> interpretation. H itself remains content-free.
>
> This difference between S and H cannot be defined away. Furthermore, H's
> mathematical character has consequences for using H in contexts other
> than physics. Perhaps, I am confused, but then please explain how and
> why you can state that S = H (as you seem to do in the above quotation).
>
> With kind regards,
>
> Loet
>
> _____
>
> Loet Leydesdorff
> Amsterdam School of Communications Research (ASCoR)
> Kloveniersburgwal 48, 1012 CX Amsterdam
> Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
> <mailto:loet@leydesdorff.net> loet@leydesdorff.net ;
> <http://www.leydesdorff.net/> http://www.leydesdorff.net/
>
> <http://www.upublish.com/books/leydesdorff-sci.htm> The Challenge of
> Scientometrics ; <http://www.upublish.com/books/leydesdorff.htm> The
> Self-Organization of the Knowledge-Based Society

-- 
-----------------------------------------------------------
Prof. W. Ebeling                ebeling@physik.hu-berlin.de
Humboldt-Universitaet Berlin    phone:  +49/(0)30-2093 7636
Institut fuer Physik            fax:    +49/(0)30-2093 7638
Invalidenstrasse 110
D-10115 Berlin                                                           
-----------------------------------------------------------
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
Received on Tue Jun 8 12:14:13 2004
