> But, I would maintain that if Shannon's own work
> concludes that some system characteristic isn't actually Shannon
> entropy, then it's not. Or, if it satisfies Shannon's
> criteria, then it must be Clausius' and Boltzmann's entropy,
> as well as Shannon's.
Dear Michael,
I fail to see how Clausius' and Boltzmann's entropy (S) and
Shannon's H can be identical if the relation between the two is:
S = k * N * H
The two would be identical only if k * N = 1.
However, I can understand the difference between S and H if N is taken
as a measure of size which transforms the probability density
function into a mass density function (as Aleks would express it), and k
is the Boltzmann constant which provides the dimensionality.
These two factors thus give the formula for S a physical
interpretation; H itself remains content-free.
This difference between S and H cannot be defined away. Furthermore, H's
mathematical character has consequences for using H in contexts other
than physics. Perhaps I am confused, but then please explain how and
why you can state that S = H (as you seem to do in the above quotation).
With kind regards,
Loet
_____
Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
loet@leydesdorff.net ; http://www.leydesdorff.net/
The Challenge of Scientometrics <http://www.upublish.com/books/leydesdorff-sci.htm> ;
The Self-Organization of the Knowledge-Based Society <http://www.upublish.com/books/leydesdorff.htm>