RE: [Fis] CONCLUDING THE SESSION

From: Loet Leydesdorff <loet@leydesdorff.net>
Date: Mon 13 Oct 2003 - 20:25:15 CEST

Dear John and colleagues,

This seems very advanced. It is not easy.

But let me try to recapitulate in terminology that I (as a chemist) can
follow more easily.

        ΔG = ΔH - T ΔS

If I understand you correctly, ΔS is only a dimensionless number in this
formula because T is already an energy. I had never understood this before.
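
To check this point for myself, here is a small Python sketch (the numerical
values are arbitrary and only for illustration): dividing an entropy expressed
in J/K by the Boltzmann constant indeed leaves a pure number.

    k_B = 1.380649e-23   # Boltzmann constant in J/K (exact SI value)

    Q = 1.0e-20          # heat exchanged, in joules (arbitrary illustrative value)
    T = 300.0            # temperature, in kelvin
    S = Q / T            # entropy change, in J/K
    print(S / k_B)       # ~2.4: a dimensionless number (nats) once T is counted as an energy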

I know that Brillouin and Szilard coupled the Shannon entropy to the
thermodynamic entropy via the Boltzmann constant (ΔS = kB ΔH, where H here
denotes the Shannon entropy and not the enthalpy), but I doubt that one has
to follow this physical interpretation of the Shannon formula, because one
can also give Shannon's formula a purely mathematical interpretation about
the distributedness of any system of reference. The entropy calculus is then
used as a statistics. The expected information content has meaning only
within the model of the system under study. (Physics is then a special case.)
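
As a minimal sketch of my own (any distribution, for any system of reference,
will do; the physical reading enters only through the factor kB):

    import math

    k_B = 1.380649e-23                             # Boltzmann constant, J/K
    p = [0.5, 0.25, 0.25]                          # any probability distribution
    H_bits = -sum(q * math.log2(q) for q in p)     # Shannon's expected information content, in bits
    H_nats = -sum(q * math.log(q) for q in p)      # the same quantity in nats
    S_physical = k_B * H_nats                      # Brillouin/Szilard reading: J/K
    print(H_bits, S_physical)                      # 1.5 bits; ~1.4e-23 J/K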

One can easily see the potential independence of the Shannon and the
thermodynamic entropy. For example, when billiard balls collide in an
ideal (frictionless) case, the change in thermodynamic entropy is zero
(because of the ideal case), but the redistribution of energy and momenta
is maximal, and thus a pronounced probabilistic entropy is generated in
these two dimensions. If the two quantities (ΔS and ΔH) were fully coupled,
with only the Boltzmann constant between them, the probabilistic entropy
would also have to be zero in the case of this collision without friction.
The system of reference for the physical collision, and consequently for the
thermodynamic entropy, is however different from the system of reference for
the redistribution of momenta and energy. The latter is an
information-theoretical description of the system and not the physical
system itself.
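
To make this example concrete in a small sketch of my own (a one-dimensional,
frictionless elastic collision; the masses and the velocity are arbitrary):
no heat is produced, so the thermodynamic entropy does not change, yet the
Shannon entropy of the distribution of kinetic energy over the two balls
rises from zero to about half a bit.

    import math

    def shannon_bits(p):
        # Shannon entropy of a probability distribution, in bits
        return -sum(q * math.log2(q) for q in p if q > 0.0)

    m1, m2, v = 2.0, 1.0, 1.0                    # ball 1 moving, ball 2 initially at rest
    v1 = (m1 - m2) / (m1 + m2) * v               # standard elastic-collision result (1D)
    v2 = 2.0 * m1 / (m1 + m2) * v
    E = [0.5 * m1 * v1**2, 0.5 * m2 * v2**2]     # kinetic energies after the collision
    shares = [e / sum(E) for e in E]             # distribution of energy over the two balls
    print(shannon_bits([1.0, 0.0]))              # 0.0 bits before the collision
    print(shannon_bits(shares))                  # ~0.50 bits afterwards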

With kind regards,

Loet

  _____

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
loet@leydesdorff.net ; http://www.leydesdorff.net/

 
The Challenge of Scientometrics: http://www.upublish.com/books/leydesdorff-sci.htm
The Self-Organization of the Knowledge-Based Society: http://www.upublish.com/books/leydesdorff.htm

> -----Original Message-----
> From: John Collier [mailto:Collierj@nu.ac.za]
> Sent: Monday, October 13, 2003 12:23 PM
> To: loet@leydesdorff.net; fis@listas.unizar.es;
> marijuan@posta.unizar.es
> Subject: RE: [Fis] CONCLUDING THE SESSION
>
>
> Joules are units of energy. Kelvin (temperature) measures
> average energy per degree of freedom. The units of both
> are energy, so entropy is dimensionless. Nonetheless, I
> agree that Shannon information and entropy should not be
> confused. The story is a longish one, but there is a way to
> bring the two into line. See my paper Causation is the
> Transfer of Information on my web page at
> http://www.nu.ac.za/undphil/collier/papers/causinf.pdf for my
> first attempt -- well, actually there is an earlier one
> written for a cog sci collection, Intrinsic Information:
> http://www.nu.ac.za/undphil/collier/papers/intrinfo.pdf
>
> I am
> now working on a book chapter that uses
> Barwise and Seligman, Information Flow, assumes all
> classifications are dynamical, and gets the same result as my
> causation paper as a corollary. The book chapter is part of a
> four chapter book arguing for dynamical/structural realism in
> which the nature of the world is given by the topology of
> information flows.
>
> John
>
>
> Professor John Collier
> Philosophy, University of Natal
> Durban 4041 South Africa
> T: +27 (31) 260 3248 / 260 2292
> F: +27 (31) 260 3031
> email: collierj@nu.ac.za
>
>
> >>> "Loet Leydesdorff" <loet@leydesdorff.net> 10/08/03 04:45PM >>>
>
> > Let me add that re-reading the recent messages, another future
> > discussion to organize should revolve around entropy -- the numerous
> > misunderstandings, misconceptions, etc. surrounding it, precisely in
> > its connection with Shannon (for instance, arguing with Loet,
> > thermodynamic entropy is indeed 'dimensionless': it has units, but no
> > dimensions, as 'temperature' itself has dimensions of energy -- see
> > John Collier's excellent posting in this list about the subject --
> > and so they cancel each other). I also would like to
>
> The probabilistic entropy (bits) should not be confounded with the
> thermodynamic entropy, which is defined in terms of Joules/Kelvin,
> should it? The system of reference in the thermodynamic case is the
> movement of particles in terms of energy and momenta.
>
>
>