RE: [Fis] entropy as average expected loss

From: Loet Leydesdorff <[email protected]>
Date: Wed 14 Apr 2004 - 03:12:55 CEST

> Clausius' (thermodynamic) entropy is very different from
> Shannon's entropy. Thermodynamic entropy is a narrow concept.
> Thermodynamic entropy is something one can measure. Shannon's
> entropy is instead a general one. It is a property of a
> probabilistic model. Boltzmann's work was trying to explain
> phenomenological entropy with a probabilistic model.

Yes, the Shannon entropy is the more interesting part, and let me explain
why. In my email of last Sunday I derived the equation:

S = k(B) N H

H (the Shannon entropy) is a measure of the dividedness (or of order,
if one adds a minus sign). It is a mathematical property of the system.
N and k(B) are constants that relate this dividedness to the size of a
physical system (N) and to the dimensionality of the thermodynamic
entropy, k(B) (measured in J/K), respectively. These two constants turn
the mathematical (statistical) entropy into a physical entropy.
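
As a minimal sketch in Python (the example distribution, the particle
number N, and the use of natural logarithms are my own illustrative
assumptions, not part of the derivation above), the relation can be
computed as follows:

  import math

  K_B = 1.380649e-23  # Boltzmann constant, in J/K

  def shannon_entropy(p):
      # H = -sum_i p_i ln p_i, in nats (natural logarithm)
      return -sum(pi * math.log(pi) for pi in p if pi > 0)

  p = [0.5, 0.25, 0.25]     # hypothetical distribution over microstates
  N = 6.02214076e23         # e.g. one mole of particles (an assumption)

  H = shannon_entropy(p)    # dimensionless dividedness
  S = K_B * N * H           # physical entropy, in J/K
  print(H, S)

H itself is dimensionless; only the multiplication by N and k(B) turns
it into a quantity measured in J/K.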

The Shannon entropy is more general because it can be applied to
systems other than the physical one. It abstracts from the
dimensionality of the thermodynamic system (J/K) and from its size (N),
and thus provides us with a general measure of order. Before we can
apply this measure of order, however, we need a system of reference,
just as we chose the physical system as the system of reference by
introducing N and k(B). The choice of the system of reference provides
the measurement of the information (using the Shannon formulas) with
meaning. For example, if we choose the physical system as the system of
reference and use S instead of H, we give the concepts meaning within
the discourse of physics. We can then measure the thermodynamic
entropy; the factor k(B) guarantees that the macroscopic entropy
(Clausius-Clapeyron) equals the entropy of statistical mechanics. The
discourse of physics is thus made consistent by the Boltzmann constant.

When one applies the formalisms of Shannon's entropy to other systems
of reference, these systems of reference provide substance to the
otherwise purely formal equations. For example, one can apply the
formalisms to the circulation of money in an economy. The particles are
then not physical particles, but quantities of money. One may wish to
measure these quantities of money in, for example, US dollars or Euros,
and one then needs to introduce other constants to provide the
equations with a substantive interpretation. Similarly, we can use the
measure to indicate the dividedness in the school system (e.g.,
segregation); because of its marvellous mathematical richness, the
formalism (of statistical decomposition analysis) then allows us to
disaggregate the dividedness precisely in terms of classes in the
school, in terms of regions, etc. This mathematical richness is
available because Shannon chose H; the algorithms of physics can thus
be used as a heuristic for the computation in other systems, without
having to be derived anew.
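
To illustrate that decomposition, here is a hedged sketch (the pupil
counts, schools, and regions below are purely hypothetical): the total
dividedness equals the between-group entropy plus the size-weighted sum
of the within-group entropies.

  import math

  def H(p):
      # Shannon entropy, in bits, of a probability distribution
      return -sum(pi * math.log2(pi) for pi in p if pi > 0)

  # hypothetical pupil counts per school, grouped into two regions
  regions = {
      "region A": [120, 80, 40],
      "region B": [60, 60],
  }

  total = sum(sum(v) for v in regions.values())
  p_all = [n / total for v in regions.values() for n in v]  # all schools
  P_g = {g: sum(v) / total for g, v in regions.items()}     # region shares

  H_total = H(p_all)
  H_between = H(list(P_g.values()))
  H_within = sum(P_g[g] * H([n / sum(v) for n in v])
                 for g, v in regions.items())

  print(H_total, H_between + H_within)  # the two values coincide

The same disaggregation can be repeated at lower levels (e.g., classes
within schools), which is what makes the formalism so convenient for
such studies.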

For example, one can "freeze" the system and thus determine the number
of heterogeneous circulations that interact in the system under study.
In a previous email I already mentioned this application with a
reference to Smolensky (1986). In automatic reading, for instance, one
can construct a so-called Boltzmann machine that reads in terms of
(e.g.) 26 distinct characters because 26 circulations are orthogonal in
this system (as attractors). The orthogonal dimensions co-vary at
temperatures higher than "zero degrees K", and this interaction can be
measured using the mutual information. I placed quotation marks around
"zero degrees K" because we would still have to specify the equivalent
of this physical notion in terms of the specific system(s) under study.
This specification requires a substantive discourse or, equivalently, a
specific theory of communication about the system under study (a
hypothesis).
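
As a hedged illustration of measuring that interaction, the mutual
information between two such dimensions can be computed from their
joint distribution; the joint probabilities below are invented and
merely stand in for observed co-occurrences:

  import math

  def mutual_information(joint):
      # I(X;Y) = sum_xy p(x,y) * log2[ p(x,y) / (p(x) p(y)) ], in bits
      px = [sum(row) for row in joint]
      py = [sum(col) for col in zip(*joint)]
      return sum(pxy * math.log2(pxy / (px[i] * py[j]))
                 for i, row in enumerate(joint)
                 for j, pxy in enumerate(row) if pxy > 0)

  # hypothetical joint distribution over two binary dimensions
  joint = [[0.30, 0.20],
           [0.15, 0.35]]
  print(mutual_information(joint))  # > 0: the dimensions co-vary

When the dimensions are fully orthogonal (the joint distribution
factorizes), the mutual information is zero; that would be the formal
counterpart of the "zero degrees K" mentioned above.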

Thus, Shannon's choice to equate H with the notion of entropy was very
fortunate. It made the rich domain of equations and algorithms that had
been studied during the previous century available to the study of the
dividedness/order/organization of systems other than physical ones. All these
forms of organization (and self-organization) can be studied in terms of
their probabilistic entropy. The formal measurement remains in bits of
information, but the substantive interpretation requires the
specification of the substance which is distributed (circulating) in the
system under study. Each specific substance circulating provides us with
a special theory of communication (a hypothesis), while the information
calculus provides us with a mathematical formalism to study these
processes of communication. From this perspective, physics can be
considered as the special theory of communication in which particles are
assumed to be circulating and colliding in terms of their (conserved)
energy and momenta. The formalism then enables us to compute the
dissipation in these otherwise conservative systems. Other systems
(e.g., living systems) may be non-conservative by their very nature. The
substantive theories can be expected to differ. They form the monads,
which are connected by the formalisms that provide us with a
heuristic.

With kind regards,

Loet

  _____

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
loet@leydesdorff.net; http://www.leydesdorff.net/
 
The Challenge of Scientometrics:
http://www.upublish.com/books/leydesdorff-sci.htm
The Self-Organization of the Knowledge-Based Society:
http://www.upublish.com/books/leydesdorff.htm