RE: [Fis] Replies and new questions / part 1

From: Loet Leydesdorff <loet@leydesdorff.net>
Date: Sat 10 Apr 2004 - 08:31:21 CEST

> If order (nonsymmetry) is represented by information I,
> disorder (symmetry) by entropy S, you are talking about I/N
> and S/N or about additivity or extensivity. For convenience,
> let us put L=I+S. If the structure is not changed, I/S will
> not change if only N increases. This means I and S will
> increase or decrease together.
[...]
> In many cases, additivity does not hold true. For example, if
> we assess the biodiversity of microorganisms. Among the
> 1000000 samples (N=1000000) we found a new bacterium. Its
> information is I=log(1000000) per sample for this sample. This
> means we calculate the maximum information (its symbol has
> been put as L, L=I+S) as L = x log N = N log N, where the variable x=N.
 

In other words: L is the maximum information, S is the (Shannon) entropy
H, and I (information) is equal to what is usually called the
redundancy. I am not in favour of this confusion of terms, particularly
because S is commonly used for the thermodynamic entropy (as distinct
from the Shannon entropy H), and I is used by some authors (e.g., Theil,
1972) for the dynamic extension of Shannon's H. I'll follow the standard
notation below.

Let me provide some relevant derivations.

1. statistical entropy (Boltzmann)

Boltzmann explained that entropy can be provided with a statistical
interpretation (in addition to the traditional one of Clausius). The
entropy of a system (S) is related to the number of equally possible
states (W) via the Boltzmann constant k(B), as follows:

    S = k(B) ln(W)    (1)

k(B) is the so-called Boltzmann constant: k(B) = 1.381 x 10^-23 J/K.
Note the dimensionality: the number of possible states (W) is only a
dimensionless number, while S is also defined as Q/T (in classical
thermodynamics). It can be derived that the number of equally possible
states W is equal to N! / ( N(1)! N(2)! ... N(i)! ) and therefore:

    ln(W) = ln(N!) - Sigma(i) ln(N(i)!)    (2)

(i) is the ith compartment and N(i) is the number of particles in that
compartment.
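
For instance (a minimal sketch in Python; the example of N = 10
particles over three compartments with N(i) = 5, 3, 2 is arbitrary):

    import math

    # hypothetical example: N = 10 particles over three compartments
    N_i = [5, 3, 2]
    N = sum(N_i)

    # W = N! / ( N(1)! N(2)! ... N(i)! )
    W = math.factorial(N)
    for n in N_i:
        W //= math.factorial(n)

    print(W)    # 2520 equally possible states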

It can be shown (Stirling's approximation, for large N) that

    ln(N!) = N ln(N) - N (3)
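
A quick numerical check of this approximation (a sketch in Python;
N = 1000 is an arbitrary choice):

    import math

    N = 1000
    exact    = math.lgamma(N + 1)        # ln(N!) computed exactly
    stirling = N * math.log(N) - N       # N ln(N) - N

    print(exact, stirling)               # ~5912.1 vs ~5907.8; the relative
                                         # error shrinks as N grows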

It follows:

ln(W) = ln(N!) - Sigma(i) ln(N(i)!)

      = (N ln(N) - N) - Sigma(i) ( N(i) ln(N(i)) - N(i) )

      = N ln(N) - Sigma(i) N(i) ln(N(i))        [since Sigma(i) N(i) = N]

Then:

-ln(W) = Sigma(i) N(i) ln(N(i)) - N ln(N)

(-ln(W)) / N = Sigma(i) (N(i)/N) ln(N(i)) - ln(N)

If f(i) = N(i)/N, then:

(-ln(W)) / N = Sigma(i) f(i) ln(N(i)) - Sigma(i) f(i) ln(N)   [since Sigma(i) f(i) = 1]

             = Sigma(i) f(i) ln( N(i)/N )

             = Sigma(i) f(i) ln(f(i))

and therefore:

-ln(W) = N Sigma(i) f(i) ln(f(i))

Statistical entropy is thus defined:

        S = - k(B) N Sigma(i) f(i) ln(f(i))

        S = k(B) N H    with    H = - Sigma(i) f(i) ln(f(i))

H is Shannon's probabilistic entropy. Note that H is dimensionless; it
can therefore be applied to other systems as a statistic. The relation
expresses that a change of the distribution may cost energy.
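
A small sketch (Python; the relative frequencies f(i) = 0.5, 0.3, 0.2
and N = 100000 are made-up numbers) verifying that k(B) ln(W) approaches
k(B) N H for large N:

    import math

    k_B = 1.381e-23                         # Boltzmann constant, J/K
    f   = [0.5, 0.3, 0.2]                   # hypothetical relative frequencies f(i)
    N   = 100000                            # number of particles (illustrative)
    N_i = [round(fi * N) for fi in f]

    # exact: ln(W) = ln(N!) - Sigma(i) ln(N(i)!)
    ln_W = math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in N_i)

    # Shannon entropy H = - Sigma(i) f(i) ln(f(i))
    H = -sum(fi * math.log(fi) for fi in f)

    print(ln_W, N * H)                      # nearly equal; the difference is
                                            # the Stirling correction
    print(k_B * ln_W, k_B * N * H)          # statistical entropy S in J/K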

However, not all energy changes imply changes in the information flux;
hence the Szilard-Brillouin relation:

    Delta S >= k(B) Delta H

2. probabilistic entropy (Shannon)

H is a measure of the expected information content of a probability
distribution at each moment in time. If the distribution is
equiprobable, H is maximal, because:

    p(i) = f(i) / N

    H = - Sigma(i) p(i) log p(i)

      = - Sigma(i) (f(i)/N) log( f(i)/N )

      = Sigma(i) (f(i)/N) log(N) - Sigma(i) (f(i)/N) log f(i)

      = log(N) - Sigma(i) (f(i)/N) log f(i)

Thus, the maximum information content of a distribution is log(N); this
maximum is reached in the case of equiprobability.
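
A minimal sketch (Python; the frequencies f(i) are invented for the
illustration) confirming that the direct and the decomposed forms
coincide, and that the maximum log(N) is reached at equiprobability:

    import math

    f = [1, 1, 1, 1]                        # N = 4 samples, all in different categories
    N = sum(f)

    H_direct = -sum((fi / N) * math.log(fi / N) for fi in f)
    H_decomp = math.log(N) - sum((fi / N) * math.log(fi) for fi in f)
    print(H_direct, H_decomp)               # both equal log(4): the maximum

    f2 = [97, 1, 1, 1]                      # N = 100, 'asymmetrical' frequencies
    N2 = sum(f2)
    print(-sum((fi / N2) * math.log(fi / N2) for fi in f2))   # ~0.17, far below log(100)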
When the probabilities are "asymmetrical", the information content is
lower. This may sound counter-intuitive, but the interpretation is as
follows.

The distribution {1,1} or {50,50} provides us with:

        H = - 0.5 log2(0.5) - 0.5 log2(0.5) = 0.5 + 0.5 = 1 bit of information

The distribution {1,0} provides us with:

        H = - 1 log2(1) - 0 log2(0) = 0 + 0 = 0 bits of information.

In the latter case we are completely certain in the prediction that the
one outcome is the case and the other is not; thus, there can be no
surprise. In the former case we are completely uncertain; thus,
information is expected.
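
The same two distributions in code (a small Python sketch; the term
0 log2(0) is taken to be zero by convention):

    import math

    def H_bits(p):
        # Shannon entropy in bits; terms with p(i) = 0 contribute nothing
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    print(H_bits([0.5, 0.5]))   # 1.0 bit: maximal uncertainty
    print(H_bits([1.0, 0.0]))   # 0.0 bits: complete certainty, no surprise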

This becomes intuitively clearer when one uses the dynamic extension of
the Shannon entropy: the expected information content of a change of the
distribution, that is, an event has happened and probabilistic entropy
has been generated:

    I = Sigma(i) q(i) log( q(i) / p(i) )

In this formula, q(i) represents the a posteriori distribution and p(i)
the a priori one. If one of the terms p(i) is zero, the prediction is
that something will not happen. If it nevertheless happens, and the
corresponding q(i) is therefore larger than zero, the event is a
complete surprise and the information becomes infinite. Emergence can
therefore not be predicted; it is a complete surprise.
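
A sketch of this measure (Python; the a priori and a posteriori
distributions are invented for the illustration):

    import math

    def dynamic_I(q, p):
        # I = Sigma(i) q(i) log2( q(i)/p(i) ), in bits; terms with q(i) = 0
        # contribute nothing, and a zero prior with a nonzero posterior
        # makes the information infinite (a complete surprise).
        I = 0.0
        for qi, pi in zip(q, p):
            if qi == 0:
                continue
            if pi == 0:
                return math.inf
            I += qi * math.log2(qi / pi)
        return I

    p = [0.7, 0.3, 0.0]          # a priori: the third event is deemed impossible
    q = [0.5, 0.3, 0.2]          # a posteriori: it happened nevertheless

    print(dynamic_I(q, p))                  # inf -- emergence could not be predicted
    print(dynamic_I([0.6, 0.4, 0.0], p))    # finite: no 'impossible' event occurred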

When we consider time as a degree of freedom, we can also invert the
time axis and then evaluate emergence from an ex post perspective. The
positions of q(i) and p(i) are then exchanged. One can then backtrack
the roots of the emergent order to the chaos (uncertainty) from which it
emerged. Therefore, Prigogine used the title "Order out of Chaos". The
emergent order can thus be evaluated quantitatively from an ex post
perspective. The ex post perspective, however, is knowledge-based
(unlike the ex ante perspective, which is based on the historically
given or natural order of the events).
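
Exchanging q(i) and p(i) in the same formula (a self-contained Python
sketch with made-up distributions) shows that the ex ante and ex post
evaluations generally differ, because the measure is asymmetrical:

    import math

    def I(q, p):
        # I = Sigma(i) q(i) log2( q(i)/p(i) ), with 0 log(0) taken as 0
        return sum(qi * math.log2(qi / pi) for qi, pi in zip(q, p) if qi > 0)

    before = [0.9, 0.1]          # hypothetical distribution before the event
    after  = [0.5, 0.5]          # hypothetical distribution after the event

    print(I(after, before))      # ex ante: prediction of the change (~0.74 bits)
    print(I(before, after))      # ex post: time axis inverted (~0.53 bits)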

A system which uses time as a degree of freedom can also be considered
an anticipatory system (Rosen, 1985). It uses a model of itself for
the prediction (from an ex post or reflexive perspective). These systems
can be studied empirically using information theory and they can be
simulated (Dubois). Aleks Jakulin has elaborated on these two options in
a previous mailing.

I hope that this contributes to the clarification of the foundation of
information systems as the topic of this list.
With kind regards,

Loet

  _____

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
loet@leydesdorff.net ; http://www.leydesdorff.net/
 
The Challenge of Scientometrics:
http://www.upublish.com/books/leydesdorff-sci.htm
The Self-Organization of the Knowledge-Based Society:
http://www.upublish.com/books/leydesdorff.htm

 