RE: [Fis] Probabilistic Entropy

From: Loet Leydesdorff <[email protected]>
Date: Fri 16 Apr 2004 - 08:42:19 CEST

> I expect, Loet, that by systems other than the physical one, you mean
> those systems not traditionally described by physics. You mention
> circulation of money in an economy, the division of classes in a
> school, the characters in a Boltzmann machine at temperatures above
> zero K, and living systems. I usually conceive of physical systems as
> tangible, observable ones; systems that impress our human senses. So,
> I tend to categorize all the systems you've mentioned as physical
> ones. I trust I understand your idea, Loet.

Dear Michael and colleagues,

My interest in Shannon's entropy lies particularly in the fact that it
enables us to study systems which we cannot observe so easily by other
means. For example, interhuman communication cannot easily be
operationalized in terms of observables, but it can be measured
excellently in terms of how the distribution of what is communicated
changes.
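
As a small illustration (the word frequencies and the two moments are
only invented for the purpose), one can compute the Shannon entropy
H = - sum_i p_i * log2(p_i) of the distribution of what is
communicated at two moments and compare:

    from collections import Counter
    from math import log2

    def shannon_entropy(counts):
        """H = - sum p_i * log2(p_i), in bits, for a frequency distribution."""
        total = sum(counts.values())
        return -sum((n / total) * log2(n / total) for n in counts.values())

    # Hypothetical word counts in an exchange at two moments (illustration only).
    words_t1 = "entropy entropy information meaning meaning meaning".split()
    words_t2 = "entropy information information meaning truth power".split()

    h1 = shannon_entropy(Counter(words_t1))
    h2 = shannon_entropy(Counter(words_t2))
    print(f"H(t1) = {h1:.3f} bits; H(t2) = {h2:.3f} bits; change = {h2 - h1:+.3f} bits")

The change in H then measures how much the distribution of what is
communicated has changed, without any claim about what the words refer
to.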

The easiest case is the distribution and redistribution of money and
commodities on the market. In sociology, money is considered one of
the symbolically generalized media of communication.
Symbolically generalized media of communication have value and provide
meaning to the exchanges. Truth and power are other such media. In
scientific communication, for example, we exchange in terms of
statements, but with reference to standards of validity (~ truth?).
Sometimes, we can use equations to accelerate the communication. This is
a highly symbolically mediated form of communication. The
"referentielles" can be studied as the eigenvectors of the networks of
communication emerging from the communications and reproduced by them.
But these "referentielles" -- I use this French word deliberately --
cannot be directly observed. They have the status of hypotheses.
However, entertaining the hypotheses may enable us to understand the
communication more than without them. They allow us to specify how the
observed communications deviate from the expected ones.
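
One possible way to operationalize this deviation -- only one choice
among several -- is the Kullback-Leibler divergence between the
observed distribution and the distribution expected on the basis of
the hypothesis:

    from math import log2

    def kl_divergence(observed, expected):
        """I = sum_i p_i * log2(p_i / q_i): bits of information generated when
        the expected distribution q is updated into the observed distribution p."""
        return sum(p * log2(p / q) for p, q in zip(observed, expected) if p > 0)

    # Hypothetical distributions over four categories of statements (illustration).
    expected = [0.25, 0.25, 0.25, 0.25]   # what the hypothesis leads us to expect
    observed = [0.40, 0.30, 0.20, 0.10]   # what is actually communicated

    print(f"I(observed : expected) = {kl_divergence(observed, expected):.3f} bits")

A value of zero would mean that the observed communication does not
deviate from the expectation at all.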

Let me take the opportunity to add the following:

Shu-Kun Lin has let me know (off-line) that his S/L is equal to
H/H(max) in terms of information theory. He prefers his own notation.
This can be confusing because S is often used for the Boltzmann
entropy, and S is not equal to H, since S = k(B) * N * H. However,
after the division the issue of dimensionality is resolved, and one
can discuss the percentage uncertainty in both languages (that of
thermodynamic entropy and that of probabilistic entropy).
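
In code, the point about the dimensionality can be shown as follows
(the distribution over four states and the value of N are only
assumptions for the illustration): H/H(max) is a dimensionless
fraction, whereas S = k(B) * N * H carries the dimension J/K through
the Boltzmann constant:

    from math import log, e

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def shannon_entropy(probs, base=2.0):
        """H = - sum p_i * log(p_i): bits for base 2, nats for base e."""
        return -sum(p * log(p, base) for p in probs if p > 0)

    probs = [0.5, 0.25, 0.125, 0.125]   # hypothetical distribution over four states
    n = 6.022e23                        # hypothetical N (one mole of particles)

    h = shannon_entropy(probs)          # probabilistic entropy, in bits
    h_max = log(len(probs), 2)          # maximum entropy, in bits
    relative = h / h_max                # H/H(max): the base of the logarithm cancels

    s = K_B * n * shannon_entropy(probs, base=e)   # thermodynamic entropy, in J/K

    print(f"H/H(max) = {relative:.3f} (dimensionless)")
    print(f"S = {s:.3e} J/K")

Because the base of the logarithm cancels in the ratio, S/S(max) and
H/H(max) are the same number, so that the percentage uncertainty can
indeed be discussed in both languages.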

However, it does not make much sense to discuss exchanges of words
like these in terms of their thermodynamics, because we are interested
in what the words mean. We are able to communicate the
meaning with words and sentences. The circulation of these
representations can be modeled as a probabilistic process.
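
As a toy version of such a model (the states and the transition
probabilities are entirely invented), one can let representations
circulate in a Markov chain and observe the distribution to which the
circulation converges:

    # Hypothetical "media" among which communications circulate (illustration only).
    states = ["money", "truth", "power"]
    transition = {
        "money": {"money": 0.6, "truth": 0.3, "power": 0.1},
        "truth": {"money": 0.2, "truth": 0.5, "power": 0.3},
        "power": {"money": 0.3, "truth": 0.3, "power": 0.4},
    }

    # Start from an arbitrary distribution and let the process run.
    dist = {s: 1.0 / len(states) for s in states}
    for _ in range(100):
        dist = {t: sum(dist[s] * transition[s][t] for s in states) for t in states}

    print({s: round(p, 3) for s, p in dist.items()})  # the stationary distribution

The entropy of this stationary distribution could then again be
compared with its maximum, as above.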

With kind regards,

Loet

  _____

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
loet@leydesdorff.net; http://www.leydesdorff.net/

The Challenge of Scientometrics: http://www.upublish.com/books/leydesdorff-sci.htm
The Self-Organization of the Knowledge-Based Society: http://www.upublish.com/books/leydesdorff.htm
Received on Fri Apr 16 08:43:28 2004
