Re: information as interaction

From: John Collier <[email protected]>
Date: Fri 31 May 2002 - 12:06:57 CEST

At 01:10 AM 31/05/02, James wrote:

>Finally, Jerry, I liked very much your discussion of information and
>thermodynamics, and I just wanted to add the following quote in support:
>
>"the Second Law and Shannon's Law are two different statements; what
>they have in common is a mathematical formalism. Such sharing of
>formalism is not at all unusual in theoretical physics: thus water waves
>and light waves obey the same differential equations---but nobody would
>conclude from this that light and ripples on water are the same thing."
>(Walter Elsasser, Reflections on a Theory of Organisms, Johns Hopkins
>UP, 1998, p. 46.)

Oh, this can be so misleading. First, water and light don't share the same
differential equations; some aspects of the equations are the same, others
aren't. Second, for a long time it was assumed that water waves and light
waves were both forms of mechanical energy, with light waves passing through
an ether. It was quite shocking to find that light waves do not require a
mechanical medium, and the ether was given up only when it became clear
that keeping it generates really severe problems for kinematics.

The most striking difference between thermodynamic entropy S and
Shannon entropy H is that the latter can decrease by passing through a
passive filter (process). According to the Second Law, S cannot
decrease except through the application of work (that is one version).
So if Shannon entropy and thermodynamic entropy were the same thing,
i.e., H = S by definition, then Shannon entropy could violate the
Second Law.
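
To make the filter point concrete, here is a minimal sketch in Python
(the four-symbol source and the two-to-one filter are my own toy
example): a passive, deterministic filter that merges symbols can only
lower the Shannon entropy of the output, with no work done anywhere.

    import math

    def shannon_entropy(probs):
        # Shannon entropy H in bits of a discrete distribution.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A four-symbol source with unequal probabilities.
    source = {'a': 0.4, 'b': 0.3, 'c': 0.2, 'd': 0.1}

    # A passive filter: a deterministic two-to-one merge of symbols
    # (a, b -> x and c, d -> y); nothing here does any work.
    filtered = {'x': source['a'] + source['b'],
                'y': source['c'] + source['d']}

    print(shannon_entropy(source.values()))    # ~1.846 bits
    print(shannon_entropy(filtered.values()))  # ~0.881 bits: H decreased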

What can we conclude from this? We can conclude that there are possible
applications of the Shannon formalism that are not applications of the
formalism of statistical mechanics. It does not follow that all applications
of Shannon entropy are not applications of the formalism of statistical
mechanics. In other words, there are some models of H that are not
models of S, in the (informal) model-theoretic sense. By informal here,
I mean the sense of model in which a situation or an application of
a theory can be a model (see, e.g., W. Stegmueller, The Structuralist
Approach to Theories, 1989).
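
A throwaway illustration of a model of H that is no model of S (the
text and the code are mine): the letter statistics of an English
phrase have a perfectly good Shannon entropy, but there is no
thermodynamic system anywhere in the calculation.

    from collections import Counter
    import math

    text = "information as interaction"
    counts = Counter(text)
    total = len(text)

    # Empirical symbol distribution and its Shannon entropy in bits.
    H = -sum((n / total) * math.log2(n / total) for n in counts.values())
    print(H)   # a model of H with no energy, hence no S, in sight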

As Jerry mentioned, a conversion factor is required at some point, since
all applications of statistical mechanics involve energy, but there is no
explicit role for energy in Shannon's theory. Jerry also mentioned the
significance of time. Jonathan Smith has shown that if you set up a
statistical approach to information along the lines of statistical
mechanics, ignoring energy issues (this is an abstraction from energy
and matter, basically), then the units of information come out as a rate
(1/t). This is not surprising: by dimensional analysis, temperature is an
average kinetic energy per degree of freedom. Take away the energy, and
you are left with a rate.
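
I do not have Smith's derivation in front of me, so take the following
as my own dimensional bookkeeping, offered only as a consistency check:

    [S]   = J/K      thermodynamic entropy
    [k_B] = J/K      so H = S/k_B is a pure number
    [h]   = J s      action
    [E/h] = J/(J s) = 1/s

Divide the energy scale out against an action scale, i.e., abstract the
energy away, and what is left over does indeed have the units of a rate.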

How to add the energy back in? It has been suggested for some seventy
years now that Boltzmann's constant is the appropriate conversion. It
works quite well if and only if one uses only applications of Shannon's
theory in which the "messages" are the complexions of a physical system.
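
For what it is worth, the conversion is easy to check on a toy system
(the N-spin example below is mine): when the "messages" are the
equiprobable complexions of N two-state spins, Shannon's H in bits
times k_B ln 2 reproduces Boltzmann's S exactly.

    import math

    k_B = 1.380649e-23   # J/K

    N = 100              # two-state spins; W = 2^N complexions
    S = k_B * math.log(2.0 ** N)   # Boltzmann: S = k_B ln W

    H_bits = N           # Shannon: N fair binary choices = N bits

    # One bit converts to k_B ln 2 joules per kelvin.
    print(S, H_bits * k_B * math.log(2))   # equal up to float rounding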

Interestingly, this is not enough to ensure that H = S, since there is
the small matter of sign. Brillouin and Schroedinger both argue
that the sign for information should be negative. This resolves the
spontaneous information decrease problem, and has a lot more
going for it as well.
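
Spelled out (my gloss on the Brillouin/Schroedinger move): write
information as negentropy, I = -S, so that dI = -dS. A spontaneous
decrease in information, dI < 0, is then just dS > 0, which is exactly
what the Second Law permits, and the passive filter above stops being
a counterexample.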

In any case, it is much too quick to say that Shannon's formalism
and Boltzmann's have only a formalism in common. There are
ways to bring them into line, and the resulting theories can be quite
interesting and productive. The problem is in understanding the
relations between different uses of the word "information", not
in getting the real Mr Information to stand up. There is no more
a real and unique information than there is a real and unique
energy, temperature, work, or, for that matter, meaning. All of
these concepts are relational in their use. None of them comes
uniquely in atoms. (Action does seem to come only in atoms,
which is worth reflecting on, but right now I just want to make the
point that the relational character of information is not strange or
unusual in this respect among general and fundamental concepts.
Energies as used in physics are always energy differences of a
specific kind; there is no such thing as energy without a form.
Depending on where one focuses attention, the energy of something
can be quite different, and the problem is to focus on the relevant
one. The same goes for the various uses of information.)

In any case, I think it would be as much a mistake to require
that all information must be intentional, or biological, or whatever,
as to require a priori that all energy must be mechanical. Let
us wait and find out.

John

----------
Dr John Collier john.collier@kla.univie.ac.at
Konrad Lorenz Institute for Evolution and Cognition Research
Adolf Lorenz Gasse 2 +432-242-32390-19
A-3422 Altenberg Austria Fax: 242-32390-4
http://www.kli.ac.at/research.html?personal/collier