Again on physics, entropy and information

From: <[email protected]>
Date: Thu 30 May 2002 - 14:52:13 CEST

Dear Werner and FISers,
You will find below some comments on what Werner wrote on May 30th.

>Dear FISers,

>I have been following our discussion for a while without interfering.
>After reading again about 100 pages of discussion
>I would like to make the following remarks:

>We should never forget in our discussion about basic problems that
>the information concept has a very solid foundation in physics,
>communication, computers, the internet, etc. We are living in an
>information society, and several concepts are simply fixed
>already; they cannot be traded freely.
>Anybody who forgets all this will be like Antaeus.
>Let me mention a few concepts which, as I believe, should not be
>thrown away lightheartedly:

>- information is a binary relation between a sender and a receiver
>(I cannot understand why several FISers like to consider information as
>a property of a system itself, like mass, energy, or momentum in physics.
>To me, and I believe to any engineer,
>information is binary, as interaction is in physics: it is a relation
>between two things),

As you write in your FIS 2002 contribution, there is also the "bound
information" as "potential information (like disks, tapes, books)".
I understand the "information as property of the system" as being
this naturally existing bound information. For example: the weight,
volume and atomic composition of a stone exist within the stone as
"being of the stone", but are not available as measured (interpreted)
figures.

>- flow of information between sender and receiver is connected
>with a decrease of uncertainty (why should we give up such an
>extremely fruitful concept due to Shannon? Communication networks,
>and by the way also our internet discussion, would break down.
>We should not ignore the reality of communication based on this).
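
(To put Shannon's "decrease of uncertainty" in one standard formula:
the information a received message Y carries about a source X is the
mutual information

    I(X;Y) = H(X) - H(X|Y),

i.e. the receiver's uncertainty about X before reception minus the
uncertainty remaining after it.)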

>- information flow is connected with entropy flow; this is expressed
>by the Boltzmann-Shannon formulae.
>(Entropy in physics is one of the foundations of any physical theory;
>the technical applications, from cars to refrigerators, are
>uncountable).
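
(For reference, the two formulae in question, in their standard forms:

    S = k_B ln W                    (Boltzmann)
    H = - sum_i p_i log2 p_i        (Shannon)

where W is the number of microstates and p_i the probability of
message i. Up to the constant and the base of the logarithm, they have
the same mathematical form, which is what connects entropy flow and
information flow.)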

>Of course there are many other aspects which are more or less fixed
>by practice.

>To me the unsolved problems are:

>- what is the exact relation between physical entropy and information
>entropy (there were very interesting remarks by Shu-Kun Lin on this,
>L = I + S, but things are much more complicated; by the way, the
>relation of information content to stability is not a clear concept
>either).
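
(If I read Shu-Kun Lin's proposal correctly, L stands for the maximum,
potential entropy of the system, S for its actual entropy and I for
the information, so that I = L - S: information is the part of the
potential entropy that is not realized as entropy.)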

>- is there any measure of the meaning of a message (this is indeed
>a very complicated question; Shannon's measure clearly expresses
>only the quantity, not the meaning)

Before talking about measuring the meaning of a message, I feel
we first need to define the meaning of a message/information.
One way to do this is to consider that meaningful information
is generated by a system submitted to a constraint. When the
system receives incident information that has some connection
with the constraint, the system generates meaningful
information that it will use to satisfy the
constraint (the system can also transmit the meaningful
information to other systems, if that is needed for the
satisfaction of the constraint). You can find some details on this
in my FIS 2002 contribution, and a rough sketch below.
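
To make the structure concrete, here is a minimal sketch in Python;
the class and field names are hypothetical, chosen only to illustrate
the idea, not taken from the FIS 2002 paper:

    # A system submitted to a constraint receives incident information.
    # If the information has a connection with the constraint, the
    # system generates meaningful information usable to satisfy it.
    class MeaningGeneratorSystem:
        def __init__(self, constraint):
            # The constraint the system must satisfy, e.g. "stay alive".
            self.constraint = constraint

        def is_connected(self, info):
            # Hypothetical test: does the incident information have
            # some connection with the constraint of the system?
            return self.constraint in info.get("relates_to", [])

        def receive(self, info):
            # No connection with the constraint: no meaning generated.
            if not self.is_connected(info):
                return None
            # The meaning is the connection between the information and
            # the constraint, oriented toward its satisfaction.
            return {"meaning": f"'{info['content']}' matters for "
                               f"'{self.constraint}'",
                    "use": "trigger action satisfying the constraint"}

    # Usage: a simple organism constrained to stay alive receives a
    # signal about its chemical environment.
    system = MeaningGeneratorSystem(constraint="stay alive")
    meaning = system.receive({"content": "acid detected",
                              "relates_to": ["stay alive"]})
    print(meaning)
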
Now, regarding the measure of the meaning: as you write, it has
obviously nothing to do with the Shannon measure of the quantity of
information. The only possibility I see is to somehow measure the
participation of the meaningful information in the satisfaction of
the constraint of the system. But this is not easy to quantify...
Any ideas?
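
(One purely speculative way to put it: if some quantity D measured how
far the system is from satisfying its constraint, the meaning of an
incident information could be valued as M = D_before - D_after, the
reduction of that distance once the meaningful information is used.
But defining D is of course the hard part.)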

>- the specific properties of information in biological/social systems

One could say that information incident on biological/social
systems can generate specific meanings in these systems,
depending upon the relation this information has with the
constraints of the systems. So the meanings generated in the
systems will depend upon two elements: 1) the nature of the
systems (their constraints to be satisfied), and 2) the
relation existing between the incident information and
the constraint of the system.

>- selforganization of information processing systems in the
>evolution of life.

>By the way, to me information is not a physical concept; it is not
>contained in the axioms of physics. There was no information before the
>first living system appeared on earth, and as far as I see, information
>processing is always at least indirectly connected with biological,
>social or technical systems.
>It would be very interesting for me to learn about an
>information processing system absolutely independent from life.

I agree that information exists only as part of something
(modulation of a signal, difference in chemical density,
any variation of energy).
Regarding information processing, I would look at the system on
which the information is incident. Does the system have an aim,
a target, a finality, an intention (i.e. a constraint that
must be satisfied)?
If yes, then I would consider that the system can process the
information (receive the information in order to use it, in other
words generate a meaning internally). If we consider evolution
(increasing complexity) as a finality (even locally), then I feel
it is possible to talk about information processing independent
of life. Would you agree?

>Summarizing, I think this discussion is extremely inspiring and useful,
>but sometimes we have to come back to our earth.

Yes. We need both our head in the sky and our feet on the earth.

Regards

Christophe Menant