Re: [Fis] A definition of Information (II)

From: Stanley N. Salthe <ssalthe@binghamton.edu>
Date: Wed 03 Mar 2004 - 22:17:47 CET

Pedro said:

> Turning to Stan's point on entropy and the living, as I said the other
>day, I disagree about the urbi et orbi 'relevance' of entropy in all
>realms of life. The
>emphasis has to be put, in my opinion, on the special 'informational'
>nature of the living, always in the making, adaptively combining
>production/degradation activities of its structures as a result of the
>communication with the environment. The "Creative Destruction" a la
>Schumpeter is occurring inside cells (protein degradation), in
>multicellular organisms (apoptosis) and brains (learning/forgetting), in
>economic systems, in cultures... but not basically as a result of entropy
>---rather as the hallmark of informational ways of existence, of being
>in-formable, keeping oneself in-formation, in self-production.
> -snip-
> Let me conclude emphasizing the need of a discussion in-depth upon the
>many corners and pockets around the information-entropy interrelationship.
>The fine exchanges we had last year between Igor and Shu-Kun, among
>others, should lead in the coming months to a more comprehensive
>axiomatics of 'information physics'; a convergence with the developments
>of other parties on 'information physics' looks feasible. Apologies that I
>cannot yet address other very interesting points about meaning in recent
>messages (Viktoras, Soeren, Steve, Rafael, Loet...).

My reply:
     While I do not deny the importance of the approaches Pedro
champions here, I wonder why he seems eager to deny the relevance of
mine. I will try to guess. The issue is final causality, which is where
I see the importance of physical entropy production lying. Nothing
would happen at all if it were not for the fact that the Universe is very
far from equilibrium, and acts continually to establish a new equilibrium.
This tendency is the Second Law. Finality was ruled out of science by
Francis Bacon, but that science has not had notable success in dealing with
complexity. I think we need to use complex causal analyses, like the
Aristotelian scheme, in order to understand complex systems. As an
example, suppose a complex system is metastable and ready to change, but
could go in any of several directions, each associated with a different
set of kinetics imposed by different informational boundary conditions.
Using the Second Law, we could then predict that the system will change
in whatever way makes its rate of entropy production (from external
energy gradients) the greatest possible consistent with the continuance
of the system. That is, the informational constraints will serve the
Second Law, which will choose among them.
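     To make this selection rule concrete, here is a minimal sketch in
Python; the pathway names, entropy production rates, and viability flags
are all invented for illustration:

    # Maximum-entropy-production selection among candidate pathways.
    # Each candidate pairs a rate of entropy production (arbitrary
    # units) with whether that pathway is consistent with the system's
    # continuance.
    pathways = {
        "pathway_A": {"entropy_rate": 3.2, "viable": True},
        "pathway_B": {"entropy_rate": 5.7, "viable": False},  # destroys the system
        "pathway_C": {"entropy_rate": 4.1, "viable": True},
    }

    # The Second Law "chooses": the fastest dissipator among the
    # pathways that still allow the system to persist.
    chosen = max(
        (name for name, p in pathways.items() if p["viable"]),
        key=lambda name: pathways[name]["entropy_rate"],
    )
    print(chosen)  # -> pathway_C

The point is only that the informational constraints supply the menu of
candidate kinetics, while the Second Law supplies the selection
criterion.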
     A second point has to do with informational entropy, or variety. It
has been shown that in an expanding or growing system variety must increase
even as information itself does. These two feed upon each other -- the
more information in a system, the more variety it can generate, and the
more variety there is, the more information can be established out of it.
It has been argued (e.g., by Brooks and Wiley) that the increase in variety in
an expanding system can be seen to be another aspect of the Second
Law. One way to see this is to note that wherever there are informational
constraints, there is a place where physical entropy may be produced. So,
the greater the variety of information (as in number of species in a biome)
the more different kinds of energy gradients can be dissipated
simultaneously.
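     A toy numerical illustration of this point (the state counts and
distributions are invented): as a system's state space expands, both the
maximum possible informational entropy and the realized entropy of an
uneven occupation of the states grow, and so does the gap between them,
which is one way to picture the room an expanding system gains for
storing information:

    import math

    def shannon(p):
        # Shannon entropy, in bits, of a discrete distribution
        return -sum(x * math.log2(x) for x in p if x > 0)

    for n in (2, 4, 8, 16):  # growing number of states ("species")
        # hypothetical occupation: one dominant state, the rest sharing
        # the remaining probability equally
        p = [0.5] + [0.5 / (n - 1)] * (n - 1)
        h_max = math.log2(n)  # capacity of the expanded state space
        h_obs = shannon(p)    # realized variety
        print(n, round(h_obs, 2), round(h_max, 2), round(h_max - h_obs, 2))

Both the realized entropy and the capacity grow with n, so variety and
the room for information increase together, as claimed above.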
     Now, if these understandings are not news at all, then of course there
is no need to speak further about entropy, and we can focus instead upon
the details of informational constraints as they inform kinetics of various
kinds in all sorts of systems.

STAN

_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis