Re: [Fis] bioinformation and entropy

From: by way of <[email protected]>
Date: Thu 13 May 2004 - 09:53:56 CEST

Subject: Re: [Fis] bioinformation and entropy
Sent: 5/13/04 1:12 AM

Dear Pedro et al,

If I may be so bold as to join the discussion at so late a date, I would
like to take up the issue of quantum entropies and biocomputing. I have
been feasting on this very stimulating discussion for some weeks now. Pedro
had suggested I weave in some of the late Michael Conrad's work on entropies
in biological systems.

The remarks of Michael Devereux on the Shannon H make a nice lead-in here.
One of the recurring features of Michael Conrad's work was his use of
Shannon entropy and related functions. He originally used it to discuss
statistical and hierarchical aspects of ecosystem organization, but then
went on to connect to deeper theories about the limits and tradeoffs of
information processing in nature. As his student in the 1980s, I was
fascinated by the latter in particular, and had many conversations with him
about it, but never quite understood his fixation on H in this context.
Then, about 2000, ironically about a month after his death, I noticed how a
connection with quantum (von Neumann) entropy seemed to give it greater
interest. I wrote a little piece on it in Biosystems (2002), and, if you
will let me give a rough short summary of it here, I think it will connect
smoothly to some of the ongoing discussions.

In this part of his work, Michael would speak of biological systems as very
large but finite Markov systems, flickering probabilistically through many
states. He partitioned the system into "biota" and "environment", say B and
E. He would speak of the entropy of the biota as the familiar

H(B) = - Sum_b p(b) log p(b),

summing over all biological states b. Similarly, one defines H(E) and the
joint entropy H(BE). Conditional entropies are defined as

H(E|B) = H(BE) - H(B),

and so on.
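
(As a concrete illustration, and with apologies for anachronism, here is a
minimal Python sketch of this bookkeeping. The joint distribution p(b,e) is
invented purely for the example, and B and E each get just two states.)

    import numpy as np

    def H(p):
        # Shannon entropy in bits; the convention 0 log 0 = 0 is enforced
        # by dropping zero-probability entries.
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    # Hypothetical joint distribution p(b, e): rows index biota states b,
    # columns index environment states e. Entries sum to 1.
    p_be = np.array([[0.30, 0.10],
                     [0.05, 0.55]])

    H_BE = H(p_be)                # joint entropy H(BE)
    H_B  = H(p_be.sum(axis=1))    # marginal entropy H(B)
    H_E  = H(p_be.sum(axis=0))    # marginal entropy H(E)
    H_E_given_B = H_BE - H_B      # conditional entropy H(E|B)
    H_B_given_E = H_BE - H_E      # conditional entropy H(B|E)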

A crucial move was to define the "adaptability" of a biological system as
the maximal value of H(E) such that B (life) was sustainable. He called this
H(E!), E! coming to mean the "most stressed" environment and B! the most
stressed biota. His next step, the heart of this theory, was to interpret a
very simple formula in a very creative way. The formula is easily derivable
from the definitions:

H(E!) = H(B!) + H(E!|B!) - H(B!|E!).
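
(The derivation: H(BE) = H(B) + H(E|B) and also H(BE) = H(E) + H(B|E);
subtract the second from the first and rearrange. Setting aside the stress
marks, which only select which distribution is in force, the identity can be
checked numerically by continuing the sketch above:)

    # H(E) = H(B) + H(E|B) - H(B|E): substituting the definitions gives
    # H(B) + (H(BE) - H(B)) - (H(BE) - H(E)), which collapses to H(E).
    lhs = H_E
    rhs = H_B + H_E_given_B - H_B_given_E
    assert abs(lhs - rhs) < 1e-12   # holds for any joint p(b, e)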

Just as philosophers interested in uncertainty will ponder the true meaning
of Bayes' theorem (which to mathematicians is a trivial consequence of the
definition of probability), Michael meditated at length on the
interpretation of this formula in the context of living systems and
information. He identified the terms as follows:

H(E!) : adaptability
=
H(B!) : biotic behavioral/dynamic diversity
+
H(E!|B!): environmental fluctuations
-
H(B!|E!) : autonomy of biota

He noted that since the last term was subtracted, greater autonomy led to
lower adaptability. (He applied this lesson in areas as grand as global
sociology!) The autonomy term he also associated with information processing
power. This led (supplemented by other, independent arguments) to his
Tradeoff Principle: that systems, natural or artificial, could not
simultaneously have high levels of programmability, adaptability, and
efficiency. One consequence was that he was very pessimistic about
capturing the essence of life in digital simulations.

What I noticed in 2000 was that if you reworked this using von Neumann
entropy rather than Shannon entropy, there was a nicely strange
interpretation available. First, you would have to view this as a truly
quantum system. We might be talking about some kind of neuronal cytoskeletal
system in a Hameroff-Penrose world where quantum entanglement is a source of
information processing power. (Koichiro has also written about entangled
quantum coherence in living systems, more convincingly I believe.) So the
joint system EB would be a tensor product, and the probabilities would be
given by a density matrix rho. The von Neumann entropy is just

S(rho) = - Tr(rho log rho) = - Sum_i p_i log p_i,

where now we sum over the eigenvalues p_i of rho. Conditional
entropies are defined the same way as in the Shannon case. But now,
interestingly, they may be negative. That is, for example, we can have in
certain situations H(B|E)<0. This, oddly, means the joint system BE may be
less uncertain than E alone. This will happen in entangled systems. Since
H(B!|E!) makes a negative contribution to adaptability, a negative-valued
H(B!|E!) suggests that quantum mechanics enables greater adaptability,
growing out of non-classical supercorrelations between living systems and
their environment.
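
(Again a toy Python sketch, entirely my own illustration: let B and E each
be a single qubit, which is of course nothing like a real biota-environment
pair, and put BE in a maximally entangled Bell state. Then S(BE) = 0 while
S(E) = 1 bit, so S(B|E) = S(BE) - S(E) comes out to -1 bit.)

    import numpy as np

    def S(rho):
        # von Neumann entropy in bits: -Sum_i p_i log2 p_i over the
        # eigenvalues p_i of the density matrix rho.
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]            # drop zero (and tiny negative) eigenvalues
        return float(-np.sum(p * np.log2(p)))

    # Bell state |phi+> = (|00> + |11>)/sqrt(2) on the joint system BE.
    phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
    rho_BE = np.outer(phi, phi)             # pure joint state

    # Reduced state of E: partial trace over B. Index order after the
    # reshape is (b, e, b', e'), so we trace out axes 0 and 2.
    rho_E = np.trace(rho_BE.reshape(2, 2, 2, 2), axis1=0, axis2=2)

    print(S(rho_BE))             # 0.0  (pure state)
    print(S(rho_E))              # 1.0  (maximally mixed qubit)
    print(S(rho_BE) - S(rho_E))  # -1.0 = S(B|E): negative, as claimed

The -1 bit here is exactly the sort of negative conditional entropy that,
in Michael's formula, would add to adaptability.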

Well, that's the rough overview. I realize that placing a heavy burden of
creative interpretation on simple formulas is not always intellectually
healthy. Still, I found this habit of Michael's to be very stimulating.

A while ago Pedro mentioned that:

> My personal opinion is that MOLECULAR RECOGNITION becomes the best
> systematic thread to follow in the labyrinthic paths of the cell

and suggested that this transcends mere "bioinformatics." Yes: molecular
recognition is where we leave the straightforward world of syntax (strings
of nucleotides or amino acids) and get into the tactile, dynamic, folding,
entangled, virtually uncomputable semantics of life. I would very much like
to come to understand what makes this molecular substrate of life so
amenable to evolution (whereas straight syntactic structures are not). It
seems that quantum entropy-related information measures must be key. I look
forward to being a somewhat more active participant here.

Kindest regards,
-- Kevin

P.S. Shu-Kun Lin asked a few weeks ago about "entropy" in ancient Greek.
According to Liddell and Scott, the standard scholarly Greek-English
lexicon, entropē means "a turning towards", but also a "dodge", and, in
Homer, "entropia" is a "trick"! Of course, it was re-coined by Clausius, who
patterned it after "en-ergy" (en-tropos literally as inherent
transformability) rather than after its original Greek meaning. But, with
demons and all, I think we can say the "trickiness" has not disappeared!

__________________________________
Kevin Kirby kirby@nku.edu
Stein Professor of Biocomputing
Mathematics & Computer Science, Northern Kentucky University
Highland Heights, KY 41099 USA

    
