Re: [Fis] Objects: Irreversibility of the Foundations

From: Robert Ulanowicz <ulan@cbl.umces.edu>
Date: Thu 07 Oct 2004 - 15:38:54 CEST

On Thu, 7 Oct 2004, Aleks Jakulin wrote:

> 1. Independent reinvention: another path to consilience
>
> Two years ago, I was trying to solve a particular problem in machine
> learning, my field of research. Looking around, I found a particular
> generalization of mutual information, which was constructed by my colleague.
> I ran numerous experiments, and it performed well. Then I tried to
> check if anyone thought of that before, and, empowered by Google, I was
> surprised to find virtually the same formula (re)invented independently with
> different names over the past 50 years in: biology (Quastler), psychology
> (McGill), information theory (Han, Yeung), physics (Kikuchi,Cerf&Adami
> arXiv:quant-ph/9605002), neuroscience (Brenner), robotics (Yairi et al), and
> closely related notions in game theory (Grabisch & Roubens), complexity
> theory (Gell-Mann), statistics (Darroch), and chemistry (Kirkwood). I could
> go on, but I wonder how many I have already missed. Most researchers,
> though not all, arrived at it by applying the inclusion-exclusion
> principle outside the strict context of set theory. That was my encounter
> with consilience: not the discovery that a method from one field applies
> in an unexpected domain, but the independent discovery of the same
> hypothesis (actually a tool) in a number of different domains. Isn't this
> the same?
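[The generalization Aleks describes, usually called interaction information or co-information, can be sketched by applying inclusion-exclusion to joint entropies. This is a minimal illustration, not the formulation of any one of the cited papers; sign conventions differ across the literature, which is part of why the reinventions go unnoticed.]

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy (in bits) of an empirical distribution."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

def interaction_information(xs, ys, zs):
    """Three-way interaction information via inclusion-exclusion:
    I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z)."""
    return (entropy(xs) + entropy(ys) + entropy(zs)
            - entropy(list(zip(xs, ys)))
            - entropy(list(zip(xs, zs)))
            - entropy(list(zip(ys, zs)))
            + entropy(list(zip(xs, ys, zs))))

# XOR is the classic example: any two variables look independent,
# yet the three together are fully determined.
xs = [0, 0, 1, 1]
ys = [0, 1, 0, 1]
zs = [x ^ y for x, y in zip(xs, ys)]
print(interaction_information(xs, ys, zs))  # -1.0 under this sign convention
```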

Aleks,

Mutual information is the information-theoretic homolog to conventional
correlation. It's not too surprising that you would find it applied in a
great many fields. It has been the cornerstone of my own work in ecology,
e.g.,

Ulanowicz, R.E. 1997. Ecology, the Ascendent Perspective. Columbia
University Press, New York.

The advantage that one acquires by working within the realm of information
is that it forces one (allows one?) to deal with the complement to mutual
information, namely conditional "entropy". The tradeoff and interplay
between mutual information and its complement reveal volumes about a
dynamic that has remained cryptic in conventional statistics.
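[For discrete variables the tradeoff is an exact partition: H(X) = I(X;Y) + H(X|Y), so whatever uncertainty in X is not resolved by Y remains as conditional entropy. A minimal sketch of this decomposition; the data here are illustrative, not drawn from the ecology work cited:]

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy (in bits) of an empirical distribution."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def conditional_entropy(xs, ys):
    """H(X|Y) = H(X,Y) - H(Y): the complement of I(X;Y) within H(X)."""
    return entropy(list(zip(xs, ys))) - entropy(ys)

xs = [0, 0, 1, 1, 0, 1, 0, 1]
ys = [0, 0, 1, 0, 0, 1, 1, 1]
# The decomposition H(X) = I(X;Y) + H(X|Y) holds exactly:
lhs = entropy(xs)
rhs = mutual_information(xs, ys) + conditional_entropy(xs, ys)
assert abs(lhs - rhs) < 1e-9
```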

Obviously, the homolog to conditional entropy must exist in the
conventional approach, but one's attention is not forced upon it. See:

Zorach, A.C. and R.E. Ulanowicz. 2003. Quantifying the complexity of flow
networks: How many roles are there? Complexity 8(3):68-76.

The best,
Bob

-------------------------------------------------------------------------
Robert E. Ulanowicz | Tel: (410) 326-7266
Chesapeake Biological Laboratory | FAX: (410) 326-7378
P.O. Box 38 | Email <ulan@cbl.umces.edu>
1 Williams Street | Web <http://www.cbl.umces.edu/~ulan>
Solomons, MD 20688-0038 |
--------------------------------------------------------------------------

_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
Received on Fri Oct 8 12:25:07 2004

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:47 CET