RE: [Fis] Objects: Irreversibility of the Foundations

From: Aleks Jakulin <jakulin@acm.org>
Date: Fri 08 Oct 2004 - 14:04:28 CEST

Robert wrote:
> Mutual information is the information-theoretic homolog to
> conventional correlation. It's not too surprising that you
> would find it applied in a great many fields. It has been the
> cornerstone of my own work in ecology, e.g.,
>
> Ulanowicz, R.E. 1997. Ecology, the Ascendent Perspective.
> Columbia University Press, New York.

Actually, I was not writing about mutual information. Both mutual and
conditional mutual information go back to Shannon, and since almost everyone
knows them, it would be wrong to speak of independent reinvention there. What
I was referring to is interaction information, a generalization of mutual
information to more than two variables. Judging by the number of different
names that different authors attached to the same quantity, and by the fact
that each of them considered it novel, it has been independently reinvented
several times. In their respective fields they all found it useful for some
purpose, which indicates that 3-way interactions are a practical conception.
Another such concept is total correlation. If anyone is interested in this
review, it is on pages 8-10 of my paper at
http://arxiv.org/abs/cs.AI/0308002
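
To make the distinction concrete, here is a small sketch in Python (my own
illustration for this list, not code from the paper) that estimates both
quantities from the empirical frequencies of three discrete variables. I use
the sign convention where positive interaction information means synergy; the
XOR example is there only because it is the textbook case of a purely 3-way
dependence:

    from collections import Counter
    from math import log2

    def entropy(*columns):
        """Joint Shannon entropy (in bits) of aligned discrete sequences."""
        n = len(columns[0])
        counts = Counter(zip(*columns))
        return -sum((c / n) * log2(c / n) for c in counts.values())

    def interaction_information(x, y, z):
        """I(X;Y;Z) = I(X;Y|Z) - I(X;Y); positive values indicate synergy."""
        return (entropy(x, y) + entropy(x, z) + entropy(y, z)
                - entropy(x) - entropy(y) - entropy(z)
                - entropy(x, y, z))

    def total_correlation(x, y, z):
        """C(X;Y;Z) = H(X) + H(Y) + H(Z) - H(X,Y,Z)."""
        return entropy(x) + entropy(y) + entropy(z) - entropy(x, y, z)

    # XOR example: every pair of variables looks independent, yet the
    # triplet is fully dependent, a purely 3-way interaction.
    x = [0, 0, 1, 1] * 25
    y = [0, 1, 0, 1] * 25
    z = [a ^ b for a, b in zip(x, y)]
    print(interaction_information(x, y, z))  # 1.0 bit
    print(total_correlation(x, y, z))        # 1.0 bit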

> The tradeoff and interplay between mutual
> information and its complement reveals volumes about a
> dynamic that has remained cryptic in conventional statistics.

Agreed, there has been too much fixation on linear multivariate normal
models in statistics. My current work attempts to reconcile statistics, both
frequentist and Bayesian, with these information-theoretic notions, by
interpreting entropy either as a utility/loss function or as a log-likelihood.
This way we can also put error bars and confidence intervals on
information-theoretic quantities, develop unbiased estimators of entropy and
mutual information, and test the significance of the hypothesis of positive
mutual information.
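
As a rough sketch of the flavour of this (plain resampling, not the particular
estimators I am developing), one can already put crude error bars on the
plug-in estimate of mutual information with the bootstrap, and test the
hypothesis of positive mutual information with a permutation test; the toy
data at the end are purely hypothetical:

    import random
    from collections import Counter
    from math import log2

    def mutual_information(x, y):
        """Plug-in estimate of I(X;Y) in bits from aligned discrete sequences."""
        n = len(x)
        px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
        return sum((c / n) * log2(c * n / (px[a] * py[b]))
                   for (a, b), c in pxy.items())

    def bootstrap_interval(x, y, reps=1000, alpha=0.05):
        """Percentile bootstrap confidence interval for the plug-in estimate."""
        n = len(x)
        estimates = []
        for _ in range(reps):
            idx = [random.randrange(n) for _ in range(n)]
            estimates.append(mutual_information([x[i] for i in idx],
                                                [y[i] for i in idx]))
        estimates.sort()
        return estimates[int(reps * alpha / 2)], estimates[int(reps * (1 - alpha / 2)) - 1]

    def permutation_p_value(x, y, reps=1000):
        """P-value for the hypothesis I(X;Y) > 0: shuffling y simulates independence."""
        observed = mutual_information(x, y)
        y_perm = list(y)
        exceed = 0
        for _ in range(reps):
            random.shuffle(y_perm)
            if mutual_information(x, y_perm) >= observed:
                exceed += 1
        return (exceed + 1) / (reps + 1)

    # Hypothetical example data: y is a noisy copy of x.
    x = [random.randrange(4) for _ in range(500)]
    y = [v if random.random() < 0.7 else random.randrange(4) for v in x]
    print(mutual_information(x, y))
    print(bootstrap_interval(x, y))
    print(permutation_p_value(x, y))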

Terry wrote:
> Within mathematics, not even so basic a concept as number is
> immune. At least three entirely different metaphors for
> number exist --"objects in a set," "points on a line," and
> "units of measurement." Lakoff and Nunez show how entailments
> from each metaphor, plus blends between them, pave the way
> for entirely different branches of study.

True, and these divisions in mathematics cut deep trenches in other fields:
frequentist probability theorists perceive probability as something you do
with objects in sets, whereas statistics treats probability more as points on
a line (to be precise, measure theory dominates as the paradigm of choice).
The difficulty of reconciling quantum theory with relativity is also due to
this: relativity is geometric (points on a line, spacetime), while approaches
to quantum gravity are algebraic (objects in a set, discrete lattices). And I
could go on to Shannon's entropy, which is an essentially algebraic
conception, where you sum rather than integrate: it gets messy if you try to
apply it to geometric notions, such as symmetry. In this context, I'd
recommend reading the transcript of M. Atiyah's lecture
http://duch.mimuw.edu.pl/~sjack/atiyah.ps
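
A small numerical illustration of that messiness (my own toy example, a
uniform variable on [0,1)): the entropy of a binned continuous variable grows
without bound as the grid is refined, roughly as the logarithm of the number
of bins, whereas the differential entropy of the same variable is a fixed
0 bits; the sum and the integral simply do not measure the same thing:

    import random
    from math import log2

    def binned_entropy(samples, bins):
        """Shannon entropy (bits) of samples from [0, 1) cut into equal bins."""
        counts = [0] * bins
        for s in samples:
            counts[min(int(s * bins), bins - 1)] += 1
        n = len(samples)
        return -sum((c / n) * log2(c / n) for c in counts if c)

    random.seed(0)
    samples = [random.random() for _ in range(100000)]
    for bins in (4, 16, 64, 256):
        # prints roughly log2(bins) bits: about 2, 4, 6, 8
        print(bins, round(binned_entropy(samples, bins), 2))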

The main message here is that quantization and continuity cannot be easily
reconciled: they are separate views of reality, sometimes complementary,
sometimes redundant.

At this point, I must also thank Karl, who noted on this list that
enumeration is the mother of consilience; I should have acknowledged this in
the previous post.

Aleks
