Re: [Fis] Re: What is information ?

From: John Collier <[email protected]>
Date: Mon 03 Oct 2005 - 09:43:00 CEST
Folks,

Some remarks on formalization and its limits follow, interspersed.

At 03:15 AM 2005/10/03, Steven Ericsson Zenith wrote:
Dear Loet,

I understand your concerns, but how else are we to proceed? Shannon's model is not nullified, but it does not appear to characterize all that we would wish it to. I am not asking for a full rewrite; I am simply observing that we need to extend information theory into the area where we lack rigor - where the model seemingly needs to be extended. I am simply contending that we cannot deal with the notion of information in isolation.

In my view we need to develop two new models that complement current physical theory with the mathematical rigor of Shannon: a theory of organism (how sentient entities come to be) and a theory of semeiotics (how sentient entities operate) - where semeiotics includes a theory of communication and what I will call "memeiosis", which describes the exchange of "information" and the development of concepts by individuals in groups of sentient entities. Done rigorously, this is inevitably a mathematical theory of meaning, as you suggest.

For reasons why I doubt that a mathematical theory of meaning is possible, see Pragmatist Pragmatics: The functional context of utterances, forthcoming in Philosophica. Konrad Talmont-Kaminski and I argue that a formal pragmatics is impossible and that a theory of meaning requires pragmatics; therefore a formal theory of meaning is impossible. However, once the meaning of a communication is fixed, we can then use formal methods that are well known. The argument uses Barwise and Seligman's version of information flow, and draws on work in formal pragmatics by Montague, Kaplan and Stalnaker, the situation semantics of Barwise and Perry, and insights from Peirce. There is a draft version at http://www.nu.ac.za/undphil/collier/papers/pragmatist%20pragmatics.pdf
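
To give a flavour of the machinery: in Barwise and Seligman's framework, a classification pairs tokens with types, and an infomorphism between two classifications is the basic unit of information flow. The sketch below (in Python, with made-up tokens, types, and maps, not an example from the paper) shows just these two primitives and the fundamental property an infomorphism must satisfy.

    # A minimal sketch of two primitives from Barwise & Seligman's theory of
    # information flow: classifications and infomorphisms. The tokens, types
    # and maps below are invented for illustration, not taken from the paper.

    class Classification:
        """A classification: tokens, types, and a relation recording which
        tokens are of which types."""
        def __init__(self, tokens, types, holds):
            self.tokens = set(tokens)
            self.types = set(types)
            self.holds = set(holds)      # pairs (token, type)

        def classifies(self, token, typ):
            return (token, typ) in self.holds

    def is_infomorphism(f_up, f_down, A, B):
        """Check the fundamental property of an infomorphism f : A -> B,
        with f_up on the types of A and f_down on the tokens of B:
            f_down(b) |=_A alpha   iff   b |=_B f_up(alpha)."""
        return all(
            A.classifies(f_down(b), alpha) == B.classifies(b, f_up(alpha))
            for b in B.tokens
            for alpha in A.types
        )

    # A "bulb" system B carrying information about a "switch" system A.
    A = Classification({"s1", "s2"}, {"on", "off"},
                       {("s1", "on"), ("s2", "off")})
    B = Classification({"b1", "b2"}, {"lit", "dark"},
                       {("b1", "lit"), ("b2", "dark")})

    f_up = {"on": "lit", "off": "dark"}.get
    f_down = {"b1": "s1", "b2": "s2"}.get

    print(is_infomorphism(f_up, f_down, A, B))   # True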

As to "expected information can only be provided with meaning by information systems": I think this is problematic without a clear definition of what you mean by "information systems." From my point of view, computational models do not adequately account for sentience.

Nor for living systems in general if Robert Rosen's arguments in Life Itself are correct. He shows rigorously how a system can fail to be computable or mechanical in the sense of a terminating (Knuth) algorithm. Henri Atlan has made similar arguments, to the effect that certain molecular systems on which living systems depend have infinite "sophistication", which I take to be equivalent to Bennett's logical depth.
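
For readers less familiar with the term, Bennett's logical depth can be put roughly as follows (a standard textbook formulation; the significance parameter s, universal machine U, and prefix complexity K(x) are the usual notation, not Atlan's):

\[
\mathrm{depth}_s(x) \;=\; \min\{\, T(p) : U(p) = x \ \text{and}\ |p| \le K(x) + s \,\}
\]

that is, the least running time T(p) of any program p that outputs x and is within s bits of the shortest such program. On that reading, infinite "sophistication" corresponds to depth that is unbounded at every significance level s.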

Interestingly, there is a review article from Science on the early stages of sea urchin development that claims there are emergent properties of the developmental system at those stages, not reducible to the parts of the regulatory network. I have not been able to evaluate this claim myself. The article is A Genomic Regulatory Network for Development
Eric H. Davidson, Jonathan P. Rast, Paola Oliveri, Andrew Ransick, Cristina Calestani, Chiou-Hwa Yuh, Takuya Minokawa, Gabriele Amore, Veronica Hinman, César Arenas-Mena, Ochan Otim, C. Titus Brown, Carolina B. Livi, Pei Yun Lee, Roger Revilla, Alistair G. Rust, Zheng jun Pan, Maria J. Schilstra, Peter J. C. Clarke, Maria I. Arnone, Lee Rowen, R. Andrew Cameron, David R. McClay, Leroy Hood, Hamid Bolouri

Development of the body plan is controlled by large networks of regulatory
genes. A gene regulatory network that controls the specification of
endoderm and mesoderm in the sea urchin embryo is summarized here.
The network was derived from large-scale perturbation analyses, in combination
with computational methodologies, genomic data, cis-regulatory
analysis, and molecular embryology. The network contains over 40 genes
at present, and each node can be directly verified at the DNA sequence
level by cis-regulatory analysis. Its architecture reveals specific and general
aspects of development, such as how given cells generate their
ordained fates in the embryo and why the process moves inexorably
forward in developmental time.

Science, Vol. 295, 1 March 2002. www.sciencemag.org
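
As a toy illustration of the kind of structure involved (gene names and edges invented for exposition, not taken from the Davidson et al. network): if the regulatory interactions at a given stage are represented as a directed graph and that graph is acyclic, a topological ordering of the genes gives one formal sense in which specification can only move "forward in developmental time".

    # Toy regulatory network as a directed graph; gene names and edges are
    # hypothetical, not taken from the Davidson et al. network.
    from graphlib import TopologicalSorter   # Python 3.9+

    regulates = {                 # regulator -> set of target genes
        "geneA": {"geneB", "geneC"},
        "geneB": {"geneD"},
        "geneC": {"geneD"},
        "geneD": set(),
    }

    # TopologicalSorter expects each node's predecessors, so invert the edges.
    inputs = {g: set() for g in regulates}
    for regulator, targets in regulates.items():
        for target in targets:
            inputs[target].add(regulator)

    # Succeeds only if the regulation graph is acyclic: one formal reading of
    # specification moving "inexorably forward in developmental time".
    print(list(TopologicalSorter(inputs).static_order()))
    # e.g. ['geneA', 'geneC', 'geneB', 'geneD']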



Ultimately our theory of information must reduce to our model of nature - and today that model appears incomplete.

And if Atlan, Rosen and Hood are correct, then it must remain forever incomplete, for relatively easily understandable formal reasons.

With respect,
Steven



Loet Leydesdorff wrote:
Dear Steven,
 
I agree that there are obviously two usages of the word information: a group of definitions akin to Shannon's mathematical definition and a group of definitions which define information as "what in-forms" a system (Varela). In the latter case, the system invests some meaning in the information or, more generally, positions the information in its framework. Perhaps, one could use the word "observed information" for this, while the Shannon-information remains "expected information."
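
One way to make the distinction concrete in Shannon's own terms (an illustrative reading, not Loet's own formalism): "expected information" is the entropy, the expectation of the surprisal over all possible outcomes of a source, while the "observed information" of a particular outcome is the surprisal of that outcome, which is only available once a system has actually registered it.

    # Illustrative only: "expected" vs. "observed" information in Shannon's
    # terms. The source probabilities below are made up for the example.
    from math import log2

    p = {"a": 0.5, "b": 0.25, "c": 0.25}    # outcome probabilities of a source

    def surprisal(x):
        """Information carried by observing the particular outcome x (bits)."""
        return -log2(p[x])

    def entropy():
        """Expected information of the source: the expectation of the surprisal
        over all possible outcomes, available before anything is observed."""
        return sum(px * -log2(px) for px in p.values())

    print(entropy())        # 1.5 bits of expected information
    print(surprisal("b"))   # 2.0 bits carried by the observed outcome "b"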

Again, for an explanation of how these two ideas can be brought together and used as the core for at least four philosophically current versions of causal connection, see my 'Causation is the Transfer of Information', in Howard Sankey (ed.), Causation, Natural Laws and Explanation (Dordrecht: Kluwer, 1999): 279-331. A copy is available at http://www.nu.ac.za/undphil/collier/papers/causinf.pdf.

 
What I like best about Shannon's approach is its mathematical character, which frees us from specific semantics. When I read your mailings, for example, it seems that I have to buy a whole philosophy if I wish to understand them. From my perspective, this philosophy sounds like a meta-biology (unlike a meta-physics). Biological systems theory has helped us enormously in understanding how information can be stored in information systems and thus provided with meaning (by codification along the system's own axis). Social and psychic systems, however, can entertain (and perhaps communicate) horizons of meaning. Thus, they obviously have more degrees of freedom for processing information and meaning than biological systems (while the latter are also embodied?).

I am not at all sure that the latter have more degrees of freedom. The difference seems to me to be much more in terms of response time and extent (thought is faster than evolution; culture expands spatial extent, as does communication in general). A number of distinctions need to be made to sort this out, but Cliff Hooker and I made them in 'Complexly Organised Dynamical Systems', Open Systems and Information Dynamics, 6 (1999): 241-302. There is a copy available at http://www.newcastle.edu.au/centre/casrg/publications/Cods.pdf. For more detail about the reducibility issues, and a more direct connection to meaning, see my 'Autonomy in Anticipatory Systems: Significance for Functionality, Intentionality and Meaning', in Daniel M. Dubois (ed.), Proceedings of CASYS'98, The Second International Conference on Computing Anticipatory Systems (New York: Springer-Verlag, 1999). It can be found at http://www.nu.ac.za/undphil/collier/papers/casys98.pdf

It seems to me that we need a kind of mathematical theory of meaning. How can meaning be defined in the abstract, yet without giving it meaning with reference to any body of knowledge other than the abstract one which is contained, for example, in the mathematical theory of communication and its elaboration into non-linear dynamics? Let me make a first proposal: expected information can only be provided with meaning by information systems. I know that this circular definition begs the question, but it is only meant as a first step.

It is certainly not sufficient without being really question-begging. For one thing, complete formalization of meaning kills meaning. Meaning must be open-ended and fallible. For the argument for this, see my 'The Dynamical Basis of Information and the Origins of Semiosis', in Edwina Taborsky (ed.), Semiosis, Evolution, Energy: Towards a Reconceptualization of the Sign (Aachen: Shaker Verlag, 1999), Bochum Publications in Semiotics, New Series, Vol. 3: 111-136. It can be found at http://www.nu.ac.za/undphil/collier/papers/Dyninf3.pdf. The argument is basically Peircean, but with more modern formalism. If you understand semiotics, then you understand why a formal and complete theory of meaning is impossible. It is possible, however, to link up a formal but incomplete theory of information with a theory of meaning.

Sorry to harp here, but I keep feeling that we are going around in circles, repeating again and again issues that have either been solved or else have been shown to have no formal and complete solution. I mention my own papers mostly because I have taken a lot of trouble to review these issues. Everyone on the fis list should read Heinz von Foerster's 1960 paper 'On self-organizing systems and their environments'. There is a reprint in his collection Understanding Understanding. Forty-five years later we shouldn't still be reinventing the wheel this group seems to be spinning around on.

Grumpy, grumpy,

John



Professor John Collier                                     [email protected]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292       F: +27 (31) 260 3031
http://www.nu.ac.za/undphil/collier/index.html

