Re: [Fis] meaning of meaning: meaning of "information" and "entropy"

From: Dr. Shu-Kun Lin <[email protected]>
Date: Mon 09 Feb 2004 - 09:26:29 CET

Dear Pedro, Loet, and Soeren,

Let us get rid of the concept "negative entropy" (Schroedinger)
or "negentropy" (Brillouin) and use "information" only.

We are doing research. Doing research is data compression
(we discussed this before). One way of data compression
is to use a minimal number of terms. I like Soeren's discussion
on the meaning of "information". Let us accept
Wiener's argument that "these two concepts are synonymous".
Soeren, do you have the exact citation of Wiener's words?

Loet, could you please tell me the literature about
your "probabilistic entropy"?

Thanks!
Shu-Kun

-- 
Dr. Shu-Kun Lin
Molecular Diversity Preservation International (MDPI)
Matthaeusstrasse 11, CH-4057 Basel, Switzerland
Tel. +41 61 683 7734 (office)
Tel. +41 79 322 3379 (handy)
Fax +41 61 302 8918
E-mail: lin@mdpi.org
http://www.mdpi.org/lin/
Soeren Brier wrote:
> Dear Pedro
> 
> The model of meaning you suggest here strikes a very crucial problem in the discussion of meaning in a scientific context. It seems to me that your suggested model is:
> 1. Purely functional
> 2. Cybernetic in its foundation.
> I therefore think that it has some of the shortcomings I (Brier 1992, Cybernetics & Human Knowing 1:2/3) have analyzed in Gregory Bateson's definition of information as "a difference that makes a difference" in my paper "Information and Consciousness: A Critique of the Mechanistic Concept of Information" (see http://mail.cbs.dk/en/mail.html?sid=0rQrZh9uIVk&lang=en&cert=false). Bateson writes:
> "In fact, what we mean by information - the elementary unit of information - is a difference which makes a difference,...." (Bateson 1973, p. 428).
> 
> Allow me to quote from my own paper 
> 
> "To Bateson mind is a cybernetic phenomenon, a sort of mental ecology. The mental relates to the ability to register differences. It is an intrinsic system property. The elementary, cybernetic system with its messages in circuits is in fact the simplest mental unit, even when the total system does not include living organisms. Every living system has the following characteristics which we are accustomed to call mental:
> 
> "1. The system shall operate with and upon differences.
> 
> 2. The system shall consist of closed loops or networks of pathways along which differences and transforms of differences shall be transmitted. (What is transmitted on a neuron is not an impulse, it is news of a difference).
> 
> 3. Many events within the system shall be energized by the respondent part rather than by impact from the triggering part.
> 
> 4. The system shall show self-correctiveness in the direction of homeostasis and/or in the direction of runaway. Self-correctiveness implies trial and error." (Bateson 1973, p. 458)
> 
> Mind is synonymous with the cybernetic system which is comprised of the total, self-correcting unit that prepares information. Mind is immanent in this wholeness. When Bateson says that mind is immanent he means that the mental is immanent in the entire system, in the complete message circuit." (Brier 1992, p.79)
> 
> The problem with Bateson and your suggestion, I think, is that it can only work in a Peircean framework or some other kind of objective idealism. Bateson does not want to separate mind and intellect and talks about the intelligence of emotions. But when you look into his philosophical framework, he never left Wiener's idea that information is negative entropy. I quote my analysis again:
> 
> In "Mind and Nature" (1980 p.103) Bateson further develops his criteria for a cybernetic definition of mind:
> 
> "1. A mind is an aggregate of interacting parts or components.
> 
> 2. The interaction between parts of mind is triggered by difference, and difference is a nonsubstantial phenomenon not located in space or time; difference is related to negentropy and entropy rather than to energy.
> 
> 3. Mental process requires collateral energy.
> 
> 4. Mental process requires circular (or more complex) chains of determination.
> 
> 5. In mental process, the effects of difference are to be regarded as transforms (i.e. coded versions) of events preceding them. The rules of such transformation must be comparatively stable (i.e. more stable than the content) but are themselves subject to transformation.
> 
> 6. The description and classification of these processes of transformation disclose a hierarchy of logical types immanent in the phenomena."
> 
> These criteria are all well known within the cybernetic understanding of mind, and I will not discuss them further here. My critique is concentrated on the foundation of the second criterion: "difference is related to negentropy and entropy..."
> 
> It is problematic that Bateson follows Norbert Wiener's idea of a basic commonality between thermodynamics and Shannon and Weaver's theory of information, based upon a shared starting point in the concept of entropy.
> 
> Regarding the problem of the relation between the concept "information" and the concept "negative entropy", Bateson writes (Ruesch & Bateson 1968, p. 177):
> 
> "Wiener argued that these two concepts are synonymous; and this statement, in the opinion of the writers, marks the greatest single shift in human thinking since the days of Plato and Aristotle, because it unites the natural and the social sciences and finally resolves the problems of teleology and the body-mind dichotomy which Occidental thought has inherited from classical Athens".
> 
> However, Shannon's theory of information has never had anything to do with the semantic content of messages. Shannon and Weaver (1969, p. 31-32) write:
> 
> "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that they are selected from a set of possible messages."
> 
> So, what people and animals conceive as information is something quite different from what Shannon and Weaver's theory of information is about. As von Foerster (1980 p. 20-21) concludes:
> 
> "However, when we look more closely at these theories, it becomes transparently clear that they are not really concerned with information but rather with signals and the reliable transmission of signals over unreliable channels ..." " (Brier 1992, p. 82-83).
> 
> The cybernetic viewpoint fits nicely into a neo-Darwinian viewpoint in which meaning, for the organism as well as for the species, is 'survival value'. As such, cybernetic functionalism and the objective information concept fit nicely with neo-Darwinism.
> 
> But I am dissatisfied with the possibility of defining life, mind, emotions and meaning on this foundation, because there is no theory of qualia in this whole paradigm - a paradigm that is also prevailing in the attempts to make a 'science of consciousness'. My alternative, like Stan's and the biosemioticians', is to use Peirce's semiotic philosophy as the framework, because it includes a theory of qualia, mind, meaning and signification. What we discuss is how radically we want to interpret Peirce. Here Stan is the most radical in his pansemiotism, with biosemiotics on the other side. But the major disagreement is between the cybernetic informationalists and the Peircean semioticians. Cybersemiotics is an attempt to unite the two paradigms in a broader framework integrating the core knowledge of both.
> 
> But anyway this point connects FIS to the whole set of philosophical problems of mind and matter, qualia and meaning, and 'what is life?' - and I am not satisfied with Schrödinger's answer to that question.
> 
> 
>  
> 
> ----- Original Message -----
> From: "Pedro C. Marijuán" <marijuan@unizar.es>
> Date: Friday, February 6, 2004 2:21 pm
> Subject: Re: [Fis] meaning of meaning
> 
> 
>>Dear colleagues,
>>
>>Thinking on the cellular foundations of 'meaning' does not necessarily mean that there is a stumbling block of complexity that for the time being cannot be crossed. Even if the two extremes ---molecular details awfully networked in every direction, and the absence of a viable 'dynamic' whole scheme for the cell--- are still rather obscure, there are intermediate territories where one can point to some cellular 'doctrine' of meaning elaboration by the cell.
>>
>>Years ago I pointed out that filling in the 'functional voids' provoked by the incoming signal was the central response of the cellular productive machinery, involving both the synthesis of proteins and their degradation. However, putting this approach on some formal track ('informationally') was not easy at all. I much appreciate the ideas received from Shu-Kun's papers about molecular recognition & entropy of mixing, and also a very elegant message from Loet that I could not answer (fis: 20.12.03; I have pasted it below), so that a conceptual road to approach cellular processing of signals looms.
>>
>>The whole speculation would combine Robert Rosen's dynamic scheme on 'forcing' and enzyme networks with SK's approach, Loet's, and my functional-voids idea (handled by a population of enzymes working as 'molecular automata'). It looks too heterogeneous, but at least one gets a formal idea of how COMMUNICATION with the environment relates to changes in the advancement of a cycle, MEANING thus being derived from the entrenchment between GENERATIVE and STRUCTURAL forms of information whose unending processes weave and unweave the fabric of life.
>>
>>A big question would be whether philosophies germane to the above could be applied to 'meaning' in other informational realms - even to establish a natural background on mentality and the conceptual realm.
>>
>>best
>>
>>Pedro
>>---------------------------------------------------------------------------
>>
>>
>>Dear Pedro and colleagues,
>>
>>I take the liberty to fight this argument all the way since I wish it to be correctly understood. I agree with much of what you say and I appreciate the example of the enzyme regulating the flux efficiently. However, I think that it is important not only to distinguish between the flux of resources and the entropy that it generates, but also between the thermodynamic and the probabilistic entropy that is generated by this flux. Thus, there would be three systems of reference (theories) in the case of your example.
>>
>>The distinction is important because we are interested in information theory. Historically, authors from the side of physics and chemistry have attempted to subsume probabilistic entropy under thermodynamic entropy as a special case or a little fraction that can be fully understood using (chemical) physics. The argument then often becomes unclear because the authors do not specify the system of reference other than as "the" system. With "the" system, of course, they mean "nature". (One can show that the probabilistic entropy is smaller than the thermodynamic entropy times the Boltzmann constant.)
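The bridge between the two entropies that Loet parenthesizes here is usually written, in statistical mechanics, as S = k_B * H with H the probabilistic (Gibbs/Shannon) entropy in nats; this is a standard gloss, not a formula from the email. A minimal sketch, using an invented four-state distribution:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def shannon_entropy_nats(probs):
    """Probabilistic (Shannon/Gibbs) entropy H = -sum p*ln(p), dimensionless (nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Toy probability distribution over four microstates (illustrative only).
p = [0.5, 0.25, 0.125, 0.125]

H = shannon_entropy_nats(p)  # probabilistic entropy, ~1.213 nats
S = K_B * H                  # corresponding thermodynamic (Gibbs) entropy, J/K

print(H)  # ≈ 1.2130
print(S)  # ≈ 1.675e-23 J/K
```

Because k_B is so small, any macroscopically noticeable thermodynamic entropy corresponds to an astronomically large probabilistic entropy, which is one way to read the claim that the probabilistic entropy of a message is tiny compared with the thermodynamic entropy of the systems carrying it.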
>>Information theory, however, provides us with an entropy calculus that is in the first place independent of the system of reference. Therefore we can also study the probabilistic entropy in an economic system. Whenever something is communicated, a probabilistic entropy is generated by this redistribution. For example, in this exchange of emails we can count the threads, the mails, the words, etc., and compute in each dimension how much (probabilistic) entropy is generated. This is straightforward and it does not have anything to do with the thermodynamic entropy produced by, or the energy needed for, all the systems which carry the exchange. (One can compute a thermodynamic entropy of the exchange of messages, but that would not inform us at all about the exchange.)
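Loet's counting exercise can be sketched directly: tally the exchange along some dimension, turn the tallies into relative frequencies, and compute the Shannon entropy of that distribution. The thread names and per-author counts below are invented purely for illustration:

```python
from collections import Counter
import math

def entropy_bits(counts):
    """Shannon entropy (in bits) of the empirical distribution behind `counts`."""
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values() if n)

# Hypothetical tallies from an email exchange, one Counter per dimension.
mails_per_thread = Counter({"meaning of meaning": 7, "entropy": 3, "admin": 2})
mails_per_author = Counter({"Pedro": 4, "Loet": 4, "Soeren": 3, "Shu-Kun": 1})

print(entropy_bits(mails_per_thread))  # entropy generated in the "threads" dimension
print(entropy_bits(mails_per_author))  # entropy generated in the "authors" dimension
```

Each dimension yields its own entropy value, which is the sense in which the calculus is independent of any single system of reference: the choice of what to count is the hypothesis about what the system redistributes.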
>>I am sure that you are able to elaborate this for your example of the enzyme. The question one has to raise first when one studies the probabilistic entropy of a system is: what does the system redistribute when it communicates? This provides us with the specification of a hypothesis. (George Spencer Brown would call this an observation = a distinction + an identification, but that may be confusing.) Second, one can ask how one can indicate the communication. This provides us with an operationalization. Third, the measurement can inform us about the relative quality of the hypothesis. A system which operates in terms of energy redistribution can then be considered as a special case that requires a special theory (e.g., physics). But the one system cannot be reduced to the other without a specific theory (that can be tested!).
>>
>>Loet
>>
>>Loet Leydesdorff
>>Amsterdam School of Communications Research (ASCoR)
>>Kloveniersburgwal 48, 1012 CX Amsterdam
>>Tel.: +31-20-525 6598; fax: +31-20-525 3681
>>[email protected] ; http://www.leydesdorff.net/
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
Received on Mon Feb 9 09:31:21 2004

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET