Re: information

From: mark burgin <[email protected]>
Date: Fri 21 Jun 2002 - 22:46:51 CEST

Rafael Capurro wrote:

> Dear Mark Burgin,
>
> >With great interest I have read the Trialogue "Is a unified theory of
> information feasible?" Your lemma is especially interesting to me because
> I also research these problems...>
>
> this is fine, thanks. I will take a look at
> your paper. I believe, indeed, that we can
> understand information as a second order
> category (not as quality of things, but a
> quality we (!) ascribe to relationships between
> things) in the sense of *selection* that takes
> place when systems interact and choose
> from what is being offered (Luhmann). In this sense the
> classical concept of 'cause' (in a deterministic
> sense) is a special *effect* of *informational
> causation* in the sense of *to give rise to*. This is
> evident in the case of the *interpretation* of
> a text (with different possibilities of understanding
> its meaning) but it seems to be the case also
> in non-human levels of reality that were supposed
> (since modernity) to be only (!) related within
> a deterministic (clockwork-like) manner. This is
> what Hofkirchner and Fleissner call *actio non
> est reactio*.
> Birger Hjoerland (a Danish colleague) and myself
> suggest this view in a state-of-the-art
> "The Concept of Information" that will be published
> this year in the Annual Review of Information Science
> and Technology (ARIST).
> We should not forget that the *advancement
> of science* is sometimes due to the application of
> metaphors, i.e. of similar concepts from other fields,
> sometimes using the same token. In this sense I
> advocate for a network of *family resemblances* with regard
> to the uses of the information concept in different
> areas. This may sometimes not be necessary or useful
> (if we think, for instance, about concepts such as
> *mass*, *work*, *energy* etc.), but the plasticity of
> language allows us to make differences as well as to
> surpass them, when we think this may open new insights.
>
> kind regards
>
> Rafael

Dear Rafael,
Thank you for your very informative response to my e-mail. It would be
interesting to read your paper if you can send it to me. Although I have not
yet read your paper, below are some comments on your ideas.

Sincerely,
   Mark

> I believe, indeed, that we can understand information as a second order
> category (not as quality of things, but a quality we (!) ascribe to
> relationships between things) in the sense of *selection* that takes place
> when systems interact and choose from what is being offered (Luhmann).

Your "quality we ascribe" is actually generalized in the general theory of
information by "quality for a system" or "quality with respect to a system."
This makes the concept of information more objective and flexible. Here a
system may be not only a human being and not only a living being, but any
system that has a corresponding infological system. For example you can
easily imagine "information for a computer."

> In this sense the classical concept of 'cause' (in a deterministic sense)
> is a special *effect* of *informational causation* in the sense of *to
> give rise to*.

Yes, you are right. According to the general theory of information,
information is the cause of changes.

> This is evident in the case of the *interpretation* of a text (with
> different possibilities of understanding its meaning) but it seems to be
> the case also in non-human levels of reality that were supposed (since
> modernity) to be only (!) related within a deterministic (clockwork-like)
> manner. This is what Hofkirchner and Fleissner call *actio non est
> reactio*.

This is grounded and explained in more detail in the general theory of
information.

> Birger Hjoerland (a Danish colleague) and myself suggest this view in a
> state-of-the-art "The Concept of Information" that will be published this
> year in the Annual Review of Information Science and Technology (ARIST).

Hopefully, you will reflect in your paper not only definitions but also the
different theories of information (statistical, qualitative, semantic,
dynamic, algorithmic, etc.), which are more important than pure definitions.
For example, we know what an electromagnetic field is mostly from the
theory.

> We should not forget that the *advancement of science* is sometimes due to
> the application of metaphors, i.e. of similar concepts from other fields,
> sometimes using the same token. In this sense I advocate for a network of
> *family resemblances* with regard to the uses of the information concept
> in different areas. This may sometimes not be necessary or useful (if we
> think, for instance, about concepts such as *mass*, *work*, *energy*,
> etc.), but the plasticity of language allows us to make differences as
> well as to surpass them, when we think this may open new insights.

As I wrote before, information is not only a "family of resemblances" but
even more: it is a parametric family with a systemic parameter, which is
called the infological system. Besides, I do not think that understanding
what information is is a linguistic problem. Information, like substance or
energy, exists in the world, and we need theories to explicate the
properties and essence of this phenomenon.
Received on Fri Jun 21 22:47:54 2002

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET