[Fis] Re: What is the definition of information?

From: Shu-Kun Lin <[email protected]>
Date: Mon 12 Sep 2005 - 15:46:59 CEST

I have just finished reading the messages of this thread,
particularly the messages from sbr.lpf@cbs.dk on 4 September 2005,
from collierj@ukzn.ac.za on 2 September 2005, and
from dbar_x@cybermesa.com on 2 September 2005. Several
known definitions were discussed in these messages.

To narrow our definition, we may accept that

1. "Information (I) is a physical representation" (Rolf Landauer, cited
by Michael Devereux.

2. Information has a quantitative measure; its unit is the bit.
Of course it is measurable.

3. A higher amount of information is related to a higher value of:
--difference (this is my understanding of "a difference that makes
a difference");
--reduction of uncertainty (OK with me);
--complexity (Gregory J. Chaitin's examples; these need more comment);
--data after processing or refining (Marcin J. Schroeder's citations).

These all conform to my definition that "information is the
compressed data". Data processing or refining should be taken
as data compression.

The best possible data compression has a meaning similar to
Gregory J. Chaitin's "minimal program"
(http://www.cs.auckland.ac.nz/CDMTCS/chaitin/sciamer.html).
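
As a rough illustration of this correspondence, here is a minimal
Python sketch (my own, for illustration only; zlib merely stands in
for the uncomputable minimal program, so it gives only an upper
bound on the information in bits):

  import os
  import zlib

  def compressed_size_bits(data: bytes) -> int:
      # Length of the zlib-compressed encoding, in bits. Chaitin's
      # minimal-program length is uncomputable, so any practical
      # compressor can only give an upper bound on it.
      return 8 * len(zlib.compress(data, 9))

  regular = b"AB" * 4096      # 65536 raw bits of a trivial pattern
  random_ = os.urandom(8192)  # 65536 raw bits of incompressible noise

  print(compressed_size_bits(regular))  # a few hundred bits
  print(compressed_size_bits(random_))  # close to the full 65536 bits

The same raw data of L = 65536 bits can thus hold very different
amounts of information in the sense of compressed data.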

What is the data L (or raw data) measured in bits? It is a
property of a structure. What is a structure? A formatted
hard disk in use has a certain structure and holds raw data of L bits.
Many examples from mechanics or thermodynamics can be given.

The raw data L is conserved for many systems, and "three laws"
similar to the laws of thermodynamics expressed in terms of
energy and entropy can be formulated.

Best regards,
Shu-Kun

Søren Brier wrote:
> Dear Marcin J. Schroeder
>
> It is indeed a pity you could not come to Paris. I hope you can come to Salzburg next year, maybe we can come to Japan some year?
>
> I am very happy about your acknowledgement of the necessity of a deep philosophical approach, and I appreciate your use of Advaita Vedanta and Buddhism plus phenomenology.
>
> I also agree with the understanding of the many types of entropy-based definitions of information as tools for different purposes. We should write them down and classify them and their subject areas.
>
> But running through your FIS paper (I could not get to the one in Entropy) you only slightly touch upon a theory of consciousness, which I agree with you is necessary, but hardly on a theory of meaning, which I think is absolutely essential. This is why I have studied Peirce's semiotics for many years now.
>
> Discussing information we need both an ontology and an epistemology, and we need concepts of truth and meaning, as well as of quantity and quality, and most of all how we are going to relate them to each other. My strategy has been to give the quantitative aspect to information science (with a base in thermodynamics) and the qualitative to semiotics (with a base in Peirce's philosophy of qualia and feeling).
>
> Cheers
>
> Søren Brier
>
> -----Original Message-----
> From: fis-bounces@listas.unizar.es [mailto:fis-bounces@listas.unizar.es] On behalf of mjs
> Sent: 2 September 2005 14:25
> To: fis@listas.unizar.es
> Subject: Re: [Fis] Re: What is the definition of information?
>
> Dear FIS Friends,
> I am delighted to observe the development of the discussion
> of a topic belonging to the very center of my research
> interests: "What is information?" Sayd, thank you very much
> for revisiting this important topic.
>
> Due to a sudden illness I missed the opportunity to
> present my views on exactly this topic during FIS 2005
> in Paris. I had to cancel my presentation, and you can
> imagine how much I now regret that I did not have the
> chance to present my own definition of information and its
> philosophical background. I have to be content that the
> paper which I was going to present ("Philosophical
> Foundations for the Concept of Information: Selective and
> Structural Information") is available in the Proceedings of
> FIS2005 (http://www.mdpi.org/fis2005/F.58.paper.pdf).
>
> However, in this message I do not want to talk about my
> definition (although I would be very happy if you decided to
> take a look at it in my paper for the conference). I would
> like to make some suggestions regarding the discussion and
> to add my two pennies.
>
> I have already benefited from reading the thread of messages
> and I am grateful for some interesting ideas. However, I
> believe that I could benefit more if I could be sure that I
> understood these ideas well, but frequently I feel lost.
> I am afraid there is a lot of confusion about what exactly we
> want to achieve. Let me elaborate on the problems which I
> have encountered.
>
> 1. The initial question already calls for some
> clarification: "What is THE definition of information?"
> It looks like it automatically assumes that there is
> one "good", "proper", or "best" definition - "the definition
> of information". However, there is no reason to expect that
> there is such a unique definition. Since I myself proposed a
> definition which differs essentially from other definitions,
> apparently I have had some reason to believe that my
> definition is better than those published before.
> I agree that there are some criteria which can be used to
> compare definitions of information, but I do not believe
> that a unique answer to the question "What is information?"
> is possible without any reference to pragmatics, for
> instance to the context in which the concept is going to be
> used. After all, we define concepts to explain what we mean
> by them. When somebody gives me a logically correct
> definition of some concept, I cannot say "It is wrong", but
> only "I am not interested in this concept", or "I do
> not believe this definition can be applied to what we agreed
> is the denotation of the concept."
> There are some cases when a definition can be criticized
> for incompatibility with a commonly agreed hierarchy of
> generality of concepts. This may sound incomprehensible, so
> I owe you some explanation.
>
> 2. I assume that we are looking for the definition of the
> CONCEPT of information. Thus, we are not talking about the
> related but different issue of an operational definition of
> some magnitude, i.e., the question of how to measure something.
> A simple example: it would be an error to say that temperature
> IS the position of the column of mercury in a thermometer.
> Yes, we measure temperature by measuring the position of the
> top of the column, but without a theory of the thermal
> expansion of liquids, this measurement is irrelevant to the
> question of what temperature is. Only after we have a
> definition of the concept of temperature can we say that the
> thermometer measures temperature. Thus, I have some
> objections to Stan's definition "any constraint on entropy
> production". Yes, I know he qualified his definition
> as "functional" (probably similar to what I call
> "operational"), but later he writes "My (2) above can be
> associated with meanings". I do not understand this.
> If we want to define a concept (to present its meaning),
> we have to put this concept in the framework of other
> concepts which are already defined (agreed upon), or which
> are considered primitive concepts and which are characterized
> axiomatically (as, for instance, the concept of "set"). The
> latter case arises when for a given concept there is no more
> general concept.
> I do not believe that the concept of information is so
> general, so at the moment I will not discuss the possibility
> of considering it a primitive concept.
>
> 3. I would also like to address the delicate issue of
> logical correctness. Many of you may react that this is a
> waste of time on trivialities, but I believe it is not.
> In my extensive collection of various published attempts to
> define information, the majority of entries are simply
> incorrect.
> Sometimes they have been formulated in an incorrect manner
> (possibly intentionally) to express desperation at the
> difficulty of the attempt to define such a general concept.
> A good example is Searle's definition, which could be used in
> an introductory logic textbook as an instance of the error
> of "circular definition" (hereafter no reference means that
> references can be found in my paper mentioned
> above): "Information is anything that we can count or use as
> information."
> This brings us to the first rule of a logically correct
> definition:
>
> In the definiens (that which defines) no reference can be
> made to the definiendum (that which is defined).
>
> 4. The rule should be expanded to cases where the
> reference is indirect, that is, when a concept in the
> definiens is defined somewhere else using the definiendum.
> This can happen when we put together statements coming from
> different publications. Floridi gives the following example
> (Entropy 2003, 5, 125-145) of excerpts from different
> publications which can easily lead to a "vicious circle":
> "Information is data that has been processed into a form
> that is meaningful to the recipient. [Davis & Olson 1985]
> Data is the raw material that is processed and refined to
> generate information. [Silver & Silver 1989]
> Information equals data plus meaning. [Checkland & Scholes
> 1990]
> Information is data that have been interpreted and
> understood by the recipient of the message.
> Data will need to be interpreted or manipulated [to] become
> information. [Warner 1996]"
>
> In our discussion Søren mentioned, in response to Shu-Kun's
> definition "Information is the amount of data compressed",
> that the definition requires a definition of data. I agree
> that this is a very important point, as frequently data are
> (I am writing "are" as I am attached to the good old style of
> the singular form of this noun, "datum") defined as a special
> type of information. If we define data as something (WHAT?)
> that can be recorded in a newspaper, book, magnetic memory,
> etc., the definition becomes very, very narrow.
>
> However, I have an additional objection to Shu-Kun's
> definition. I do not understand in what sense information is
> an AMOUNT of something. So, is the ontological status of
> information secondary to the status of data? Information
> does not exist, but is only a number which describes how
> much the data have been compressed, or, as I suspect was Shu-
> Kun's intention, to what amount the data have been
> compressed? This seems to me very difficult to understand,
> and even harder to accept.
>
> 5. There are many examples of confusion caused by the
> problem of distinguishing the quantitative characteristics
> of the concept of information (measures of magnitudes) from
> the concept itself. A similar confusion can be found when
> someone identifies information with entropy. There are many
> possible ways to characterize information quantitatively.
> Tsallis (Entropic Nonextensivity: A Possible Measure of
> Complexity) writes about twenty different generalizations of
> entropy. I could add half a dozen multi-parameter
> generalizations of entropy developed in the seventies and
> eighties. This means more than two dozen families of
> entropies, each with an infinite, uncountable number of
> members. And each author of a new generalization has good
> reasons to claim the importance of that creation.
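>
> For concreteness, here is one member of one such family, the
> Tsallis entropy, in a minimal Python sketch of my own (for
> illustration only); it recovers the ordinary Shannon entropy
> in the limit q -> 1:
>
>   import math
>
>   def tsallis_entropy(p, q):
>       # S_q = (1 - sum_i p_i**q) / (q - 1), with k_B = 1; as
>       # q -> 1 this tends to the Shannon entropy
>       # -sum_i p_i ln p_i.
>       if q == 1.0:
>           return -sum(pi * math.log(pi) for pi in p if pi > 0)
>       return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
>
>   p = [0.5, 0.25, 0.25]
>   for q in (0.5, 1.0, 2.0):
>       print(q, tsallis_entropy(p, q))  # a different value for each q
>
> Every value of the parameter q assigns a different "entropy" to
> the same distribution, which is exactly the point: a measure by
> itself does not tell us what information is.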
>
> On the other hand, I believe that entropy does not measure
> information, but rather the capacity of a system to
> receive/increase information, and that for information in a
> system we should use an alternative measure; the value of
> this measure can be associated with the familiar magnitude of
> relative entropy, or the Kullback-Leibler distance from the
> uniform distribution, but it can be introduced independently
> of any other concepts (see Entropy 2004, 6, 388-412 at
> http://www.mdpi.org/entropy/papers/e6050388.pdf/). For our
> present discussion it is important that there are many
> different categories of measures of information, each with
> an infinite number of members.
> Thus, even if we ignore all logical and ontological reasons,
> defining information in terms of entropy or any other
> measure leads to the question: what is the difference between
> information defined in terms of different measures?
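>
> To make the distinction concrete, the relative-entropy version
> of this idea fits in a few lines (a rough Python sketch of the
> idea only, not the formal construction of the Entropy 2004
> paper):
>
>   import math
>
>   def information_from_uniform(p):
>       # D_KL(p || uniform) = log2(n) - H(p), in bits: log2(n)
>       # is the capacity of an n-state system, and the entropy
>       # H(p) is the part of that capacity not yet filled.
>       n = len(p)
>       h = -sum(pi * math.log2(pi) for pi in p if pi > 0)
>       return math.log2(n) - h
>
>   print(information_from_uniform([0.25] * 4))      # 0.0 bits
>   print(information_from_uniform([1.0, 0, 0, 0]))  # 2.0 bits
>
> The uniform distribution carries no information and maximal
> capacity; the fully concentrated one has converted all of its
> capacity into information.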
>
> 6. Let's look now at the most popular "definition" of
> information, which, although not introduced directly in the
> famous book of Shannon and Weaver, has definitely been
> derived from the correspondence of entropy and uncertainty
> described there:
> "Information is reduction/resolution of uncertainty."
> There are two generic meanings of the word "uncertainty"
> known to me. One is the "lack of information"; the other is
> a psychological state of distress caused by lack of
> information. In either case the definition clearly violates
> the first rule, that the definiens MUST NOT refer to the
> definiendum.
> Yes, the definition can be saved by considering uncertainty
> as a psychological state described in purely behavioral
> terms, but then its denotation becomes very narrow.
> As a digression, it is interesting that this
> famous "definition" is most frequently used by those who,
> following Shannon, declare absolute disinterest in the
> meaning of information, as it is not a subject of concern
> for engineers. But it is quite clear that the psychological
> interpretation cannot be applied in this case.
> Imagine a young man who proposes to a young lady by e-mail:
> "Would you marry me?" His level of uncertainty is very high.
> He gets the response "Gwulp brdean hrwut." The message is
> definitely carrying some information (we can compute its
> entropy, so we can measure the amount of information
> received). Do you think the young man's uncertainty is reduced?
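> The engineer's computation can even be spelled out; a minimal
> Python sketch of the empirical per-character entropy (blind,
> of course, to meaning):
>
>   import math
>   from collections import Counter
>
>   def char_entropy_bits(msg):
>       # Empirical Shannon entropy of the character frequencies,
>       # in bits per character; meaning plays no role whatsoever.
>       counts = Counter(msg)
>       n = len(msg)
>       return -sum(c / n * math.log2(c / n) for c in counts.values())
>
>   print(char_entropy_bits("Gwulp brdean hrwut."))  # a positive
>                                                    # amount of "information"
>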
> Coming back to the "definition" using the concept of
> uncertainty in the definiens: someone may object that this
> word has a different, non-generic meaning here. Then it has
> to be provided, and in its definition no reference to
> information can be made. I wish good luck to those who want
> to pursue this way of defining information, but I have to
> confess my scepticism about their success.
> But with this we arrive at another logical issue.
>
> 7. Although for stylistic reasons the second rule (below) is
> not always applied in its strict form, I would expect that a
> logically correct definition should have the classical genus-
> differentia form, or at least should be equivalent to a
> definition in such a form:
> "A is B, such that C"
> where A is the definiendum (what is defined), B is a genus of
> A, i.e. a concept more general than A which has been defined
> earlier or is already known, and C stands not for a concept
> but for a statement of how A differs from all other species
> of B (i.e. all other concepts less general than B).
> For instance, in "Man is a two-legged animal without feathers":
> A = "man"
> B = "two-legged animal"
> C = "without feathers"
> This is the famous example of a definition given to students
> in Plato's Academy as an exercise in how to define concepts.
> It was made famous by Diogenes, who the next day brought to
> the Academy a chicken plucked of its feathers.
>
> Isn't all of this too obvious to be brought to your
> attention? Believe me, it is not. We are just arriving at the
> nucleus of the problem of defining information.
>
> 8. The classical genus-differentia definition places the
> concept defined in the framework of other concepts in a very
> specific way. The necessary (and usually difficult) step is
> to find a genus for the concept, and the more general the
> concept to be defined, the more difficult the task of
> finding a genus for it.
> This is the reason why so many definitions of information,
> though logically correct, are not very convincing.
> Consider the definition (not entirely logically correct)
> from the influential book by Fred Dretske, "Knowledge
> and the Flow of Information":
> "Roughly speaking, information is that commodity capable of
> yielding knowledge, and what information a signal carries
> is what we learn from it."
> Poor Dretske must have spent a lot of time thinking about
> what to write before he put the word "commodity" on paper.
> The first reason why this definition is not acceptable to me
> is this bizarre choice of the genus for information. I do not
> say that Dretske could not use the concept of commodity, but
> it just makes his definition uninteresting.
> But there is another reason why many perfectly correct
> definitions lose my interest, and why I am not interested in
> pursuing Dretske's line of reasoning about the meaning of
> the concept of information. Dretske uses in his
> definiens the concept of knowledge, as if knowledge were a
> concept prior to information. This makes his
> definition as narrow as the psychological version
> of the "uncertainty definition". I do not believe that
> knowledge can be defined without any reference to
> information. Thus Dretske's definition is doomed to become
> circular, or trivial.
>
> 9. For me the concept of information is so interesting
> because it describes an entity which can be found in many
> domains of our inquiry. Thus all definitions which restrict
> the domain of denotation to only a few instances of
> information, no matter how logically correct, are just not
> interesting.
> I need a tool of great power: a definition which refers to
> all phenomena identified as involving information in
> communication, thermodynamics, computer science, genetics,
> cognitive science, etc.
> If a definition refers directly to a very narrow domain, as
> for instance when information is restricted to psychology,
> it is just not very interesting. For me it is not much
> better than using what Webster's Dictionary can tell us about
> information.
> This applies even to some cases where the concept considered
> is of independent interest. For instance, Fisher's
> information is interesting to me as a potential special
> instance of a more general concept. Actually, Fisher never
> clearly defined the concept of information. He only
> considered a measure of the information in a sample (quite
> likely understood by him as a source of knowledge) about the
> value of a parameter in the population, provided the first
> moment exists. Thus, it is difficult to place Fisher's
> information among the definitions of the concept of
> information, even if the work of Fisher is interesting in
> itself.
>
> 10. In this discussion Bateson's definition of information
> as "a difference that makes a difference" has been recalled.
> It has achieved the status of a celebrity, right next
> to the "uncertainty definition". It is charming, seductive.
> I can imagine that it has special appeal for those who like
> an esoteric style of conversation. A little jewel, like a Zen
> koan or a haiku. I can imagine exalted pseudo-intellectuals
> saying "It tells me so much about information..."
> But actually it tells us nothing. Its popularity is based on
> its open-endedness. You can put into it whatever you want,
> and it looks like it works. Actually it doesn't.
> There is a definite difference between beer and water. It
> makes a big difference to me whether the waiter brings me one
> or the other to the table. But it is not information.
> More seriously: the appeal of Bateson's "definition" comes from:
> - the logically incorrect use of a relative noun without its
> referents. A difference between what? Whatever you like? I
> remember how much I enjoyed elementary-school jokes such
> as "The bird has one leg from the other". I could laugh many
> times at the same sentence, tasting its absurdity. But now I
> am an adult, and after grading thousands of homework
> assignments of my students with similarly curious sentences
> written without any intention to entertain me, it is not
> funny anymore.
> - Bateson is using the English idiom "to make a
> difference". Try to translate his definition into another
> language and you will see how much of its appeal it loses.
> In my mother tongue (Polish) it becomes dull and shallow.
>
> To do full justice to Bateson's "definition": it has one
> good aspect. It refers to the concept of "difference", which
> points to the concept of variety. And this path leads to
> philosophical foundations which are, in my opinion,
> appropriate for the concept of information.
> The issue of such philosophical foundations is of great
> importance for an interesting definition of information.
>
> 11. Why do so many attempts to define information seem
> questionable?
> Let me explain my opinion. In many attempts the concept of
> information was placed, through the definition, in a
> framework of other concepts which were supposed to be
> simple, easy, intuitive. Especially the last qualification is
> dangerous. It is an illusion that a definition referring to
> intuitively obvious terms is the best and easiest to
> understand. Only someone very naive can believe that a
> concept of such great generality as information can be put
> into a framework of simple, intuitive, daily-life concepts.
> To have an interesting general concept of information we
> should look for a framework within an extensive, rich
> philosophical tradition. There have been some attempts to do
> this. The whole direction of defining the concept of
> information in terms of "form" was based on this assumption,
> although in some cases the motivation was simply the
> etymology of the word "information". It has its source in an
> old idea in the Aristotelian tradition, the concept of
> substance: form informs matter, and matter materializes form,
> to become substance.
> I do not want to go too far in talking about this direction
> (see for instance P. Young, The Nature of Information, 1987),
> but for the reasons which I explained in my FIS 2005 paper,
> I believe another approach is better.
> For this discussion, I would like to express my deep
> conviction that the general concept of information requires
> for its foundations an appropriate, rich philosophical
> tradition with a developed conceptual framework.
> I myself have selected probably the oldest philosophical
> tradition of Europe and Asia: the study of the one-many
> relation. Someone not satisfied with my definition can look
> for different foundations and a different use of them. But I
> believe that as long as information is defined in terms
> of "uncertainty" or similar concepts which are just simple,
> easy, obvious, the outcomes are not likely to be
> interesting.
> Marcin
>
> Marcin J. Schroeder, Ph.D.
> Dean of Academic Affairs and
> Professor
> Akita International University
> mjs@aiu.ac.jp
>
>
>
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis