Fwd: Re: [Fis] Re: What is the definition of information ?

From: John Collier <collierj@ukzn.ac.za>
Date: Fri 02 Sep 2005 - 21:18:05 CEST

I am sending this to the whole list, since I think the current discussion is losing perspective and starting to go in circles. History is important.

John

Professor John Collier
Philosophy, University of KwaZulu-Natal
Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292
F: +27 (31) 260 3031
email: collierj@ukzn.ac.za
http://ukzn.ac.za/undphil/collier
--------------------------------------------------------------------
Please find our disclaimer at http://www.ukzn.ac.za/disclaimer
--------------------------------------------------------------------

attached mail follows:


There is no universally accepted notion of information. The commonsense notion is highly equivocal, and there is probably no scientific notion that corresponds to it exactly. Two very general, virtually equivalent definitions of information are found in the following:

Information is commonly understood as knowledge or facts acquired or derived from, e.g., study, instruction or observation (Macmillan Contemporary Dictionary, 1979). On this notion, information is presumed to be both meaningful and veridical, and to have some appropriate connection to its object; it is concerned with representations and symbols in the most general sense (MacKay 1969). Information might be misleading, but it can never be false; deliberately misleading data is misinformation.

The scientific notion of information abstracts from the representational idea, and includes anything that could potentially serve as a source of information. The most fundamental notion of information, attributed to a number of different authors, is "a distinction that makes a difference" (MacKay 1969), or "a difference that makes a difference" (Bateson 1973: 428). Information theory, then, is fundamentally the rigorous study of distinctions and their relations, inasmuch as they make a difference.
http://www.ukzn.ac.za/undphil/collier/information/information.html

MacKay, Donald M., Information, Mechanism and Meaning. Cambridge, MA: MIT Press, 1969.
Bateson, G. (1973), Steps to an Ecology of Mind (Paladin. Frogmore, St. Albans).
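The "distinctions that make a difference" reading can be made quantitative: Shannon entropy counts the average number of binary distinctions needed to identify one symbol from a source. A minimal sketch in Python (the function name and the sample strings are mine, for illustration only; they are not drawn from the works cited):

```python
import math
from collections import Counter

def entropy_bits(symbols):
    """Shannon entropy of an observed symbol sequence, in bits.

    H = -sum(p * log2(p)) over the empirical distribution. It measures
    how many yes/no distinctions are needed, on average, to pick out
    one symbol -- a quantitative reading of "distinctions that make
    a difference".
    """
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

h_none = entropy_bits("aaaa")      # no distinctions: 0 bits per symbol
h_coin = entropy_bits("abababab")  # two equiprobable symbols: 1 bit per symbol
```

A sequence with no internal distinctions carries no information in this sense, while two equally likely alternatives require exactly one binary distinction per symbol.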

The above definitions are in line with your remarks. Shannon's approach does not define information itself, but the information capacity of a channel. A different approach, which does assign specific amounts of information to individual objects, is algorithmic complexity theory:

# Chaitin, Gregory J., Randomness and Mathematical Proof. Scientific American 232, No. 5 (May 1975): 47-52. http://www.cs.auckland.ac.nz/CDMTCS/chaitin/sciamer.html
# Chaitin, Gregory J., Algorithmic Information Theory. Cambridge: Cambridge University Press, 1987.
# Chaitin, Gregory J., Information, Randomness & Incompleteness. Singapore: World Scientific, 1987.
# Chaitin, Gregory J., The Limits of Mathematics. New York: Springer-Verlag, 1998. http://www.cs.auckland.ac.nz/CDMTCS/chaitin/rov.html
# Chaitin, Gregory J., Randomness & Complexity in Pure Mathematics. International Journal of Bifurcation and Chaos 4 (1994): 3-15. http://www.cs.auckland.ac.nz/CDMTCS/chaitin/ijbc.html
# Kolmogorov, A.N., Three Approaches to the Quantitative Definition of Information. Problems of Information Transmission 1 (1965): 1-7.
# Kolmogorov, A.N., Logical Basis for Information Theory and Probability Theory. IEEE Transactions on Information Theory 14 (1968): 662-664.
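Algorithmic (Kolmogorov-Chaitin) complexity itself is uncomputable, but the length of any lossless compression of a string is an upper bound on it, so compressed size is a common practical proxy. A hedged sketch in Python, using zlib purely for illustration (this proxy is my own example, not a method from the works listed above):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Upper bound on algorithmic information content, in bytes.

    Kolmogorov complexity is uncomputable, but any lossless compressor
    gives an upper bound: a string with a short description compresses
    well, while an algorithmically random string does not.
    """
    return len(zlib.compress(data, level=9))

regular = b"ab" * 500          # highly patterned: admits a short description
random_ish = os.urandom(1000)  # incompressible with overwhelming probability

# The patterned string needs far fewer bytes than the random one:
assert compressed_size(regular) < compressed_size(random_ish)
```

The gap between the two sizes is the point: both inputs are 1000 bytes long, yet the patterned one carries far less algorithmic information.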

In neither the Shannon nor the algorithmic case is meaning relevant to information. If you want to extend the theory to include semantic values, I would suggest that you consider information as a sign along the lines of Peircean semiotics. In this case a difference in information is a difference in a sign that makes a difference to its meaning.

I hope this helps.

John

Professor John Collier
Philosophy, University of KwaZulu-Natal
Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292
F: +27 (31) 260 3031
email: collierj@ukzn.ac.za
http://ukzn.ac.za/undphil/collier
>>> <Dupagement@aol.com> 08/29/05 2:24 PM >>>
I am searching for a critique of the definition of the term "information".
One definition I am using is: information is a thing whose only property is
conveying meaning. Thus a newspaper is not information, since it also has the
properties of being made of paper, of having certain dimensions, of bearing
print ink, and so on; it carries information. Shannon's theory bases its
definition of information on probability theory. For one thing to be more
probable than another it has to be different from it; for one unit to be
recognizable from another it has to have some differentness.
Can I get a critique of this? Citations and directions to a bibliography
will also be welcomed.
Syed Ali


_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
Received on Fri Sep 2 21:18:47 2005


This archive was generated by hypermail 2.1.8 on Fri 02 Sep 2005 - 21:18:47 CEST