Re: Replies to Juan, and John & Norbert

From: John Collier <[email protected]>
Date: Mon 03 Jun 2002 - 17:08:26 CEST
James A. Barham wrote:
John and Norbert:

Certainly, it is no accident that the Shannon equation takes the same
form as the Boltzmann equation. After all, one might say that what
Shannon was doing was analyzing the statistical mechanics of
communication channels. But that does not mean that information and
thermodynamic entropy are the same thing.  After all, it is no accident
that the equations describing electromagnetic and mechanical waves have
the same form, either. They are both WAVES.

But just as light and water differ in many other respects, so too do
information and thermodynamic entropy.  The main way they differ is that
information is intrinsically MEANINGFUL for some cognitive agent. To use
the word "information" in a way that neglects its meaning is to misuse
words, in my view, and to sow confusion.
So you say. I disagree. The Merriam-Webster dictionary, not an especially erudite source, gives the following definitions:

Main Entry: in·for·ma·tion
Pronunciation: "in-f&r-'mA-sh&n
Function: noun
Date: 14th century
1 : the communication or reception of knowledge or intelligence
2 a (1) : knowledge obtained from investigation, study, or instruction (2) : INTELLIGENCE, NEWS (3) : FACTS, DATA
  b : the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects
  c (1) : a signal or character (as in a communication system or computer) representing data (2) : something (as a message, experimental data, or a picture) which justifies change in a construct (as a plan or theory) that represents physical or mental experience or another construct
  d : a quantitative measure of the content of information; specifically : a numerical quantity that measures the uncertainty in the outcome of an experiment to be performed
3 : the act of informing against a person
4 : a formal accusation of a crime made by a prosecuting officer as distinguished from an indictment presented by a grand jury
- in·for·ma·tion·al /-shn&l, -sh&-n&l/ adjective
- in·for·ma·tion·al·ly adverb

Definition 2b does not fit your definition. It seems we are talking about different things that go by the same name. However, I can talk about yours, but you cannot talk about mine, since yours is a special case of mine and not vice versa. I must say that I do not take kindly to being ruled out of order by fiat, especially when it seems to go against at least one common usage. In any case, 2b is the sort of information I was talking about in my paper for this conference.

Meaning is the heart of the matter. Or, more precisely, the VALUE that
structures have for cognitive agents in light of their goals is the
essence of information. So, value is the heart of the matter.
Naturalizing value---explaining how purpose and meaning can come into
existence out of a physical universe that is just minimizing
energy---that is the crux of our problem.  And conflating information
and thermodynamic entropy merely sweeps this crucial problem under the
rug. It does nothing to help solve it.
  
Again, I must disagree. Finding the relation between your usage of information and mine is the only way to solve the problem. Mine is connected to thermodynamic entropy by the negentropy principle of information. Yours, as far as I can see, is grounded in meaning, and cannot be explained without explaining meaning, which requires explaining how type 2b information can acquire meaning. So, you have to, in the end, explain the relation of meaning to negentropy, and thus to entropy.
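For readers who want the formal identity spelled out, here is a minimal sketch (my own illustration, not anything from Barham's text or my paper; the numbers are arbitrary). Shannon's H = -Σ p_i log2(p_i) reduces to log2(W) when the W alternatives are equiprobable, which is the same functional form as Boltzmann's S = k ln(W), up to the constant k and the base of the logarithm — this shared form is what the negentropy principle trades on:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# With W equiprobable alternatives, H reduces to log2(W),
# matching the form of Boltzmann's S = k * ln(W) up to the
# constant k and the base of the logarithm.
W = 8
uniform = [1.0 / W] * W
assert abs(shannon_entropy(uniform) - math.log2(W)) < 1e-9

# A biased distribution over the same alternatives carries less
# uncertainty: an observer holding it is, in the type-2b sense,
# better informed about the outcome.
biased = [0.7, 0.1, 0.1, 0.1]
assert shannon_entropy(biased) < math.log2(len(biased))
```

None of this, of course, settles whether such a quantity is MEANINGFUL for an agent; it only shows why the two equations share a form.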

I would agree that conflating your notion of information with entropy would be a big mistake. Don't do it.

John

Received on Mon Jun 3 17:09:52 2002

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET