So you say. I disagree. The Merriam-Webster dictionary, not an especially erudite source, gives the following definitions:

John and Norbert: Certainly, it is no accident that the Shannon equation takes the same form as the Boltzmann equation. After all, one might say that what Shannon was doing was analyzing the statistical mechanics of communication channels. But that does not mean that information and thermodynamic entropy are the same thing. After all, it is no accident that the equations describing electromagnetic and mechanical waves have the same form, either. They are both WAVES. But just as light and water differ in many other respects, so too do information and thermodynamic entropy. The main way they differ is that information is intrinsically MEANINGFUL for some cognitive agent. To use the word "information" in a way that neglects its meaning is to misuse words, in my view, and to sow confusion.
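The formal parallel conceded above can be made explicit. As a sketch, using the standard textbook forms (these equations are not quoted from the exchange itself):

```latex
% Shannon entropy of a source with symbol probabilities p_i
% (in bits when the logarithm is base 2):
H = -\sum_i p_i \log_2 p_i

% Boltzmann--Gibbs entropy of a system with microstate probabilities p_i:
S = -k_B \sum_i p_i \ln p_i
```

The two expressions are identical up to the constant \(k_B\) and the base of the logarithm; the argument being made is that this formal identity does not, by itself, make the two quantities the same thing.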
Again, I must disagree. Finding the relation between your usage of information and mine is the only way to solve the problem. Mine is connected to thermodynamic entropy by the negentropy principle of information. Yours, as far as I can see, is grounded in meaning, and cannot be explained without explaining meaning, which requires explaining how type 2,b information can acquire meaning. So, you have to, in the end, explain the relation of meaning to negentropy, and thus to entropy.

Meaning is the heart of the matter. Or, more precisely, the VALUE that structures have for cognitive agents in light of their goals is the essence of information. So, value is the heart of the matter. Naturalizing value---explaining how purpose and meaning can come into existence out of a physical universe that is just minimizing energy---that is the crux of our problem. And conflating information and thermodynamic entropy merely sweeps this crucial problem under the rug. It does nothing to help solve it.
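The "negentropy principle of information" invoked here is Brillouin's. A minimal statement of it, in its standard form (an assumption supplied for context, not quoted from this thread):

```latex
% Brillouin's negentropy principle: acquiring \Delta I bits of
% information must be paid for by an entropy increase in the
% environment of at least
\Delta S \;\ge\; k_B \ln 2 \cdot \Delta I

% Equivalently, with negentropy defined as N = -S, each bit of
% information costs at least k_B \ln 2 of negentropy.
```

This is the bridge the writer relies on when claiming that his usage of "information" is already connected to thermodynamic entropy, and hence that the meaning-based usage must eventually be related to it as well.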
This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET