Re: [Fis] Entropy and information

From: Igor Rojdestvenski <[email protected]>
Date: Tue 06 Apr 2004 - 18:12:08 CEST

Dear FISers,

Sorry for my rather sporadic participation, mainly due to extensive travelling
over the last year. As a physicist, however, I cannot abstain from taking
part in the discussion of information and entropy. What follows is a set of
somewhat disjoint statements which I humbly offer for discussion in the
context of the current topic.

Linguistic discourse

1. In-formation as a word refers to assigning or creating a form (shape).
Thus, information concerns a violation of the uniformity (uni-formity,
shapelessness) of a system. An absolutely uniform space-time offers no means
of measuring distance (coordinates) or time (intervals). Hence, in such a
uniform space-time, space and time effectively vanish.

Physical discourse

2. Any information is necessarily connected with interaction. These
interactions can be divided into two main types.
3. The first type, termed here dynamic (D) interactions, causes
non-uniformity in the system. An example of such an interaction is the
ordering of spins in a magnetic crystal when a magnetic field is applied.
Here the created non-uniformity is anisotropy, the existence of a preferred
direction.
4. The second type, termed thermodynamic (T) interactions, on the contrary,
increases the uniformity of the system. An example is the interaction of two
objects of different temperatures placed in direct contact so that, in the
course of time, their temperatures equilibrate.
5. D-interactions create (convey) information. As entropy is a measure of
the "uniformity" of a system, such interactions consequently decrease
entropy.
6. T-interactions create (increase) entropy, increasing the uniformity of
the system. Thus, the system of two objects in direct contact becomes more
uniform as their temperatures equilibrate.
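
For concreteness, here is a minimal numerical sketch of such a T-interaction
(my own illustration in Python; the heat capacity and temperatures are
arbitrary, and the two bodies are assumed identical): the bodies equilibrate
at the mean temperature, and the total entropy change is non-negative.

    import math

    C = 1.0                  # heat capacity of each body (arbitrary units)
    T1, T2 = 300.0, 400.0    # initial temperatures (kelvin)
    Tf = (T1 + T2) / 2.0     # common final temperature after contact

    # dS = C*ln(Tf/T1) + C*ln(Tf/T2) = C*ln(Tf^2/(T1*T2)) >= 0
    dS = C * math.log(Tf ** 2 / (T1 * T2))
    print(f"entropy change dS = {dS:.5f} (zero only if T1 == T2)")
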
7. The paradox of statistical mechanics is that the elementary
D-interactions of molecules in a macroscopic object add up to an integral
T-interaction (thermal equilibration with growth of entropy) in that
object.
8. In my opinion, to deal with this paradox it is important to introduce the
concept of contextual tokenization of information.
9. Any information can be perceived only in a context. Tokenization is a
process of information perception by a finite context, conscious or not.
10. Tokenization is a "method" of compressing information (turning data into
knowledge). An example of tokenization is human language, which is typically
based on a large but finite dictionary. The words are tokens: in a text they
replace the expanded explanations of their meanings.
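
A toy Python sketch of such dictionary-based tokenization (my own
illustration; the sentence is arbitrary): each distinct word becomes a short
token, and the text is rewritten as a stream of tokens.

    def tokenize(text):
        dictionary = {}   # word -> integer token
        stream = []       # the text rewritten as tokens
        for word in text.split():
            if word not in dictionary:
                dictionary[word] = len(dictionary)
            stream.append(dictionary[word])
        return dictionary, stream

    dictionary, stream = tokenize("the cat sat on the mat the cat slept")
    print(dictionary)  # {'the': 0, 'cat': 1, 'sat': 2, 'on': 3, 'mat': 4, 'slept': 5}
    print(stream)      # [0, 1, 2, 3, 0, 4, 0, 1, 5]
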
11. Tokenization is where the main distinction between information,
complexity and entropy lies. To illustrate, let us compare two uniform
systems: a vessel of gas and a crystal. Both possess some sort of uniformity
and hence low information content. Indeed, in the gas vessel each small
volume is macroscopically indistinguishable from the others; in an (ideal)
crystal, each elementary cell is indistinguishable from the others. Yet the
crystal is a system of low entropy, while the gas is one of high entropy.
The resolution is that, microscopically, the crystal can be tokenized into a
description of one elementary cell plus the number of repetitions in three
directions. In the gas, however, all the elementary volumes are
microscopically different, and it is IN PRINCIPLE impossible to describe
each of them precisely. Hence tokenization is impossible and entropy is
high.
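
A rough computational analogue (my own sketch, using zlib compression as a
stand-in for tokenization): a periodic byte string (the "crystal") shrinks
to little more than one cell plus bookkeeping, while a random string (the
"gas") of the same length is essentially incompressible.

    import os
    import zlib

    n = 10_000
    crystal = b"UNIT-CELL " * (n // 10)  # one elementary cell, repeated
    gas = os.urandom(n)                  # microscopically all different

    print(len(zlib.compress(crystal)))   # tiny: cell + repeat count, in effect
    print(len(zlib.compress(gas)))       # about n bytes: tokenization fails
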
12. Let us now discuss the information content of both systems. In the
crystal the information content is low, as the tokenization is "poor",
containing only the elementary cell structure and the number of repeats. A
linguistic analogy is a text in which the same word is repeated many times
and no other words are used.
13. In the gas vessel the information content is also low, but not because
of a poor "tokenization" language: the system being "random", tokenization
fails altogether. A linguistic analogy is a text in which each word appears
only once, so that deciphering it (without a dictionary) becomes
impossible.
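
The two linguistic analogies can be put in numbers (my own sketch): the
per-word Shannon entropy of a one-word text is zero, while a text of N
distinct words, each used once, reaches the maximum of log2(N) bits.

    import math
    from collections import Counter

    def word_entropy(words):
        counts = Counter(words)
        total = len(words)
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    crystal_text = ["cell"] * 16             # one token, repeated
    gas_text = [f"w{i}" for i in range(16)]  # every token unique

    print(word_entropy(crystal_text))  # 0.0 bits per word
    print(word_entropy(gas_text))      # 4.0 bits per word (= log2(16))
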
14. Tokenization necessarily implies that the tokens repeat, and that their
number is "tractable" by the context (the receiver).
15. Hence there are two types of uniformity, both with low information
content. One is highly entropic (an infinite tokenization dictionary); the
other has low entropy (a small tokenization dictionary).
16. In between these two situations there should be systems of high
information content, i.e. systems that allow a large but finite
tokenization.
17. In the "crystal" case the token dictionary is small, but the
description in the "tokens" is large. In the "gas" case the token dictionary
is large but each token is used in the system only once.
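
A back-of-the-envelope count for these two extremes (my own sketch; the
system size N and the 64 bits per dictionary entry are arbitrary choices):
in the "crystal" the whole description collapses to one entry plus a repeat
count, while in the "gas" the dictionary is the full description, so
tokenization buys nothing.

    import math

    N, E = 1000, 64                             # sites, bits per dictionary entry
    crystal_bits = E + math.ceil(math.log2(N))  # one cell + repeat count
    gas_bits = N * E                            # every token used exactly once

    print(crystal_bits)  # 74 bits
    print(gas_bits)      # 64000 bits: no compression achieved
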
18. We may speculate that if a token is used only once, then it is not a
true token. Indeed, in human language words always refer to classes, i.e.
describe more than one object. A token is, in fact, a classification
factor.
19. A few stable D-interactions represent the tokens of the system; hence
they convey information. Unstable D-interactions (those whose description is
divergingly complex, e.g. strange attractors), or a divergingly large number
of them, produce divergent tokenization and constitute T-interactions.
20. A shape is a token too; hence the linguistic discourse above...

Thanks for the labour of reading this rather eclectic passage. I hope some
of the thoughts presented here might trigger discussion.

Yours, Igor
