Re: Again on physics, entropy and information

From: Dr. Shu-Kun Lin <[email protected]>
Date: Thu 30 May 2002 - 01:57:24 CEST

Dear Werner and other FISers,

What is observation? In Werner's words, perhaps I can put it as the
interaction between a sender (the system of concern) and the observer.
The job is recording the findings (the raw experimental facts).
I still believe information is a parameter for which there should be
no difference among different observers.

The difference lies not in the observer but in what comes afterwards --
the research (data reduction). In my opinion, research is just
"data compression". You cannot bring everything you observed to
the library. The best researcher is the best data compressor. A smart
compressor searches for all kinds of symmetries (periodicity, regularity --
the so-called natural laws, as we discussed -- repetition, homology, identity,
equality, homogeneity, equilibrium, etc.). Symmetries make data
compression possible. If someone wanted to prepare a PhD thesis
entitled "Information and the Beauty of Art" as an interdisciplinary
topic (call it art physics, much like the currently very hot finance physics),
I would suggest the student observe some post-impressionist paintings.
In many modern art museums you can always find a couple of
paintings that have nothing on them but the same pure white color
everywhere. Such a painting obviously has the highest symmetry. How much
information is in such a painting? Only 1 bit! The best data compression!
In the discussion and concluding remarks, the student may comment
that such paintings are the emperor's new clothes, etc.
Halley's Comet was observed in China and the data were recorded. However,
the ancient Chinese observers never did the data compression.
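The point that symmetry is what makes compression possible can be
illustrated with a small sketch (my addition, using Python's standard
zlib compressor, not anything from the original discussion): a
maximally symmetric "white painting" shrinks to almost nothing, while
symmetry-free noise does not shrink at all.

```python
import os
import zlib

# A "pure white painting": one byte value repeated everywhere.
# Total symmetry, so the compressor exploits the repetition.
white = bytes([255]) * 100_000

# Random noise: no symmetry, nothing for the compressor to exploit.
noise = os.urandom(100_000)

print(len(zlib.compress(white)))  # a few hundred bytes at most
print(len(zlib.compress(noise)))  # roughly the full 100,000 bytes
```

The compressed white "painting" is thousands of times smaller than the
original; the noise stays essentially the same size.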

Every day, theoretical physicists compress the data observed by
experimentalists.

If entropy is a property of a system itself, like mass, energy, and
momentum in physics, then information must also be a property
of a system itself, like mass, energy, and momentum in physics.
Entropy and information have the same kind of unit (the bit). If they obey
similar expressions (those of Boltzmann, Gibbs, Shannon, Jaynes, etc.) and also
"flow" together, how and why is "information flow" completely and fundamentally
different from "entropy flow"? Why is information
suddenly so different? These are questions for Werner.
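That entropy and information share the bit as a unit can be seen from
Shannon's expression H = -sum p log2 p, which is formally the same as
the Boltzmann-Gibbs entropy up to a constant. A minimal sketch (my
illustration, not part of Werner's or my original argument):

```python
import math

def shannon_entropy_bits(probs):
    """Shannon's expression H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy_bits([0.5, 0.5]))  # 1.0 bit: a fair coin
print(shannon_entropy_bits([1.0]))       # 0.0 bits: a certain outcome
```

The same formula applies whether one calls the quantity entropy or
information, which is exactly why the question of how the two could
"flow" differently arises.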

Shu-Kun

--
Dr. Shu-Kun Lin
Molecular Diversity Preservation International (MDPI)
Matthaeusstrasse 11, CH-4057 Basel, Switzerland
Tel. +41 79 322 3379, fax +41 61 302 8918
e-mail: lin@mdpi.org
http://www.mdpi.org/lin
