Re: [Fis] Re: miscellanea / temperature / symmetry: Sent again: edited a little bit for long lines

From: Shu-Kun Lin <[email protected]>
Date: Sat 24 Apr 2004 - 11:24:39 CEST

My Dear Colleagues,

Sent again: edited a little bit for long lines. Because I am busy with
other business, I will be silent for a very long time. Thanks!

Resent to FIS list. Sent at 4/22/2004 11:25 PM

Dear Loet,

1. Information (or entropy) can be the same whether we paint black ink
on white paper or put white paint on black paper, and it might even
have nothing to do with symmetry. Is this what you mean when you say
that (1,1,2,2,3,3), (3,3,2,2,1,1), or other such data arrays have the
same value of entropy (or information)?

Symmetry has been a mathematical concept. However, I can quickly find many
facts showing that symmetric structures are more stable than other structures.
This observation suggests that higher symmetry may imply higher
entropy (and less information) for a static structure with a certain
geometry. One example is a gas chamber separated into two parts by a wall:
the symmetric state is the one where the pressure P is the same on both
sides, and the thermodynamic entropy is also highest when P is the same.
I can give you many other examples from thermodynamics and mechanics.
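
To illustrate this chamber example numerically, here is a minimal sketch
(the particle number N is my own illustrative choice, with Boltzmann's
constant set to 1 and natural logarithms):

# Entropy of N ideal-gas particles split between two chambers of equal volume.
# w(n) = C(N, n) counts the microstates with n particles on the left side;
# S(n) = ln w(n) peaks at n = N/2, the symmetric, equal-pressure state.
from math import comb, log

N = 100  # illustrative particle number
S = [(n, log(comb(N, n))) for n in range(N + 1)]
n_best, S_best = max(S, key=lambda pair: pair[1])
print(n_best, S_best)  # -> 50: the equal split (equal pressure) has the highest entropy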

2. All observed facts from many examples show that constraints define
a static structure (even for fluids). If a structure can be made more
symmetric, there is information in that structure which can be reduced.
An example, again, is the ideal gas in two parts.

If there is a series of static structures (with different symmetries)
whose entropy S and energy E values differ, we can plot S - E (S on the
x axis, E on the y axis). The slope is called temperature: in
thermodynamics E = TS + F (F being the free energy), so T = dE/dS is
exactly this slope. Because the slope can be negative, the temperature T
can be negative. For a gas, heating it increases both E and S, so T is
positive.
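
A minimal sketch of this slope definition, with invented S and E values
purely for illustration:

# "Temperature" as the fitted slope of E versus S for a series of structures.
# The S and E values below are invented for illustration only.
import numpy as np

S = np.array([1.0, 2.0, 3.0, 4.0])   # entropies of the structures
E = np.array([0.5, 1.5, 2.5, 3.5])   # corresponding energies
T, F = np.polyfit(S, E, 1)           # linear fit E = T*S + F
print(T)                             # positive slope: positive "temperature"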

3. Sometimes it is not necessary to introduce new concepts. For instance,
when we plot S - E (S on the x axis, E on the y axis), the x axis could
just as well be information (I). I agree with you that here "half empty"
and "half full" have the same meaning.

However, as I pointed out, introducing the concept of "symmetry"
into our consideration of information is useful because it can be
connected to stability. "Symmetry" is a preferred concept because it
has long served as a mathematical representation of (geometric) structures.

Not only can the well-known symmetry concept itself be introduced; more
concepts can follow from it:

---Because (static) information is accepted, if its loss is defined as
entropy we can introduce the concept of "static entropy". "Static entropy"
is not thermodynamic entropy, so it is better to add "static" before
"entropy".

Therefore, we have two sets of concepts:
static symmetry, static entropy, etc. and
dynamic symmetry, (dynamic) entropy, etc.
for us to characterize two kinds of systems or structures and their
stabilities.

---Because for some mechanical systems the symmetry (and S) is higher
when the energy is lower, for a series of such structures a graph of
S - E (S on the x axis, E on the y axis) can be produced whose slope is
negative. Therefore we have a negative temperature, well defined as this
slope. Of course it is not measured by a thermometer, and it is not a
thermodynamic temperature.
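
Continuing the same sketch as above, but with invented values where the
energy falls as the entropy rises, the fitted slope is negative:

# The same fit as before, but with energy decreasing as entropy increases:
# the slope, i.e. the "static temperature", comes out negative. Data invented.
import numpy as np

S = np.array([1.0, 2.0, 3.0, 4.0])
E = np.array([4.0, 3.0, 2.0, 1.0])
T, F = np.polyfit(S, E, 1)
print(T)                             # -> -1.0, a negative static temperature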

Both symmetry and entropy are macroscopic measures. They can be
calculated from the microscopic states, which are statistically
(probabilistically) distributed. The mixing of many microscopic states
(microstates) defines the macroscopic homogeneity, macroscopic
isotropy, etc., all of which are symmetries of a dynamic system. How many
microstates are there? If their number is denoted by w, then log w is the
entropy. This w can be called the symmetry number of a dynamic system.
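
A tiny sketch of this counting, again with an illustrative particle
number and natural logarithms:

# w = number of microstates, log w = entropy (Boltzmann constant set to 1).
# For the two-chamber gas, each of the N particles can sit left or right,
# so w = 2**N and S = N * ln 2.
from math import log

N = 100          # illustrative particle number
w = 2 ** N       # the "symmetry number" of the dynamic system
S = log(w)
print(w, S)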

By the way, please pay attention to our journal ENTROPY, where the very
first full paper is on the topic of temperature; see
http://www.mdpi.net/entropy/list99.htm or, more specifically, the
PDF file at http://www.mdpi.net/entropy/papers/e1010004.pdf.

Finally, I also agree with you that, while entropy is only a number,
symmetry in the form of group theory and matrix algebra can give a
more detailed description of the structure. However, a symmetry
number is enough to define relative stability.

Shannon's H (or Boltzmann's H) means information and also means
entropy. This makes discussion impossible in some cases. It is
better to use I for information and S for entropy. The transformation
between I and S is trivial (if we define L = S + I)
when we are talking about Shannon's H. However, we must get used to
this kind of definition of relations (a kind of transformation). This
good habit can be very useful: if a set of raw data H is actually a
sinusoid, H = sin t, in the time (t) domain, a chart 1000 km long is
not long enough to record it completely. This does not mean that this
H carries a tremendous amount of information. Through a simple Fourier
transform (a kind of pattern recognition) we may find it is a
single peak in the frequency domain, which can be recorded on a
small piece of paper. This very large amount of raw data can be
compressed into a small amount of data because of the periodicity of
the sinusoid, a kind of symmetry. Symmetry makes data compression
possible. That is why I prefer to define information (I) as the
compressed data.
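
A small sketch of this compression argument (the sample length, the
50 Hz frequency, and the use of numpy's FFT are only illustrative choices):

# A long sampled record of a sinusoid collapses, under the discrete Fourier
# transform, to essentially a single peak. Length and frequency are illustrative.
import numpy as np

t = np.linspace(0.0, 1.0, 10000, endpoint=False)   # 10,000 raw samples
H = np.sin(2 * np.pi * 50 * t)                      # a 50 Hz sinusoid

spectrum = np.abs(np.fft.rfft(H))
print(np.argmax(spectrum))   # -> 50: one frequency bin holds nearly everything
# 10,000 raw numbers reduce to one frequency and one amplitude; the
# periodicity (a symmetry) is what makes this compression possible.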

(Thanks, Guy!)

Best regards,
Shu-Kun

-- 
Dr. Shu-Kun Lin
Molecular Diversity Preservation International (MDPI)
Matthaeusstrasse 11, CH-4057 Basel, Switzerland
Tel. +41 61 683 7734 (office)
Tel. +41 79 322 3379 (mobile)
Fax +41 61 302 8918
E-mail: lin@mdpi.org
http://www.mdpi.org/lin/
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis