Is the sum (L) of information (I) and entropy (S) (L=I+S) ever conserved?

From: Dr. Shu-Kun Lin <[email protected]>
Date: Wed 15 May 2002 - 17:50:19 CEST

Dear FISers,

Lewis' remark that "gain of entropy means loss of information"
defines the relationship between entropy and information. If information I
can be converted into entropy S, then the quantity L = I + S should be
conserved. I use the letter L in memory of Lewis.

I have thought a lot along this line (see my recent paper at
http://www.mdpi.org/ijms/htm/i2010010/i2010010.htm). I am
pretty sure that L, I and S are all state functions. In thermodynamics,
and in physics in general, they can be defined simply as
L = E/T and I = G/T, where E is the total energy and G is
a potential energy, provided a temperature T can be formally defined.
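If G here is read as a free energy of the form G = E - TS (my assumption;
the paper linked above gives the precise definitions), the relation
L = I + S follows directly:

    G = E - TS                    (assumed free-energy relation)
    I = G/T = E/T - S = L - S
    =>  L = I + S,   with  S = (E - G)/T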

In many cases where a temperature cannot be defined, L, I and S can still
be defined. That is why L, I and S can be applied to areas other than physics;
a small numerical illustration of such a temperature-free reading is sketched below.
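One temperature-free reading (my own illustration, not necessarily the exact
definitions used in the paper above) takes L as the maximum entropy of a
system with N possible states, S as the actual Shannon entropy of the
distribution over those states, and I = L - S. In this reading L = log N is
fixed by the state space, so L = I + S is conserved as long as N does not change:

    import math

    def entropy(p):
        """Shannon entropy (in bits) of a probability distribution."""
        return -sum(x * math.log2(x) for x in p if x > 0)

    # A system with N = 4 possible states.
    N = 4
    L = math.log2(N)               # maximum entropy, fixed by the state space

    for p in ([1.0, 0.0, 0.0, 0.0],       # fully ordered: S = 0, I = L
              [0.5, 0.25, 0.125, 0.125],
              [0.25, 0.25, 0.25, 0.25]):  # fully mixed: S = L, I = 0
        S = entropy(p)
        I = L - S                  # information as the entropy still "missing"
        print(f"S = {S:.3f}  I = {I:.3f}  I + S = {I + S:.3f}  (L = {L:.3f})")

In this picture a gain of entropy exactly cancels a loss of information, which
is one way to read Lewis' remark; whether L remains conserved when the state
space itself changes seems to me part of the open question below.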

Under what conditions, if any, is L conserved? Any comments?

(Sorry for creating a new thread; I have found the discussions so
far very interesting. I am trying to understand the foundation
of information by finding its relation to many other parameters and
concepts, such as symmetry, similarity and, of course, entropy.)

Shu-Kun

--
Dr. Shu-Kun Lin
Molecular Diversity Preservation International (MDPI)
Matthaeusstrasse 11, CH-4057 Basel, Switzerland
Tel. +41 79 322 3379, fax +41 61 302 8918
e-mail: lin@mdpi.org
http://www.mdpi.org/lin
