Dear FISers,
This is my first posting to this list. I am a mechanical engineer working on damage accumulation phenomena. My main research line is the stochastic modeling of cumulative damage processes. I was delighted by the comments posted by Shu-Kun on information and entropy. He suggested two laws of revised information theory:
"The first law of information theory: the logarithmic function L (the sum
of entropy and information) of an isolated system remains unchanged.
The second law of information theory: Information I of an isolated system
decreases to a minimum at equilibrium."
In my modest opinion, Dr. Lin is right! My assessment is based on the theory of finite random fields. Let m be any probability distribution on X (a finite set). The information gained by observing x (an element of X) is measured by the number -ln m(x). In information-theoretic terms, this is essentially the minimal number of yes-or-no questions we have to ask the observer in order to learn that x was observed (roughly speaking, when the logarithm is taken to base 2). Consider the entropy H(m) = - sum_{x in X} m(x) ln m(x) and the mean energy U(m) = sum_{x in X} m(x) E(x) of m, where E is an energy function on X. Then

H(m) - U(m) <= ln Z,

where Z = sum_{x in X} exp(-E(x)) is called the partition function. Equality holds exactly when m is the Gibbs field m(x) = exp(-E(x))/Z. This inequality is well known in statistical physics as the Gibbs variational principle. I think that Shu-Kun's ideas can be matched with this variational principle.
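To make the point concrete, here is a minimal numerical sketch in Python (the six-element set X and the energy values E are arbitrary, made-up assumptions chosen only for illustration). It checks that H(m) - U(m) stays below ln Z for randomly drawn distributions m, and that equality is attained at the Gibbs field:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical energy function E on a small finite set X (illustrative values).
    E = rng.uniform(0.0, 3.0, size=6)

    def entropy(m):
        # H(m) = - sum_x m(x) ln m(x)
        return -np.sum(m * np.log(m))

    def mean_energy(m, E):
        # U(m) = sum_x m(x) E(x)
        return np.sum(m * E)

    Z = np.sum(np.exp(-E))        # partition function
    log_Z = np.log(Z)
    gibbs = np.exp(-E) / Z        # Gibbs field m(x) = exp(-E(x)) / Z

    # Any probability distribution m on X satisfies H(m) - U(m) <= ln Z ...
    for _ in range(1000):
        m = rng.dirichlet(np.ones(len(E)))   # random distribution on X
        assert entropy(m) - mean_energy(m, E) <= log_Z + 1e-9

    # ... and the bound is attained exactly at the Gibbs field.
    print(entropy(gibbs) - mean_energy(gibbs, E), log_Z)

The two printed numbers coincide, which is just the equality case of the variational principle stated above.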
Best regards,
José Antonio
P.S.: I apologize for my poor English.
--------------------------------------------------------------------------------
Dr José Antonio Bea Cascarosa
Prof. Titular Area M. M. C. y T. Estructuras (Associate Professor, Civil Engineering)
Dpto. Ingeniería Mecánica
Campus Actur (Edificio Agustín de Betancourt)
María de Luna, s/n
50018 Zaragoza (Spain)
Phone: (+34) 976.76.10.00 Ext. 5113
Fax: (+34) 976.76.25.78
Email: jabea@posta.unizar.es Web: http://www.cps.unizar.es/deps/IngMec/mmcyte/