RE: [Fis] Szilard's Engine and Information

From: <[email protected]>
Date: Tue 06 Apr 2004 - 17:56:44 CEST

M. Devereux wrote:
> It's this time signal which causes the decrease
> in apparatus entropy, since it specifies the captured
> molecule's position. No heat is transferred to or from the
> apparatus at measurement, so it shows no change in Clausius'
> entropy, Delta Q / T. I'm unaware of any definition of
> entropy which explicitly includes a time dependence. Does
> anyone else know of such?

Although I am no physicist, this reminds me of a problem referred to as
'concept drift'. Consider a doctor trying to cure patients over time. He
observes that lard is associated with cardiac arrest, and advises his
patients accordingly. Because of this observation (and the advice that
follows from it), the association between lard and cardiac arrest may itself
change: something has changed at some point in time. To account for this
properly, we need to include another variable: does a patient know about the
association between lard and cardiac arrest? The patient's awareness of lard
thus affects the definition of the disease. I am sure that biologists,
sociologists and others will recognize this problem: what did the concept
"dog" mean in 1000 BC? What did it mean in 1000 AD? What does it mean now?

In natural science, however, time is usually taken to be a special concept.
Most of physics seems to be based on the hidden assumption that nothing
affects time, and that the laws are invariant with respect to time. (Another
such special, invariant concept is space. Constants, too, are considered to
be time-invariant. And so on.) Still, we may treat time like any other
variable and see what happens.

If X denotes the variables indicating the state of the system, the usual
informational entropy is H(X|t) -- the entropy of the state X at time point
t. How do we include time itself as a variable? The trouble is that to
compute the joint entropy one needs a joint probability density function,
and to obtain probabilities one needs to aggregate a number of observations
with the same variable value. It is therefore hard to compute the joint
entropy of state and time at a crisp instant. There are three ways of
addressing this problem:

1. We may give up some precision in time and treat an interval [t1,t2] as a
single value of time. If we then observe all the states within each
interval, we can obtain the probabilities P(X,[t1,t2]), P(X,[t2,t3]), etc.,
and from these the joint entropy H(X,T) := -Integrate[P(x,t) Log P(x,t)]
dx dt. (A small sketch of this computation follows the list.)

2. Alternatively, we may perform several identical experiments in parallel
(assuming that the outcome is independent of where the experiment takes
place, and that all the experiments are identically configured), and observe
the state Xi of system i at the same time t. From these we obtain P(X1,t),
P(X2,t), and so on.

3. We perform these identical experiments consecutively, starting at times
T1, T2, ..., and observe P(Xi, Ti+t). Here we assume identical starting
conditions and invariance of the concept with respect to the starting time.
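
To make the first approach concrete, here is a minimal Python sketch (my own
illustration under the assumptions above; names such as joint_entropy and
n_time_bins are invented for the example). It estimates H(X,T) from a list
of (state, time) observations by treating each time interval as a single
value of T and counting joint frequencies:

import math
from collections import Counter

def joint_entropy(observations, t_start, t_end, n_time_bins):
    """Estimate H(X,T) from (state, time) pairs by treating each
    time interval of equal width as one value of T (approach 1)."""
    width = (t_end - t_start) / n_time_bins
    counts = Counter()
    for x, t in observations:
        # index of the interval [t_start + k*width, t_start + (k+1)*width)
        k = min(int((t - t_start) / width), n_time_bins - 1)
        counts[(x, k)] += 1
    n = sum(counts.values())
    # H(X,T) = -sum over (x,k) of P(x,k) log P(x,k)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Example: a two-state system observed over the period [0, 10)
obs = [(0, 0.5), (1, 1.2), (0, 3.3), (1, 6.8), (1, 9.1)]
print(joint_entropy(obs, 0.0, 10.0, n_time_bins=5))

The same estimator serves approaches 2 and 3 as well; only the source of the
(state, time) pairs changes (parallel runs observed at the same t, or
consecutive runs indexed by the time elapsed since each start).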

I hope I am not reinventing the wheel, but maybe this path could be followed
to address your concern.

Best regards,

Aleks

---
mag. Aleks Jakulin
http://ai.fri.uni-lj.si/aleks/
Artificial Intelligence Laboratory, 
Faculty of Computer and Information Science, University of Ljubljana. 