[Fis] entropy as average expected loss

From: <[email protected]>
Date: Tue 13 Apr 2004 - 21:48:19 CEST

Here I am condensing my replies.

* 1. Entropy fluctuations:

M. Devereux wrote:
> question is, where is the new information to run the engine,
> if the memory register always shows a different, but random,
> distribution of R and L? The answer is this: after the
> particular measurement which permits engine operation, the
> register repeats the same sequence of R and L, then repeats
> it again, and again, etc.
> I interpret this as a time signal. At any instant in time,
> there is nothing to distinguish one register output from
> another, but when the sequence is repeated, in time, the
> engine operator recognizes an improbable occurrence in time
> which actually indicates the operating fluid of the engine is
> compressed to a particular side of each cylinder, and can now
> do macroscopic work. I think this improbable occurrence in
> time indicates an entropy decrease. Does anyone else buy that
> argument? Or not?

Denis Evans has been working on the much-publicized "fluctuation theorem",
which says that entropy production fluctuates: "As the time or system size
increases (since Σ is extensive), the probability of observing an entropy
production opposite to that dictated by the second law of thermodynamics
decreases exponentially." http://en.wikipedia.org/wiki/Fluctuation_theorem
http://rsc.anu.edu.au/~evans/
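If I state it in symbols (following the standard Evans-Searles formulation;
here Σ̄_t is the time-averaged entropy production over time t, and the
notation is mine, not from the quote above):

```latex
\frac{\Pr(\bar{\Sigma}_t = A)}{\Pr(\bar{\Sigma}_t = -A)} = e^{A t}
```

so "anti-second-law" fluctuations are possible, but their probability falls
off exponentially in both the magnitude A and the observation time t.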

I buy your argument, but the fluctuation theorem seems to limit the utility
of a Szilard engine.

* 2. Entropy or entropies

Robert wrote:
> The reasoning was that thermodynamics rests on a
> phenomenological foundation that is far more solid than any
> atomic "hypothesis". After all, in 1820 when Sadi Carnot
> described the behavior of engines that pumped water from
> mines, he did so from a purely phenomenological perspective.

I agree with you that entropy is a terminological nightmare that will haunt
us until we sort it out.

Clausius' (thermodynamic) entropy is very different from Shannon's entropy.
Thermodynamic entropy is a narrow concept: something one can measure.
Shannon's entropy is instead a general one: a property of a probabilistic
model. Boltzmann's work was an attempt to explain phenomenological entropy
with a probabilistic model. Atomistics is *not* a hypothesis: it is an
attempt to explain the underlying phenomena through modelling and simulation
rather than by measurement. Nobody pretends that Boltzmann's model is the
ultimate one. It is just a paradigm (a model, not a measure). It was tested,
and the tests showed that the paradigm is not misguided.
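To make the "property of a probabilistic model" point concrete (and to
connect to the subject line): Shannon's entropy can be read as the average
expected loss, i.e. the expected surprisal -log2 p under the model itself.
A minimal sketch (my own illustration, not anyone's quoted definition):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Equivalently: the expected log-loss (surprisal) of the model p
    when the data are drawn from p itself. Zero-probability outcomes
    contribute nothing, so they are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so less entropy.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))        # 0.0
```

Note that nothing in this computation refers to heat, temperature, or
molecules: the entropy belongs to the probability model alone.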

My view is probably biased, but when someone mentions "entropy" today in
circles other than chemistry, thermodynamics or philosophy, the default
interpretation is no longer the thermodynamic one, but Shannon's. Unlike
some others on this list, I cannot even envision what thermodynamic
information could be.

What to do now? Begin an effort to use two different terms, withdrawing one
or the other? At the very least, everyone using the word "entropy" should be
aware that there are several definitions, sometimes connected and sometimes
disconnected. Who *owns* "entropy"?

* 3. Meaning

Pedro suggests that meaning is the invariant part of a concept (the
essence), while Igor considers meaning to be essentially an interaction
between a mind and the world (the grounding of a symbol). Both views are
consistent and useful: two necessary faces of the same concept. I'd take
them as complementary. When I say "the meaning of life", I'm invoking
Pedro's definition; when I ask "what did you mean?", I'm invoking Igor's.
The essence is about filtering the rule/model out of the data; the
grounding is about applying the rule/model to the data.

---
mag. Aleks Jakulin
http://www.ailab.si/aleks/
Artificial Intelligence Laboratory, 
Faculty of Computer and Information Science, University of Ljubljana. 
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
Received on Tue Apr 13 21:52:56 2004

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET