Dear Michael,
Thank you very much for your thoughtful paper, which I read from the
first line to the last. I would like to bring
to your attention our very recent special issue published in ENTROPY
on the topic "Quantum Limits to the Second Law of Thermodynamics"
at http://www.mdpi.net/entropy/list04.htm.
You are invited to contribute this paper, after revision, to ENTROPY
for consideration and publication on this related topic. Please send
your manuscript by e-mail to Michel (E-mail: entropy@mdpi.org,
petitjean@itodys.jussieu.fr).
Other colleagues: if you have a long piece, please send it to Michel
for possible publication. Michel is the Editor-in-Chief of ENTROPY.
As you know, Michel is the "Chair" of the ongoing session we are
attending at FIS right now.
Regarding the related entropy of mixing (Delta S): it is certain that
the entropy of mixing is an information-theoretical entropy, because
no heat is involved. It should not be taken as a typical
thermodynamic entropy (Delta S). The mixing of the two chiral gases R
and L that you mention cannot be a thermal process. Therefore, it is
not a thermodynamic process in a heat engine. Mixing of R and L
cannot be used to generate mechanical work. This is a fact. When we
discuss the engine and the related possibility of energy conversion,
this fact must be kept in mind.
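(A minimal numerical sketch of the textbook mixing calculation, assuming
two distinguishable ideal gases at equal temperature and pressure; note
that the result comes from counting configurations, with no Delta Q / T
heat term anywhere:

    import math

    R = 8.314  # gas constant, J/(mol K)
    n = 1.0    # moles of each gas (illustrative value)

    # Two distinguishable ideal gases, each expanding from V into the
    # combined volume 2V on mixing: Delta S = 2 n R ln 2.
    delta_S_mixing = 2 * n * R * math.log(2)  # ~11.5 J/K

The calculation involves probabilities only; no heat flows.)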
If the mixing of gas R and gas L could create work (a kind of
mechanical energy, calculated as force times distance), one should
also be able to create mechanical work by mixing red color with black
color.
(Pedro, I will also be silent until next week.)
Best regards,
Shu-Kun
Michael Devereux wrote:
> Dear Colleagues,
> I’ve attached a PDF version of this message for those whose browser
> doesn’t do a good job of reproducing text formatting.
> I am becoming more and more aware of the interest and help provided by
> other academic disciplines toward my understanding of information and
> entropy, and I appreciate Michel’s introduction in that regard. May I
> suggest a powerful model from physics for understanding the
> information-entropy relationship, at least as we physicists often use
> those terms in thermodynamics and measurement? (I was an experimental
> particle physicist once, at the Los Alamos meson factory, and then in
> Switzerland, working for the ETH on elementary particle experiments.
> I’ve taught at various universities in the U.S., and am now
> semi-retired. But, I’ve retained an abiding interest in quantum
> measurement, which has led to analysis of the Szilard engine and to
> calculation of the entropy cost of information processing. I expect I
> may return to full time work in some field related to quantum
> measurement or quantum information, if such an opportunity arrives.
> So, my experience and perspective are largely those of a practicing
> physicist.) I understand that this model won’t address many of those
> concepts enumerated by Michel.
> By far, the most effective and powerful physics model I know for
> investigating the relationship between thermodynamic entropy and
> information is the Szilard engine (Szilard, Z. Physik, 53, 1929, p.
> 840; Devereux, Found. Phys. Lett. 16, 1, 2003, p. 41, also available
> on the web from the Entropy server). Szilard discovered a very simple
> model that serves both as a heat engine and an information engine. It
> bridges the gap between a macroscopic measuring apparatus and a
> microscopic information bit. Can anyone suggest a better model from
> physics? I’ve found it extremely useful for understanding the cost, in
> terms of energy and entropy, of quantum measurement, as well as the
> thermodynamic entropy needed for information processing. Specifically,
> the “information” here is the location, from measurement, of a single
> gas molecule in one or another half of a macroscopic cylinder.
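> (As a minimal quantitative sketch of the cycle under the usual
> idealizations, a single molecule treated as an ideal gas and a
> perfectly isothermal expansion; the temperature is an assumed,
> illustrative value:
>
>     import math
>
>     k = 1.380649e-23  # Boltzmann constant, J/K
>     T = 300.0         # reservoir temperature, K (assumed)
>
>     # Measurement locates the molecule in one half of the cylinder.
>     # The one-molecule gas then expands isothermally from V/2 to V,
>     # doing work W = kT ln(V / (V/2)) = kT ln 2 against the piston.
>     W = k * T * math.log(2)          # ~2.87e-21 J per cycle
>
>     # The heat drawn from the reservoir equals W, so the reservoir
>     # loses entropy k ln 2 per cycle, exactly one bit's worth.
>     dS_reservoir = -k * math.log(2)  # ~-9.57e-24 J/K
>
> This is the bookkeeping that makes the engine look like a Second Law
> violation unless some step of the cycle generates at least k ln 2 of
> entropy.)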
> But, the fatal problem with the use of the Szilard engine model has
> been a mistaken analysis of the engine cycle. Szilard suggested that the
> apparatus for measuring location of the gas molecule within the engine
> cylinder must produce entropy at measurement, in order to protect the
> Second Law of Thermodynamics. For nearly seventy-five years now,
> scientists have elaborated, expanded, and formalized that idea. (Zurek
> in Frontiers of Nonequilibrium Statistical Physics, 1984, p. 151;
> Lubkin, Int. J. Theo. Phys. 26, 1987, pp. 523-535; Brillouin, J. App.
> Phys., 22, 3, 1951, p. 334; etc.)
> But, Szilard’s suggestion is mistaken, and quite obviously so, with a
> little careful consideration. In his model, after measurement of the
> location, either R or L, of the molecule in one side or the other of
> each of N cylinders of this engine, the memory register of the
> measurement apparatus will indicate, for example, (R, L, R, R, L,
> R,.....). That information is now fixed; it does not change over time
> with fluctuations of the thermodynamic system, as the engine’s gas
> molecule continues to collide with the cylinder. So, the
> information-dependent entropy of the apparatus is zero. This is
> analogous, for example, to an ideal gas with the position and momentum
> of each gas molecule fixed at specific values: that entropy is zero
> (as physicists calculate such things).
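> (In Shannon’s terms, a minimal sketch of why a fixed record carries
> zero entropy; the register size N is illustrative:
>
>     import math
>
>     N = 6  # number of engine cylinders (illustrative)
>
>     # A fixed, known record such as (R, L, R, R, L, R) is a single
>     # outcome with probability 1, so H = -sum(p log2 p) = 0 bits.
>     H_known = -1.0 * math.log2(1.0)  # = 0.0
>
>     # Without knowledge of the record, all 2^N strings are equally
>     # likely, and H = N bits: the entropy an ignorant external
>     # observer would assign to the register.
>     p = 1.0 / 2**N
>     H_unknown = -sum(p * math.log2(p) for _ in range(2**N))  # = N
>
> The demon, who knows the record, assigns the first value; an ignorant
> external observer assigns the second.)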
> Thus, the apparatus entropy could not have increased with measurement,
> and the Second Law is not protected from Maxwell’s demon by an
> apparatus entropy increase. Two philosophers of science, Earman and
> Norton, (Stud. Hist. Phil. Mod. Phys., Vol. 30, No. 1, pp. 1-40, 1999)
> published a quite extensive review of attempts to defend the Second
> Law from Maxwell’s demon with information-generated entropy, and found
> no credible affirmative arguments. (Zurek, in the article cited
> above, wrote that the measurement outcome is unknown to some external
> observer, and so, the apparatus entropy is seen to increase by that
> observer. But, the demon observer must know the measurement outcome
> for every cylinder, in order to run the engine, and so he finds zero
> apparatus entropy. As we all know, no scientific law, including the
> Second Law, is observer dependent. If the demon observes no increase
> in entropy, then all external observers must also find no increase.
> This was one of my arguments against Zurek’s analysis. Earman and
> Norton offered another. I think Zurek’s idea is similar to von
> Neumann’s notion that quantum measurements are only completed when
> they become conscious to a human observer.)
> I calculated the entropy change of the Szilard measurement apparatus
> (cited above). That entropy actually DECREASES by Nk ln(2) at
> measurement, indicating an apparatus (thermodynamic) information
> increase. Recall, however, from Maxwell’s original idea, that the
> demon operating such an engine not only measures the properties of the
> gas molecules, but must also move the cylinder partition. The Second
> Law would not be violated if such partition activation produced
> sufficient entropy. I believe this effect may be general for the
> usual type of Maxwell demon: it is partition movement (door closure,
> etc.) which generates the entropy prescribed by the Second Law, not
> the entropy associated with the information of measurement, in spite
> of the innumerable publications claiming otherwise.
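> (A one-line version of the measurement bookkeeping above, assuming
> the apparatus entropy tracks the accessible configuration volume,
> which measurement halves for each of the N cylinders:
>
>     import math
>
>     k = 1.380649e-23  # Boltzmann constant, J/K
>     N = 100           # cylinders (illustrative)
>
>     # Localizing each molecule to half its cylinder changes the
>     # entropy by N * k * ln(1/2) = -N k ln 2, a decrease.
>     dS_measurement = N * k * math.log(0.5)
>
>     # For the Second Law to survive a full cycle, partition movement
>     # (or some other step) must generate at least +N k ln 2 of
>     # compensating entropy.
>     dS_required = -dS_measurement
>
> This is only the balance sheet; the partition calculation itself must
> supply the mechanism.)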
> I’m somewhere near the middle of the calculation of the entropy
> generated by partition movement in Szilard’s engine. I think I can
> make that calculation quite general, and it appears, then, that I can
> use that result to determine the fundamental entropy cost of
> information processing. I mean what I’ve understood Charles Bennett
> and Rolf Landauer to mean by such things: entropy in terms of, say,
> the Gibbs formulation, and information processing as the logical
> manipulation of an information bit (erasure, overwriting, etc.). (The
> renowned philosopher of science, Karl Popper, thought that the Szilard
> engine could be operated without measurement, and thus, with no
> information transfer. If so, the engine might teach nothing about the
> relationship of entropy to information. But, Zurek displayed the
> quantum mechanical calculation for closure of the partition. The
> engine gas is not confined to half the cylinder by partition closure
> alone, but rather by the quantum measurement which follows closure.
> So, there must be information transfer to run the engine.) And, recall
> that the Szilard engine has been used by Charles Bennett and others as
> the model for information stored in a memory register. And, removal of
> the engine partition, followed by the measurement which reduces the
> quantum wave function, erases the information originally stored in the
> (engine) register (R,L,R,R,L,R,L,....).
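> (For concreteness, the bound that Landauer’s principle attaches to
> such erasure, sketched under the usual assumptions; temperature and
> register size are illustrative:
>
>     import math
>
>     k = 1.380649e-23  # Boltzmann constant, J/K
>     T = 300.0         # environment temperature, K (assumed)
>     N = 100           # bits in the register (illustrative)
>
>     # Erasing one bit raises the environment entropy by at least
>     # k ln 2, dissipating at least kT ln 2 of heat per bit.
>     dS_per_bit = k * math.log(2)     # ~9.57e-24 J/K
>     Q_min = N * k * T * math.log(2)  # ~2.87e-19 J for the register
>
> Bennett and Landauer charge the demon exactly this cost at memory
> erasure.)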
> Calculation of the entropy produced by partition closure isn’t quite
> as simple as I once assumed, though the results now seem to give the
> value I had anticipated. I do think it’s clear, in general, that
> prompt movement of a physical object, like the engine partition, will
> produce thermodynamic entropy. Consider the movement of the cylinder
> in a heat engine, for example. (No quasi-static processes allowed.)
> The calculation, so far, is incomplete, but it has shown something I
> find really remarkable. The specific information needed to run the
> Szilard heat engine is carried by a time signal, not by change in a
> macroscopic physical configuration of any type. It’s this time signal
> which causes the decrease in apparatus entropy, since it specifies the
> captured molecule’s position. No heat is transferred to or from the
> apparatus at measurement, so it shows no change in Clausius’ entropy,
> Delta Q / T. I’m unaware of any definition of entropy which explicitly
> includes a time dependence. Does anyone else know of such?
> I think there is, however, a change in entropy (and of thermodynamic
> information) as determined by the Shannon-von Neumann probability
> formulation. I suspect that, in determining the probabilities we
> assign to thermodynamic configurations, we must include those that are
> time-dependent. And, I’ve found that the entropy cost of partition
> movement also depends on the time employed for that movement.
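> (Purely as an illustration of the Gibbs, or Shannon-von Neumann, form
> with time-dependent probabilities; the relaxing distribution below is
> an assumed toy, not the engine calculation described above:
>
>     import math
>
>     k = 1.380649e-23  # Boltzmann constant, J/K
>
>     def gibbs_entropy(probs):
>         # S = -k * sum(p_i ln p_i) over configuration probabilities.
>         return -k * sum(p * math.log(p) for p in probs if p > 0)
>
>     # Toy distribution over two configurations, relaxing from (1, 0)
>     # toward (1/2, 1/2) with time constant tau.
>     def probs_at(t, tau=1.0):
>         p = 0.5 * (1.0 + math.exp(-t / tau))
>         return [p, 1.0 - p]
>
>     for t in (0.0, 0.5, 2.0, 10.0):
>         print(t, gibbs_entropy(probs_at(t)))  # rises from 0 to k ln 2
>
> Here the entropy is an explicit function of time through the
> probabilities.)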
> Thank you, Michel, for the introduction and direction for this
> discussion.
>
> Cordially,
>
> Michael Devereux
>
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis