Please excuse my tardiness in joining the discussion initiated by
Michel. Perhaps when I am done with my idiosyncratic comments, several
of you will wish I had simply remained silent. :)
To begin with, most of us recall that Shannon named his measure "entropy"
after the humorous suggestion by von Neumann. Shannon had asked von
Neumann what to name the formula H = -Sum_i p_i log(p_i), and von Neumann
replied rather coyly that he should name it entropy, because (1) the
measure is formally the same as that used by Boltzmann in his statistical
mechanical treatment of irreversibility, and (2) "Nobody really
understands what entropy is, so you will be at an advantage in any
discussion!" It was a clever, mischievous response, but Shannon
apparently took him seriously, and it became a bad joke on all of us!
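(For concreteness, here is a minimal sketch of the measure in Python --
my illustration, not Shannon's notation -- using the usual conventions
that 0*log(0) = 0 and that base-2 logs give the answer in bits:

    import math

    def shannon_H(probs, base=2.0):
        """Shannon's measure: H = -sum_i p_i log(p_i), with 0*log(0) = 0."""
        if abs(sum(probs) - 1.0) > 1e-9:
            raise ValueError("probabilities must sum to 1")
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    print(shannon_H([0.5, 0.5]))   # fair coin   -> 1.0 bit
    print(shannon_H([0.9, 0.1]))   # biased coin -> ~0.47 bits

Whatever we end up calling it, the quantity itself is this simple.)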
I was educated as a chemical engineer, and one of the shibboleths of
our upbringing was that thermodynamics is a purely phenomenological and
macroscopic endeavor. Any student attempting to break this taboo paid a
drastic price for his/her impertinence. Hence, at any formal exam, if a
student were responding to a question in thermodynamics and chanced to
mention the term "atom" or "molecule," a virtual gong would sound, the
student would be interrupted and told to leave the room. He/she had
failed and would be banished to where there is weeping and gnashing of
teeth! :)
The reasoning was that thermodynamics rests on a phenomenological
foundation that is far more solid than any atomic "hypothesis". After all,
in 1824, when Sadi Carnot described the behavior of engines that pumped
water from mines, he did so from a purely phenomenological perspective.
That his conception of irreversibility happened to collide with the
prevailing notions of conservation and temporal symmetry (which Emmy
Noether later told us were equivalent) was of little consequence to an
engineer. But it put the physicists' view of the world in serious
jeopardy. For the next 30 to 50 years it was the atomic hypothesis, not
the second law (which, incidentally, was discovered before the first),
was truly at risk.
We all know that this conflict was finally resolved when Boltzmann and
Gibbs constructed statistical mechanics as a bridge between
thermodynamics and physics -- or at least we think we know. As Michel
suggested, it was an exceedingly narrow bridge, indeed! Consider the
following: The positivist notion of science is that one should be
continually trying to disprove hypotheses -- to subject them to all
manner of rigorous tests. Well, the atomic hypothesis was clearly at
risk, and what did we do? We discovered a minuscule range of phenomena
across which there was some correspondence between the two conflicting
disciplines, and we immediately called a halt to the dialogue! We
considered the hypothesis proven.
I would submit that there is more to these rantings than mere fancy on
my part. Consider that whenever a physicist is asked about
thermodynamics, the response is almost invariably from the perspective
of statistical mechanics. For most physicists, STATISTICAL MECHANICS *IS*
THERMODYNAMICS. Thermodynamics in its phenomenological guise continues
to pose a major challenge to physics. It is the blackberry seed in the
wisdom tooth of the physicist. The response is to envelop
thermodynamics within the safe cocoon of stat mech.
As Shu-Kun pointed out, entropy and information are opposites, and we
would do well not to confound them. It is possible to measure entropy
purely in terms of energetic changes. In fact, the variables of
thermodynamics are all naturally skewed towards the energetic. Most
constraints enter thermo via external boundaries (although some
constraints are embedded to an extent in state variables like the Gibbs
free energy).
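To be explicit about what I mean by "purely energetic": the classical
Clausius route defines entropy changes by reversible heat flows alone,
with no probabilities in sight (standard textbook fare, recalled here
only for contrast):

    dS = \frac{\delta Q_{\mathrm{rev}}}{T},
    \qquad
    \Delta S = \int_{1}^{2} \frac{\delta Q_{\mathrm{rev}}}{T}

One measures heats and temperatures at the boundary and never once
mentions a molecule.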
Information, on the other hand, is dominated by constraint. (In fact, I
argued earlier that information could be understood purely in terms of
constraint.) Not that it is always necessary to know the details of the
constraints explicitly. Just as one can phenomenologically measure
thermodynamical variables in ignorance of molecular movements, one can
also gauge the level of information by the effects of the constraints,
in abstraction from knowing exactly how those constraints are working.
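One crude way to make that concrete (my illustration only, reusing the
shannon_H sketch above, and admittedly Shannon-style bookkeeping rather
than anything deeper): impose a constraint and gauge the information by
the drop in uncertainty it effects, without ever modeling the
constraint's mechanism.

    # A die constrained to land on an even face: we measure the
    # constraint's effect (uncertainty drops by one bit) while
    # remaining ignorant of how the constraint is enforced.
    unconstrained = [1/6] * 6
    constrained   = [1/3] * 3        # faces 2, 4, 6 equally likely
    print(shannon_H(unconstrained) - shannon_H(constrained))   # -> 1.0 bit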
So I would conclude with a plea to respect the autonomy of
thermodynamics from statistical mechanics -- that whatever overlap
entropy might have with probabilistic notions pales in comparison with
its meaning in a physical, energetic sense. None of which is to
diminish the value and utility of Shannon's probabilistic measure,
which has been enormous. It's just a plea to call it something like
"uncertainty", "indeterminacy" or even (heaven forbid!) "complexity",
but at all costs to avoid calling it "entropy"!
Regards to all,
Bob
-------------------------------------------------------------------------
Robert E. Ulanowicz | Tel: (410) 326-7266
Chesapeake Biological Laboratory | FAX: (410) 326-7378
P.O. Box 38 | Email <ulan@cbl.umces.edu>
1 Williams Street | Web <http://www.cbl.umces.edu/~ulan>
Solomons, MD 20688-0038 |
--------------------------------------------------------------------------