[Fis] Entropy and information

From: Karl Javorszky <[email protected]>
Date: Sat 17 Apr 2004 - 18:54:01 CEST

Please allow me to offer some terminology in this highly philosophical
discussion. Bob has unfolded the semantic parallels and dissonances between
the concepts "entropy", "information", "thermodynamics" and "statistical
approaches". The question, as I read the thread of the conversation, is what
we mean by these words and how the concepts, once understood, form a pattern
within our brain to which we say: this is pleasant, this is how the world can
be understood well, this is a correct explanation of the world; and then, in
the best case, we find measurements in Nature that support the way of
relating concepts to each other that we have newly established by means of
the explanation. Nothing beats numbers when offering something to agree on.
We work now on the membrane separating and joining mathematics, philosophy,
information theory, biotheory and fundamental physics. Let me try to offer a
re-wording of Bob's contribution by talking about numbers. The neutral
language may help.

Von Neumann, Shannon and many others refer to the everyday experience that
things tend to settle into their most idle, convenient, effortless state.
This is the subjective, psychological way of putting the idea of entropy. The
same concept can be seen as a statistical trend for logical relations on a
set to be in, or migrate towards, their most probable state. Entropy means
the most probable state of a set.
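
To make this concrete, here is a minimal sketch in Python (my own
illustration, one way of picturing the claim, not a derivation): among the
ways of distributing n objects over a few symbol classes, the most even
composition is realised by the greatest number of arrangements, and it is
also the one with the highest Shannon measure.

    # Sketch: the most probable macrostate of a set is the one realised by
    # the largest number of arrangements, and it also maximises the Shannon
    # measure.
    from math import factorial, log

    def arrangements(counts):
        """Number of distinct ways to realise a composition (multinomial)."""
        n = sum(counts)
        w = factorial(n)
        for c in counts:
            w //= factorial(c)
        return w

    def shannon_entropy(counts):
        """Shannon entropy (in nats) of the relative frequencies."""
        n = sum(counts)
        return -sum((c / n) * log(c / n) for c in counts if c > 0)

    # 12 objects distributed over 3 symbol classes, from skewed to even.
    for comp in [(12, 0, 0), (8, 3, 1), (6, 4, 2), (4, 4, 4)]:
        print(comp, arrangements(comp), round(shannon_entropy(comp), 3))
    # The even composition (4, 4, 4) has the most arrangements and the
    # highest entropy: the most probable state is the entropic one.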

Relative to the most usual (probable) state of the set, any realisation
(moment) of the set stands in a specific deviation from the idealised (most
probable) state. This is information. We set the information content of a set
in its most probable state to zero and describe each actual state as (a
collection of) extents of difference relative to the most probable state. The
deviation from the mean is opposed to the mean as information is opposed to
entropy (Shu-Kun).
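
One simple way to put a number on this (a sketch only; other formalisations
are certainly possible) is to take the divergence of the actual frequencies
from the uniform reference, I = H_max - H; the most probable state then
carries zero information by construction.

    # Sketch (one possible formalisation): information as the deviation of an
    # actual state from the most probable (uniform) state; the most probable
    # state itself carries zero information.
    from math import log

    def information(counts):
        """I = H_max - H: divergence (in nats) of the actual frequencies
        from the uniform, most probable reference distribution."""
        n = sum(counts)
        k = len(counts)
        h = -sum((c / n) * log(c / n) for c in counts if c > 0)
        return log(k) - h   # equal to the Kullback-Leibler distance to uniform

    print(information([4, 4, 4]))   # ~0.0  -> most probable state, no information
    print(information([6, 4, 2]))   # > 0   -> some deviation, some information
    print(information([12, 0, 0]))  # log 3 -> maximal deviation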

The phenomenology behind things has its roots in the organisational structure
of our brain: we treat tactile information (nervous excitement mediated by
receptors in the skin) differently from non-tactile (thought-up) mental
excitements (concepts). Thus we treat "objects" and "logical relations" as
radically differing concepts. But if we know (and we do know) the maximal
number of logical relations representable on n objects, then we also know the
minimal fraction of one object among n needed to represent a logical relation
on. This would be a statistical approach to the matter-energy equivalence.
Each logical relation counts as a fraction of an object. It has been shown
that the density of logical relations depends more on the inner
differentiation of the objects of a set (how many distinct symbols are
present in the set: how individuated the objects are) than on the number of
objects (within certain limits). The proposal is to take one more step of
abstraction, away from statistical mechanics and statistical thermodynamics
and statistical whatever, towards pure statistical statistics. That is where
the stew brews.
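
I cannot reproduce the full counting here, but a toy reading of the claim (my
own sketch; "logical relation" is taken here simply as a distinguishable
pairing of symbols) already points in that direction: the number of pairwise
relations that can be told apart at all is governed by the number of distinct
symbols present, not by the raw number of objects.

    # Toy sketch: count the pairwise relations on a set of n objects that are
    # distinguishable by the symbols the two objects carry. The count is
    # driven by the number of distinct symbols d, not by the object count n.
    from itertools import combinations

    def distinguishable_pair_relations(symbols):
        """Distinct unordered symbol pairs occurring among the objects."""
        return len({tuple(sorted(p)) for p in combinations(symbols, 2)})

    # Same number of objects (n = 12), increasing inner differentiation.
    print(distinguishable_pair_relations("aaaaaabbbbbb"))      # d = 2 -> 3
    print(distinguishable_pair_relations("aabbccddeeff"))      # d = 6 -> 21

    # Doubling n without adding new symbols adds nothing:
    print(distinguishable_pair_relations("aaaaaabbbbbb" * 2))  # still 3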

If the numbers (the logical relations represented on a most probable set
consisting of n objects) themselves coagulate and densify into an object,
and then enforce distances and properties on the set, then we shall have
found a model for a statistical whatever. The abstract set can be made to
sing and dance (of course within constraints!) and do all of "uncertainty",
"indeterminacy" or even (heaven forbid!) "complexity", and one shall have a
relaxed and detached opportunity to decide what to call "entropy" and what
not.

As to constraints: the economic aspect of information theory has been
emphasized by Pedro Marijuan's counting of all available logical expressions
in the totality of logical interrelations on a set. As we deal with the
autoregulation of one (1) system, we are basically in a very non-infinite
multitude. The elementary expressions describing the system whose functioning
we are trying to understand can be too few to maintain a logical statement
over a statistically relevant succession of time ticks (linear neighbourhood
steps). In his model one can envision competition for logical constants among
the possible places within expressions. Pedro's curve shows the availability
of constants. Probability and economy meet in this idea.

In my understanding of the debate, the contributors have targeted a concept
of the most probable (usual) state of a set and its tendencies to deviate, in
any of k dimensions, away from the expectation value and to re-approach it.
This is indeed a deeply probabilistic approach, and there is a way of looking
at natural numbers that shows these concepts in a clear fashion.

Karl
-----Original Message-----
From: fis-bounces@listas.unizar.es [mailto:fis-bounces@listas.unizar.es] On behalf of Robert Ulanowicz
Sent: Tuesday, 13 April 2004 16:51
To: fis@listas.unizar.es
Subject: Re: [Fis] FIS / introductory text / 5 April 2004

Please excuse my tardiness in joining the discussion initiated by
Michel. Perhaps when I am done with my idiosyncratic comments, several
of you will wish I had simply remained silent. :)

To begin with, most of us recall that Shannon named his measure "entropy"
after the humorous suggestion by von Neumann. Von Neumann was asked by
Shannon what to name the formula H = -Sum_i p_i log(p_i). Von Neumann
replied rather coyly that he should name it entropy, because (1) the
measure is formally the same as that used by Boltzmann in his statistical
mechanical treatment of irreversibility, and (2) "Nobody really
understands what entropy is, so you will be at an advantage in any
discussion!" It was a clever, mischievous response, but Shannon
apparently took him seriously, and it became a bad joke on all of us!

I was educated as a chemical engineer, and one of the shibboleths of
our upbringing was that thermodynamics is a purely phenomenological and
macroscopic endeavor. Any student attempting to break this taboo paid a
drastic price for his/her impertinence. Hence, at any formal exam, if a
student were responding to a question in thermodynamics and chanced to
mention the term "atom" or "molecule" a virtual gong would sound, the
student would be interrupted and told to leave the room. He/she had
failed and would be banished to where there is weeping and gnashing of
teeth! :)

The reasoning was that thermodynamics rests on a phenomenological
foundation that is far more solid than any atomic "hypothesis". After all,
in 1824 when Sadi Carnot described the behavior of engines that pumped
water from mines, he did so from a purely phenomenological perspective.
That his conception of irreversibility happened to collide with the
prevailing notions of conservation and temporal symmetry (which Emmy Noether
later told us were equivalent) was of little consequence to an
engineer. But it put the physicists' view of the world in serious
jeopardy. For the next 30 to 50 years it was the atomic hypothesis, not
the second law (which, incidentally, was discovered before the first) that
was truly at risk.

We all know that this conflict was finally resolved when Boltzmann and
Gibbs constructed statistical mechanics as a bridge between
thermodynamics and physics -- or at least we think we know. As Michel
suggested, it was an exceedingly narrow bridge, indeed! Consider the
following: The positivist notion of science is that one should be
continually trying to disprove hypotheses -- to subject them to all
manner of rigorous tests. Well, the atomic hypothesis was clearly at
risk, and what did we do? We discovered a minuscule range of phenomena
across which there was some correspondence between the two conflicting
disciplines and we immediately called a halt to the dialogue! We
considered the hypothesis proven.

I would submit that there is more to these rantings than mere fancy on
my part. Consider that whenever a physicist is asked about
thermodynamics, the response is almost invariably from the perspective
of statistical mechanics. For most physicists, STATISTICAL MECHANICS *IS*
THERMODYNAMICS. Thermodynamics in its phenomenological guise continues
to pose a major challenge to physics. It is the blackberry seed in the
wisdom tooth of the physicist. The response is to envelop
thermodynamics within the safe cocoon of stat mech.

As Shu-Kun pointed out, entropy and information are opposites, and we
would do well not to confound them. It is possible to measure entropy
purely in terms of energetic changes. In fact, the variables of
thermodynamics are all naturally skewed towards the energetic. Most
constraints enter thermo via external boundaries (although some constraints
are embedded to an extent in state variables like the Gibbs free energy).

Information, on the other hand, is dominated by constraint. (In fact, I
argued earlier that information could be understood purely in terms of
constraint.) Not that it is always necessary to know the details of the
constraints explicitly. Just as one can phenomenologically measure
thermodynamical variables in ignorance of molecular movements, one can
also gauge the level of information by its effects, in abstraction from
knowing exactly how the constraints are working.

So I would conclude with a plea to respect the autonomy of
thermodynamics from statistical mechanics -- that whatever overlap
entropy might have with probabilistic notions pales in comparison with
its meaning in a physical, energetic sense. None of which is to
diminish the value and utility of Shannon's probabilistic measure,
which has been enormous. It's just a plea to call it something like
"uncertainty", "indeterminacy" or even (heaven forbid!) "complexity",
but at all costs to avoid calling it "entropy"!

Regards to all,
Bob

-------------------------------------------------------------------------
Robert E. Ulanowicz | Tel: (410) 326-7266
Chesapeake Biological Laboratory | FAX: (410) 326-7378
P.O. Box 38 | Email <ulan@cbl.umces.edu>
1 Williams Street | Web <http://www.cbl.umces.edu/~ulan>
Solomons, MD 20688-0038 |
--------------------------------------------------------------------------

_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
