Dear Loet and colleagues,
Thanks, Loet, for the message. I always enjoy learning from what you write.
Are you familiar with the entropy calculation for a physical system
that has degenerate energy levels? Emery's article was my tutorial for
understanding how to make that computation properly. He specifies that
every energy level, including every degenerate level, is individually
summed in our familiar equation, S = -Sum p ln(p).
Before reading his prescription, I had thought that any levels with the
same energy must be treated as just one physical state: I would sum
the probabilities over the degenerate levels to find the probability for
that single compound state, and plug that probability into our familiar
entropy equation. But not so, according to Emery. And it does yield a
different value for the entropy, doesn't it? Can you confirm Emery's
formulation, Loet? Anyone else? It's something I had never considered
before.
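To make the difference concrete, here is a little Python sketch (the
energies and temperature are toy values of my own, not anything from
Emery's article). It computes the Boltzmann probabilities for a system
with one ground state and a doubly degenerate excited level, then
evaluates the entropy both ways:

    import math

    kT = 1.0                                # temperature in units where k*T = 1
    energies = [0.0, 1.0, 1.0]              # ground state + doubly degenerate level

    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)
    p = [w / Z for w in weights]            # one probability per microstate

    # Emery's prescription: every level, degenerate or not, enters the sum
    S_per_level = -sum(pi * math.log(pi) for pi in p)

    # My old treatment: lump the degenerate pair into one compound state
    p_lumped = [p[0], p[1] + p[2]]
    S_lumped = -sum(pi * math.log(pi) for pi in p_lumped)

    print(S_per_level, S_lumped)            # 0.9755... vs 0.6816...: they differ
    print(S_per_level - S_lumped)           # = p_lumped[1] * ln(2)

The two results differ by exactly p ln(2), where p is the compound
probability of the degenerate pair; that is the mixing contribution the
lumped treatment throws away.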
Regarding the question of a distinction between thermodynamic entropy
and informational entropy, I think, Loet, you and I are considering the
problem at different levels. I believe I'm starting to understand your
use of the decomposition algorithm, with H(0) the in-between-group
uncertainty.
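If I have understood, the decomposition you use is (something like)
Theil's: H = H(0) + Sum_G P_G H_G, where P_G is the probability of group
G and H_G the uncertainty within that group. A small Python sketch with
invented numbers of my own, just to check my understanding:

    import math

    def H(probs):
        # Shannon entropy in nats, skipping zero-probability terms
        return -sum(p * math.log(p) for p in probs if p > 0)

    # Toy counts split into two groups (my own invented numbers)
    groups = [[4.0, 4.0], [1.0, 1.0]]
    N = sum(sum(g) for g in groups)

    H_total = H([x / N for g in groups for x in g])

    P = [sum(g) / N for g in groups]        # group probabilities
    H0 = H(P)                               # in-between-group uncertainty
    H_within = sum(Pg * H([x / sum(g) for x in g]) for Pg, g in zip(P, groups))

    print(H_total, H0 + H_within)           # the two sides agree: 1.1935...

Please correct me, Loet, if this is not the algorithm you mean.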
I don't doubt that H = -Sum p ln(p) is a mathematical formula that may be
useful in many different applications, including as part of some larger
formulation, like the decomposition algorithm. But my perspective is at
the most elementary level of information, the level of a single bit. We
know from computer science, don't we, that any information, no matter
how complex, may be represented as a series of bits. That would include
whatever information one observes in the segregation of social systems,
or the expected information flow in internet communication. I trust,
Loet, you'll agree with me that any macroscopic information, internet
conversations included, can be represented as some sequence of bits.
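To make that tangible, a tiny sketch (the sample sentence is my own
placeholder) showing an ordinary piece of text as the bit sequence it
becomes inside a computer:

    message = "any information may be represented as bits"
    bits = ''.join(format(byte, '08b') for byte in message.encode('utf-8'))
    print(len(bits), 'bits:', bits[:24], '...')   # 336 bits: 011000010110111001111001 ...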
I also trust you'll agree that every bit of information is a physical
configuration of some sort. If I receive even a single bit of
information, it was conveyed to me by a structure that is physical: a
pencil mark on paper, an electrical voltage in a computer's transistor,
a photon striking my eye, or the occupation of one of two electron
levels in an atom. Our brains, too, store information as chemical and
electrical signals. This is what Rolf Landauer meant when he said that
information is physical.
The Szilard engine represents a single bit by the location of an atom in
one or the other half of a cylinder. Did you know that it is Szilard,
via the engine, who is credited with discovering the information bit?
So the bit was a physical thing from the beginning. One good source for
the history and development of information theory is Grandy's resource
letter, available on the web at
http://physics.uwyo.edu/~tgrandy/infophys/node1.html.
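And the Szilard bit can be put into numbers. A back-of-the-envelope
Python sketch (the 300 K is just an illustrative room temperature):
with the atom equally likely to be in either half, H = ln 2 nats =
1 bit, and multiplying by Boltzmann's constant k gives the
thermodynamic entropy of that single physical bit:

    import math

    k_B = 1.380649e-23                    # Boltzmann's constant, J/K

    p = [0.5, 0.5]                        # atom in left or right half
    H_nats = -sum(pi * math.log(pi) for pi in p)    # = ln 2
    H_bits = H_nats / math.log(2)                   # = 1 bit

    S = k_B * H_nats                      # entropy of the one-bit state
    E = k_B * 300.0 * math.log(2)         # Landauer's cost to erase it at 300 K
    print(H_bits, S, E)                   # 1.0 bit, ~9.6e-24 J/K, ~2.9e-21 J

This is, I believe, exactly the k T ln 2 that Szilard's engine extracts
as work per bit.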
As we know, Shannon calculated the amount of information conveyed by a
series of electrical impulses in a telegraph line and rediscovered the
formula H. I emphasize that this is the amount of information carried by
a particular physical arrangement. It seems to me, then, that the only
remaining question is whether the H that quantifies the amount of
information in a physical configuration describes the same property of
a physical system as its entropy, S. As you say, Loet, heat divided by
temperature, without the appropriate conversion factor, k, certainly
doesn't have the units of information. (But both do describe the
uncertainty, or improbability, of a physical system.) And I realize that
Shannon wasn't willing to call his measure entropy, or negentropy,
though Brillouin did it for him.
The final piece of this argument is described by Grandy in his resource
letter. He claims the case is now settled: the property of a physical
system measured by S is indeed the same property calculated by the H
formula (with the appropriate conversion factor, k). Grandy says that
the ENTROPY of any physical system provides a quantitative measure of
the AMOUNT OF INFORMATION needed to remove the uncertainty in the
probability distribution for that system. I'd emphasize that this holds
for any physical system at all, whether a few bits of electrical energy
in a computer register or, say, all the information that could possibly
be carried by an internet conversation or an exchange of cash.
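With the conversion factor in hand, the identification can be put
numerically (the register sizes below are arbitrary examples of mine):
H maximally uncertain bits correspond to a thermodynamic entropy
S = k ln(2) H.

    import math

    k_B = 1.380649e-23                    # Boltzmann's constant, J/K

    def S_from_bits(n_bits):
        # S = k * ln(2) * H, with H measured in bits
        return k_B * math.log(2) * n_bits

    print(S_from_bits(8))                 # one fully uncertain byte: ~7.7e-23 J/K
    print(S_from_bits(8e9))               # a gigabyte register:     ~7.7e-14 J/K

Even a gigabyte of uncertainty amounts to a minuscule entropy in
everyday thermodynamic units; the factor k is what bridges the two
scales.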
Thanks, Loet, for the response. It surely helped me clarify my ideas
about the H formula.
Very best regards,
Michael