RE: [Fis] Is all information also physical entropy?

From: Loet Leydesdorff <[email protected]>
Date: Fri 07 May 2004 - 08:00:22 CEST

Dear Michael,
 
It seems that we still disagree. I agree that everything can eventually
be described in terms of physics, but I do not agree with the
reductionism implied in your vision. For example, if we describe the
economy merely as a stream of coins and banknotes, we cannot account
for the different values which these coins and banknotes carry. One can
write the values as another probability distribution, or as a second
dimension of the same probability distribution, etc., but the system of
reference for the values is different from the system of reference for
the coins and banknotes. The systems of reference determine the
appropriate level of theorizing. In this case, for example, we need
economic theorizing in order to understand the (non-linear) dynamics of
the system under study. If one focuses on the exchange processes in a
monetary system, one does not have to worry about the physics
underlying the system, but can use the H for studying the communication.
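 
To make this concrete: H can be computed directly from the relative
frequencies of the denominations in circulation, without any reference
to the physics of the carriers. A minimal sketch in Python (the
frequencies are, of course, only illustrative):

    import math

    # illustrative relative frequencies of four denominations in circulation
    p = [0.40, 0.30, 0.20, 0.10]

    # Shannon's H in bits; the physical nature of the coins plays no role
    H = -sum(pi * math.log2(pi) for pi in p)
    print(round(H, 3))  # approx. 1.846 bits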
 
Thus, the H provides us with a mathematical apparatus (a mathematical
theory of communication) that can be combined with special theories of
communication (for example, theories of systems that circulate money).
If one takes the circulation of banknotes and coins itself as the
system of reference--as a national bank has to do--one has to worry
about the physics of the system. But if one considers the circulation
as an economic system, a different research agenda can be formulated.
Thus, it is useful to abstract from the physics; with hindsight, the
physical dimension of higher-order systems can then be considered as
one dimension among others.
 
I hope that this is helpful for clearing up the misunderstanding.
 
With kind regards,
 
 
Loet
 
 
ps.
Let me note, for the record, that unlike Emery's version, the correct
equation is:
 
S = k N H
 
In addition to being independent of the physical interpretation (k), H
is also independent of the number of units (N), whereas S does depend
on the number of particles. H is only a statistic of the dividedness of
the distribution. (In a closed system, in which N is constant, it then
follows that Delta S = k N Delta H.)
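 
A small numerical sketch (with arbitrary numbers) makes the difference
explicit: doubling N leaves H untouched, while S doubles.

    import math

    k = 1.380649e-23                 # Boltzmann's constant (J/K)
    p = [0.5, 0.25, 0.25]            # a probability distribution over states
    H = -sum(pi * math.log(pi) for pi in p)   # in nats; independent of N

    for N in (1000, 2000):           # number of particles/units
        S = k * N * H                # S scales with N, H does not
        print(N, H, S)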

> -----Original Message-----
> From: fis-bounces@listas.unizar.es
> [mailto:fis-bounces@listas.unizar.es] On Behalf Of Michael Devereux
> Sent: Thursday, May 06, 2004 9:41 PM
> To: FIS Mailing List
> Subject: [Fis] Is all information also physical entropy?
>
>
> Dear Loet and colleagues,
> Thanks, Loet, for the message. I always enjoy learning from what you
> write. Are you familiar with the entropy calculation for a physical
> system which has degenerate energy levels? Emery's article was my
> tutorial for understanding how to make that computation properly. He
> specifies that every energy level, including every degenerate level,
> is individually summed in our familiar equation, -Sum p ln(p).
> Before reading his prescription, I had thought that any levels with
> the same energy must be treated as just one physical state. So, I
> would sum the probabilities over each degenerate level to find the
> probability for that single compound degenerate state, and plug that
> probability into our familiar entropy equation. But, not so, according
> to Emery. And it does yield a different value for the entropy, doesn't
> it? Can you confirm Emery's formulation, Loet? Anyone else? It's
> something I had never considered before.
> Regarding the question of a distinction between thermodynamic entropy
> and informational entropy, I think, Loet, you and I are considering
> the problem at a different level. I believe I'm starting to understand
> your use of the decomposition algorithm, with H(0) the between-group
> uncertainty.
> I don't doubt that H = -Sum p ln(p) is a mathematical formula that may
> be useful in many different applications, including as a part of some
> larger formulation, like the decomposition algorithm. But my
> perspective is at the most elementary level for information, the level
> of an information bit. We know, from computer science, don't we, that
> any information, no matter how complex, may be represented as a series
> of bits. That would include whatever information one observes for the
> segregation of social systems, or the expected information flow in
> internet communication. I trust, Loet, you'll agree with me that any
> macroscopic information, including internet conversations, etc., can
> be represented as some sequence of bits.
> I also trust you'll agree that every bit of information is a physical
> configuration of some sort. If I receive even a single bit of
> information, it was conveyed to me by a structure that is physical.
> For example, a pencil mark on paper, an electrical voltage in a
> computer's transistor, a light photon striking my eye, or a single bit
> represented by the occupation level of an electron in one of two
> levels of an atom. And our brains, too, store information as chemical
> and electrical signals. This is what Rolf Landauer meant when he said
> that information is physical.
> The Szilard engine represents a single bit by the location of an atom
> in either one or the other half of a cylinder. Did you know that it is
> Szilard, via the engine, who is credited with discovering the
> information bit? So, it was a physical thing from the beginning. One
> good source for the history and development of information theory is
> Grandy's resource letter, available on the web at
> http://physics.uwyo.edu/~tgrandy/infophys/node1.html.
> As we know, Shannon calculated the amount of information conveyed by a
> series of electrical impulses in a telegraph line and rediscovered the
> formula H. I emphasize that this is the amount of information carried
> by a particular physical arrangement. It seems to me, then, that the
> only remaining question is whether the H, which quantifies the amount
> of information in a physical configuration, describes that same
> property of a physical system as does its entropy, S. As you say,
> Loet, heat divided by temperature, without the appropriate conversion
> factor, k, certainly doesn't have the units of information. (But both
> do describe the uncertainty, or improbability, in a physical system.)
> And I realize that Shannon wasn't willing to call his measure entropy,
> or negentropy, though Brillouin did it for him.
> The final piece of this argument is described by Grandy in his
> resource letter. He claims the case is now settled: that the property
> of some physical system measured by S is, indeed, the same property
> calculated by the H formula (with an appropriate conversion factor,
> k). Grandy says that the ENTROPY of any physical system provides a
> quantitative measure of the AMOUNT OF INFORMATION needed to remove the
> uncertainty in the probability distribution for that system. I'd
> emphasize that this is so for any physical system at all, whether a
> few bits of electrical energy in a computer register, or, say, all the
> information that can possibly be carried by an internet conversation,
> or by the exchange of cash.
> Thanks, Loet, for the response. It surely helped me clarify my ideas
> about the H formula.
> Very best regards,
> Michael
>
>
>
> _______________________________________________
> fis mailing list
> [email protected] http://webmail.unizar.es/mailman/listinfo/fis
>