RE: [Fis] FIS / introductory text / 5 April 2004

From: Loet Leydesdorff <[email protected]>
Date: Mon 05 Apr 2004 - 17:38:56 CEST

Dear Shu-Kun,

I was just answering Michel's question with a suggestion about the
relation, with reference to Ebeling's book. The Szilard-Brillouin
relation is:

Delta S >= k(B) Delta H

H is Shannon's H; k(B) is the Boltzmann constant (1.381 * 10^-23 J/K), and
S is the thermodynamic entropy in J/K. I am not a physicist, but I just
answered the question. The derivation is provided by Ebeling. I can
reproduce it if you wish.
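
As a minimal numerical sketch (my own illustration, not Ebeling's
derivation): k(B) converts the dimensionless Shannon H into the J/K units
of the thermodynamic entropy, so one bit of Shannon information corresponds
to k(B) ln 2, roughly 10^-23 J/K:

  import math

  k_B = 1.380649e-23  # Boltzmann constant, J/K

  def shannon_entropy_nats(p):
      # Shannon H of a discrete distribution, in nats (natural logarithm),
      # so that k_B * H is directly in J/K.
      return -sum(pi * math.log(pi) for pi in p if pi > 0)

  H_one_bit = shannon_entropy_nats([0.5, 0.5])   # = ln 2 nats
  print(k_B * H_one_bit)                          # ~9.57e-24 J/K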

Otherwise, I largely agree with Michel's introductory paper. I don't
understand your formula Delta S > - Delta I. How is I defined? Can you
give the formula? (Your second email was even more confusing.)

With kind regards,

Loet

  _____

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
loet@leydesdorff.net ; http://www.leydesdorff.net/

 
The Challenge of Scientometrics: http://www.upublish.com/books/leydesdorff-sci.htm ;
The Self-Organization of the Knowledge-Based Society: http://www.upublish.com/books/leydesdorff.htm

> -----Original Message-----
> From: fis-bounces@listas.unizar.es [mailto:fis-bounces@listas.unizar.es]
> On Behalf Of Dr. Shu-Kun Lin
> Sent: Monday, April 05, 2004 3:53 PM
> To: fis@listas.unizar.es
> Subject: Re: [Fis] FIS / introductory text / 5 April 2004
>
>
> Dear Loet,
>
> You mean that entropy S is equal to information (I), or almost equal. Can
> you still give a little bit of sympathy to the relation Delta S > - Delta
> (information) and make some comments on this different relation? If we
> can agree on the relation Delta S > - Delta (information), then we are
> ready to ask "why information loss is related to entropy", a question
> asked by physicists at
> http://www.lns.cornell.edu/spr/2000-12/msg0030047.html, and try to
> answer.
>
> Michel, thank you for your introduction.
>
> Shu-Kun
>
>
> Loet Leydesdorff wrote:
>
> >Dear Michel,
> >
> >The relation between thermodynamic entropy and information is provided
> >by the Szilard-Brillouin relation as follows:
> >
> >Delta S >= k(B) Delta H
> >
> >(W. Ebeling, Chaos, Ordnung und Information. Frankfurt a.M.: Harri
> >Deutsch Thun, 1991, at p. 60.)
> >
> >k(B) in this formula is the Boltzmann constant. Thus, a physical change
> >of the system can provide information, but it does not have to. Unlike
> >the thermodynamic entropy, the probabilistic entropy has no
> >dimensionality (because it is mathematically defined). The Boltzmann
> >constant takes care of the correction of the dimensionality in the
> >equation.
> >
> >When applied as a statistic to other systems (e.g., biological ones),
> >one obtains another (specific) theory of communication in which one can
> >perhaps find another relation between the (in this case biological)
> >information and the probabilistic entropy. This can be elaborated for
> >each specific domain.
> >
> >With kind regards,
> >
> >
> >Loet
> >
> >
> >
> > _____
> >
> >Loet Leydesdorff
> >Science & Technology Dynamics, University of Amsterdam
> >Amsterdam School of Communications Research (ASCoR)
> >Kloveniersburgwal 48, 1012 CX Amsterdam
> >Tel.: +31-20-525 6598; fax: +31-20-525 3681
> >loet@leydesdorff.net; http://www.leydesdorff.net/
> >
> >
> >-----Original Message-----
> >From: fis-bounces@listas.unizar.es [mailto:fis-bounces@listas.unizar.es]
> >On Behalf Of Michel Petitjean
> >Sent: Monday, April 05, 2004 9:15 AM
> >To: fis@listas.unizar.es
> >Subject: [Fis] FIS / introductory text / 5 April 2004
> >
> >
> >2004 FIS session introductory text.
> >
> >Dear FISers,
> >
> >I would like to thank Pedro Marijuan for his kind invitation to chair
> >the 2004 FIS session. The session is focused on "Entropy and
> >Information". The topic is so vast that I am afraid I will only be able
> >to evoke some general aspects, setting aside specific technical
> >developments.
> >
> > Entropy and Information: two polymorphic concepts.
> >
> >Although these two concepts are undoubtedly related, they have
> >different histories.
> >
> >Let us consider first the Information concept.
> >There have been many discussions on the FIS list about the meaning of
> >Information. Clearly, there are several definitions. The information
> >concept that most people have in mind is outside the scope of this
> >text: did it arise with computer science, or with the press, or has it
> >existed for so long that nobody could date it? Setting aside the
> >definitions from the dictionaries (for each language and culture), I
> >would say that everybody has his own concept. Philosophers and
> >historians will have to look into this. The content of the FIS archives
> >suggests that the field is vast.
> >
> >Now let us look at the scientific definitions. Those arising from
> >mathematics are rigorous, but have different meanings. An example is
> >the information concept emerging from information theory (Hartley,
> >Wiener, Shannon, Renyi, ...). This concept, which arises from
> >probability theory, has little connection with the Fisher information,
> >which also arises from probability theory. The same word is used, but
> >two rigorous concepts are defined. One is mostly related to coding
> >theory, and the other is related to estimation theory. One deals mainly
> >with non-numerical finite discrete distributions, and the other is
> >based on statistics from samples of parametrized families of
> >distributions. Even within the framework of information theory, there
> >are several definitions of information (e.g. see the last chapter of
> >Renyi's book on Probability Theory). This situation often arises in
> >mathematics: e.g., there are several concepts of "distance", and,
> >despite the basic axioms they all satisfy, nobody would say that they
> >have the same meaning, even when they are defined on a common space.
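> >
> >As a minimal illustration of this distinction (an editorial sketch, not
> >part of the original argument), both quantities can be computed for a
> >Bernoulli(p) family; they answer different questions and are not
> >interchangeable:
> >
> >  import math
> >
> >  def shannon_entropy_bits(p):
> >      # Coding-theoretic quantity: Shannon H of a Bernoulli(p) outcome, in bits.
> >      return -sum(q * math.log2(q) for q in (p, 1.0 - p) if q > 0)
> >
> >  def bernoulli_fisher_information(p):
> >      # Estimation-theoretic quantity: Fisher information of the
> >      # Bernoulli family at parameter p, I(p) = 1 / (p * (1 - p)).
> >      return 1.0 / (p * (1.0 - p))
> >
> >  # Same distribution, two different numbers with different roles:
> >  print(shannon_entropy_bits(0.1))            # ~0.469 bits
> >  print(bernoulli_fisher_information(0.1))    # ~11.1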
> >
> >Then, mathematical tools are potential (and sometimes demonstrated)
> >simplified models for physical phenomena. On the other hand, scientists
> >may publish various definitions of information for physical situations.
> >This does not mean that these definitions should be confused with one
> >another, or with the mathematical ones. In many papers, the authors
> >insist on the analogies between their own concepts and those previously
> >published by other authors: this attitude may convince the reviewers of
> >the manuscript that the work has interest, but it contributes to the
> >general confusion, particularly when the confusing terms are recorded
> >in the bibliographic databases. Searching in databases with the keyword
> >"information" would lead to a considerable number of hits: nobody would
> >try it without constraining the search with other terms (has any of you
> >tried?).
> >
> >We consider now the Entropy concepts. The two main ones are the
> >informational entropy and the thermodynamical entropy. The first one
> >has unambiguous relations with information (in the sense of information
> >theory), since both are defined within the framework of a common
> >theory. Let us now look at the thermodynamical entropy, which was
> >defined by Rudolf Clausius in 1865. It is a physical concept, usually
> >introduced from the Carnot cycle. The existence of entropy is
> >postulated, and it is a state function of the system. The usual
> >variables are temperature and pressure. Entropy calculations are
> >sometimes made while discarding the implicit assumptions made for an
> >idealized Carnot cycle. Here the difficulties begin. E.g., the whole
> >universe is sometimes considered as a system for which the entropy is
> >assumed to make sense. Does the equilibrium of such a system make
> >sense? Do thermodynamical state functions make sense here? And what
> >about "the" temperature? This latter variable, even when viewed as a
> >function of coordinates and/or time, makes sense only for a restricted
> >number of situations. These difficulties appear for many other systems.
> >At other scales, they may appear for microscopic systems, and for
> >macroscopic systems unrelated to thermochemistry. In fact, what is
> >often implicitly postulated is that the thermodynamical entropy theory
> >could work outside thermodynamics.
> >
> >Statistical mechanics creates a bridge between microscopic and
> >macroscopic models, as evidenced by the work of Boltzmann. These two
> >models are different. One is a mathematical model for an idealized
> >physical situation (point-like balls, elastic collisions, distribution
> >of states, etc.), and the other is a simplified physical model, valid
> >under a restricted number of conditions. The expression of the entropy,
> >calculated via statistical mechanics methods, is formally similar to
> >the informational entropy. The latter appeared many decades after the
> >former. Thus, the pioneers of information theory (Shannon, von Neumann)
> >who retained the term "entropy" are undoubtedly responsible for the
> >historical link between <<Entropy>> and <<Information>> (e.g. see
> >http://www.bartleby.com/64/C004/024.html).
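> >
> >To make the formal similarity concrete (an editorial sketch, not part
> >of the original text): the Gibbs entropy of statistical mechanics and
> >Shannon's H are the same sum over probabilities, differing only by the
> >Boltzmann constant and the base of the logarithm.
> >
> >  import math
> >
> >  k_B = 1.380649e-23            # Boltzmann constant, J/K
> >  p = [0.7, 0.2, 0.1]           # probabilities of microstates / messages
> >
> >  shannon_H = -sum(pi * math.log2(pi) for pi in p)       # bits, dimensionless
> >  gibbs_S = -k_B * sum(pi * math.log(pi) for pi in p)    # J/K
> >  print(shannon_H, gibbs_S)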
> >
> >Although "entropy" is a well known term in information
> theory, and used
> >coherently with the term "information" in this area, the
> situation is
> >different in science. I do not know what is "information" in
> >theermodynamics (does anybody know?). However, "chemical
> information"
> >is a well known area of chemistry, which covers many topics,
> including
> >data mining in chemical data bases. In fact, chemical
> information was
> >reognized as a major field when the ACS decided in 1975 to
> rename one
> >of its journals "Journal of Chemical Information and Computer
> >Sciences": it was previously named the "Journal of Chemical
> >Documentation". There are little papers in this journal which are
> >connected with entropy (thermodunamical of informational).
> An example
> >is the 1996 paper of Shu-Kun Lin, relating entropy with
> similarity and
> >symmetry. Similarity is itself a major area in chemical information,
> >but I consider that the main area of chemical information is
> related to
> >chemical databases, such that the chemical information is
> represented
> >by the nodes and edges graph associated to a structural formula.
> >Actually, mathematical tools able to work on this kind of chemical
> >information are lacking, particulary for statistics (did anyone
> >performed statistics on graphs?).
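> >
> >As an editorial sketch of the nodes-and-edges representation meant here
> >(the molecule and labels are only illustrative), a structural formula
> >can be stored as a labelled graph:
> >
> >  # Ethanol (CH3-CH2-OH), heavy atoms only, as a labelled graph.
> >  atoms = {0: "C", 1: "C", 2: "O"}          # node labels: element symbols
> >  bonds = [(0, 1, 1), (1, 2, 1)]            # edges: (atom_i, atom_j, bond order)
> >
> >  # Any statistics on such chemical information must operate on graphs,
> >  # not on ordinary numerical vectors, which is the difficulty noted above.
> >  for i, j, order in bonds:
> >      print(atoms[i], "-", atoms[j], "order", order)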
> >
> >In 1999, the links between information sciences and entropy were again
> >recognized, when Shu-Kun Lin created the open access journal "Entropy":
> ><<An International and Interdisciplinary Journal of Entropy and
> >Information Studies>>. Although most pluridisciplinary journals sit at
> >the intersection of two areas, Shu-Kun Lin is a pioneer in the field of
> >transdisciplinarity, permitting the publication in a single journal of
> >works related to entropy and/or information theory, originating from
> >mathematics, physics, chemistry, biology, economics, and philosophy.
> >
> >The concept of information exists in other sciences in which the term
> >entropy is used. Bioinformation is a major concept in bioinformatics,
> >in which I am not a specialist. Thus I hope that Pedro Marijuan will
> >help us to understand the links between bioinformation and entropy.
> >Entropy and information are also known to economists and philosophers.
> >I hope they too will add their voices to those of scientists and
> >mathematicians, to enlighten our discussions during the session.
> >
> >Now I would like to draw some provocative conclusions. Analogies
> >between concepts, or between formal expressions of quantities, are
> >useful for the spirit, for the quality of the papers, and sometimes
> >they are used by modellers to demonstrate why their work merits funding
> >(has anybody never done that?). The number of new concepts in the
> >sciences (including mathematics, economics, the humanities, and so on)
> >is increasing, and new terms are picked from our natural language: the
> >task of teachers becomes harder and harder. Entropy and Information are
> >like the "fourth dimension" one century ago: they have in common the
> >ability to provide exciting topics to discuss. Unfortunately, Entropy
> >and Information are much more difficult to handle.
> >
> >Michel Petitjean                  Email: petitjean@itodys.jussieu.fr
> >Editor-in-Chief of Entropy               entropy@mdpi.org
> >ITODYS (CNRS, UMR 7086)                  ptitjean@ccr.jussieu.fr
> >1 rue Guy de la Brosse            Phone: +33 (0)1 44 27 48 57
> >75005 Paris, France.              FAX  : +33 (0)1 44 27 68 14
> >http://www.mdpi.net               http://www.mdpi.org
> >http://petitjeanmichel.free.fr/itoweb.petitjean.html
> >http://petitjeanmichel.free.fr/itoweb.petitjean.freeware.html
> >_______________________________________________
> >fis mailing list
> >[email protected]
> >http://webmail.unizar.es/mailman/listinfo/fis
> >
> >
> >
> >
> >
>
> --
> Dr. Shu-Kun Lin
> Molecular Diversity Preservation International (MDPI)
> Matthaeusstrasse 11, CH-4057 Basel, Switzerland
> Tel. +41 61 683 7734 (office); +41 79 322 3379 (mobile); Fax +41 61 302 8918
> E-mail: lin@mdpi.org
> http://www.mdpi.org/lin/
>
>
>
> _______________________________________________
> fis mailing list
> [email protected]
> http://webmail.unizar.es/mailman/listinfo/fis
>
Received on Mon Apr 5 17:41:09 2004

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET