RE: [Fis] 2004 FIS session: concluding comments

From: <[email protected]>
Date: Sat 19 Jun 2004 - 19:23:53 CEST

Pedro wrote:
> entropy as disorder (Boltzmann) and information as a
> 'flow' (Shannon).

I apologize for being anal retentive, but I must strongly disagree with the
view that Shannon entropy is about flow. Shannon did apply his entropy
primarily to Markov chain models, which are simple probabilistic models that
describe a flow. However, the same entropy can be applied to an arbitrary
probabilistic model, temporal or non-temporal; the only requirement is that
the model defines a probability mass function.
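
To make the point concrete, here is a minimal sketch (Python, purely my own
illustration) that computes the Shannon entropy of an arbitrary, non-temporal
probability mass function; no Markov chain or flow appears anywhere, only a
normalized set of probabilities:

    import math

    def shannon_entropy(pmf):
        # H(p) = -sum_i p_i * log2(p_i), in bits
        assert abs(sum(pmf) - 1.0) < 1e-9, "probabilities must sum to 1"
        return -sum(p * math.log2(p) for p in pmf if p > 0)

    # A static distribution, e.g. the outcomes of a loaded die
    print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits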

Some quantum physicists, such as A. Zeilinger, do indeed reject Shannon
entropy. Quantum information theory is currently a very active field of
investigation, with new journals having appeared recently. However, there are
voices arguing that Shannon's entropy may also be applicable to quantum
information theory, e.g. Timpson's work
http://arxiv.org/abs/quant-ph/0112178 . For an overview and comparison, I
recommend the nice review article by C. Adami, "The Physics of Information",
http://arxiv.org/abs/quant-ph/0405005 . Adami points out that information
derived using von Neumann's entropy results in paradoxes (negative
information), probably because von Neumann's entropy does not abide by
Shannon's criteria.
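
To give a feeling for the "negative information" that Adami discusses, here is
a rough numerical sketch (Python with numpy; my own example, not taken from
the paper). For a maximally entangled two-qubit Bell state the joint von
Neumann entropy is zero while the entropy of either subsystem is one bit, so
the conditional entropy S(A|B) = S(AB) - S(B) comes out negative, which can
never happen with the classical Shannon entropy:

    import numpy as np

    def von_neumann_entropy(rho):
        # S(rho) = -Tr(rho log2 rho), from the eigenvalues of rho
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]           # drop numerical zeros
        return float(-np.sum(evals * np.log2(evals)))

    # Bell state |Phi+> = (|00> + |11>) / sqrt(2)
    phi = np.zeros(4); phi[0] = phi[3] = 1 / np.sqrt(2)
    rho_ab = np.outer(phi, phi)                # pure joint state

    # Reduced state of subsystem B (partial trace over A)
    rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

    s_ab = von_neumann_entropy(rho_ab)         # 0 bits
    s_b  = von_neumann_entropy(rho_b)          # 1 bit
    print(s_ab - s_b)                          # S(A|B) = -1 bit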

As for Tsallis entropy: it is said to be an approximation to the earlier
concept of Renyi entropy; see "On the Renyi entropy, Boltzmann Principle,
Levy and power-law distributions and Renyi parameter" by A. G. Bashkirov,
http://arxiv.org/abs/cond-mat/0211685 .
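
Just to make the relationship explicit, a small sketch (Python, based on the
standard textbook definitions rather than on Bashkirov's paper): Tsallis
entropy is an exact monotone function of Renyi entropy, and both reduce to
the Shannon entropy (in nats) in the limit q -> 1:

    import math

    def renyi(pmf, q):
        # R_q = ln(sum_i p_i^q) / (1 - q)
        return math.log(sum(p ** q for p in pmf)) / (1 - q)

    def tsallis(pmf, q):
        # T_q = (1 - sum_i p_i^q) / (q - 1)
        return (1 - sum(p ** q for p in pmf)) / (q - 1)

    pmf, q = [0.5, 0.25, 0.125, 0.125], 2.0
    # Exact relation: T_q = (1 - exp((1 - q) * R_q)) / (q - 1)
    print(tsallis(pmf, q), (1 - math.exp((1 - q) * renyi(pmf, q))) / (q - 1))
    # Near q = 1 both approach -sum_i p_i * ln(p_i), the Shannon entropy in nats
    print(renyi(pmf, 1.000001), tsallis(pmf, 1.000001))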

Best regards,
                Aleks

--
mag. Aleks Jakulin
http://www.ailab.si/aleks/
Artificial Intelligence Laboratory, 
Faculty of Computer and Information Science, University of Ljubljana. 