Re: [Fis] Entropy and Information

From: Igor Rojdestvenski <[email protected]>
Date: Tue 06 Apr 2004 - 20:33:47 CEST

Just some short remarks in addition to the previous post. Please take these
statements only as suggestions for discussion, despite their rather
categorical style.

21. Information and entropy are opposites of each other under two conditions:

1) when the "dictionary", or (per a timely offline remark by Shu-Kun Lin) the
number of microstates, does not change; i.e., information increases if the
bias towards certain microstates increases
2) in increments only, not as absolute values
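A minimal sketch of condition 1), assuming the "dictionary" is a fixed set of microstates with a probability each: with the dictionary conserved, biasing the distribution towards certain microstates lowers the Shannon entropy, so the information increment is the negative of the entropy increment (illustrative only; the function and the example distributions are my own choices, not from the post):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fixed "dictionary" of 4 microstates; only the bias changes.
uniform = [0.25, 0.25, 0.25, 0.25]   # no bias towards any microstate
biased  = [0.70, 0.10, 0.10, 0.10]   # bias towards the first microstate

h_uniform = shannon_entropy(uniform)   # 2.0 bits, the maximum for 4 states
h_biased  = shannon_entropy(biased)    # smaller: the bias reduces entropy

# With the dictionary conserved, the information increment is minus
# the entropy increment:
delta_information = h_uniform - h_biased   # positive
print(h_uniform, h_biased, delta_information)
```

Note that only the increments are compared, in line with condition 2); the absolute values depend on the choice of units (here, bits).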

22. Maybe an information increase is a "message" increase with a conserved
dictionary, while an entropy increase is somehow connected to a dictionary
increase? The formulation is possibly wrong but, I think, deserves some
elaboration.
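A toy illustration of the conjecture in 22, under the assumption (mine, not the post's) that "dictionary increase" means growing the number of available microstates: for an unbiased (uniform) distribution, the attainable entropy grows as the logarithm of the dictionary size alone, independent of any particular message:

```python
import math

def max_entropy_bits(n_states):
    """Entropy, in bits, of a uniform distribution over n_states microstates.

    For a uniform distribution H = log2(n_states), so entropy grows
    with the dictionary size even though no message bias is involved.
    """
    return math.log2(n_states)

# Doubling the dictionary adds exactly one bit of attainable entropy.
for n in (2, 4, 8, 16):
    print(n, max_entropy_bits(n))
```

This only shows that dictionary growth by itself raises entropy; whether that is the right reading of the conjecture is left open, as in the post.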

_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis