RE: [Fis] FIS / introductory text / 5 April 2004

From: Loet Leydesdorff <loet@leydesdorff.net>
Date: Tue 13 Apr 2004 - 19:40:36 CEST

> As Shu-Kun pointed out, entropy and information are
> opposites, and we would do well not to confound them. It is
> possible to measure entropy purely in terms of energetic
> changes. In fact, the variables of thermodynamics are all
> naturally skewed towards the energetic. Most constraints
> enter thermodynamics via external boundaries (although some
> constraints are embedded, to an extent, in state variables
> like the Gibbs free energy).
>
> Information, on the other hand, is dominated by constraint.
> (In fact, I argued earlier that information could be
> understood purely in terms of
> constraint.) Not that it is always necessary to know the
> details of the constraints explicitly. Just as one can
> phenomenologically measure thermodynamic variables in
> ignorance of molecular movements, one can also gauge the
> level of information by the effects of the constraints, in
> abstraction from knowing exactly how they are working.

Dear Bob,

Do you mean to say here that information is equal to redundancy?
If not, how would you quantify/operationalize information? (Just a
question.)
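To make the question concrete: by redundancy I mean, in Shannon's sense, the difference between the maximum entropy and the observed entropy of a distribution. A minimal sketch (my own illustration, not anything from your post) of that quantification:

```python
from math import log2

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(x * log2(x) for x in p if x > 0)

def redundancy(p):
    """Redundancy R = H_max - H, where H_max = log2(n)
    for a distribution over n categories."""
    return log2(len(p)) - shannon_entropy(p)

# A skewed distribution: the deviation from the uniform
# (maximum-entropy) case measures the constraint at work.
p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))  # 1.75 bits
print(redundancy(p))       # 2.0 - 1.75 = 0.25 bits
```

On this reading, the constraint Bob describes would show up as exactly this reduction below the maximum entropy; my question is whether that is what you intend by "information."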

With kind regards,

Loet

_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
Received on Tue Apr 13 19:44:58 2004

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET