Re: [Fis] A definition of Information (II)

From: Viktoras Didziulis <[email protected]>
Date: Thu 04 Mar 2004 - 22:14:49 CET

Dear Stan, Pedro, group
 
One of the fathers of cybernetics, R. Ashby, defined information, building on both Shannon's definition of probabilistic entropy [-SUM(p*log(p))] and Wiener's definition of the amount of information [SUM(p*log(p))], as 'that which removes uncertainty'. Both Shannon and Wiener suggested measuring the amount of information by the amount of uncertainty (probabilistic entropy) it removes from a system... Following the conclusions reached on this list, one should probably add that in order to be able to remove uncertainty, information has to be meaningful. In this case, meaningful information should be understood as one side of an information/probabilistic-entropy (or information/uncertainty) duality of opposites, in the same way as the hot/cold, light/dark, white/black, +/- and top/down dualities. The direct relation between probabilistic entropy and thermodynamic entropy is still undefined in terms of mathematics and formal logic. Therefore, before saying that information is equivalent to enthalpy, one has to prove that uncertainty (probabilistic entropy) is equivalent to thermodynamic entropy (I will get back to this in yet another post).
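 
As a minimal sketch of this reading of information as removed uncertainty (the source and its probabilities below are invented purely for illustration, not taken from the thread):

    import math

    def entropy(probs):
        """Shannon's probabilistic entropy, -SUM(p*log2(p)), in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Uncertainty about a fair eight-sided die before any message arrives.
    prior = [1/8] * 8
    H_before = entropy(prior)          # 3.0 bits

    # A message rules out all but two equally likely outcomes.
    posterior = [1/2, 1/2]
    H_after = entropy(posterior)       # 1.0 bit

    # Information carried by the message = the uncertainty it removes.
    information = H_before - H_after   # 2.0 bits
    print(H_before, H_after, information)

The message here carries 2 bits of information precisely because it shrinks the receiver's uncertainty from 3 bits to 1 bit.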
 
Regarding 'constraints as covenants' among cells... What is a constraint? Formally, it is a relation between sets that occurs when the variety (measured in bits as log2 of the number of possibilities) that exists under one condition is less than the variety that exists under another. In other words, the presence of any invariant over a set of phenomena implies a constraint. Constraints also describe systems not just as products of our imagination but as real facts - objects as joints of nature or technology. Let's take a chair (an example from Ashby's book). A free object moving in 3D Euclidean space has 6 degrees of freedom. The parts of a chair, left unconnected, would each have their own 6 degrees of freedom; thus four legs alone would have 24 degrees of freedom. However, after they are joined into a system (a chair), they have only 6 degrees of freedom as a single object. Therefore the essence of a system is unity rather than a mere collection of independent parts, and this unity corresponds to the presence of constraints. In the case of an organic tissue (a population of cells), the constraints are not limited to geometric space but project onto a multidimensional functional space, which can be described as the variety of independent functions a cell is able to perform...

Another axiom of cybernetics (the decay of variety) states that as time progresses the variety cannot increase and will eventually diminish as the system becomes stabilized. Then, as time goes by, both an observer's uncertainty about the system's state and the uncertainty within the system itself can only diminish. So, in the light of the cybernetic definition of information as 'that which removes uncertainty', any constraints may indeed be regarded as 'covenants'. The amount of information within a system also decreases with decreasing variety. And this amount of information is actually the part which is no longer meaningful. So one can conclude that meaningful information is that which removes meaningless information from a system, leading all parts of the system to synchronized functioning by synchronizing the internal states of its elements. This phenomenon was named 'the law of Experience' by Ashby.
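 
To make the degrees-of-freedom and variety arithmetic above concrete, here is a small back-of-the-envelope sketch; the chair figures follow the text, while the cell-state counts are purely hypothetical:

    import math

    # Degrees of freedom: a free rigid body in 3D Euclidean space has 6
    # (3 translations + 3 rotations).
    DOF_FREE_BODY = 6

    # Four unconnected legs, considered independently.
    legs = 4
    dof_unconstrained = legs * DOF_FREE_BODY    # 24 degrees of freedom

    # Joined into a single rigid chair, the assembly again has only 6.
    dof_chair = DOF_FREE_BODY

    # The difference is what the joints (the constraint) absorb.
    constraint = dof_unconstrained - dof_chair  # 18 degrees of freedom

    # Variety in bits: log2 of the number of distinguishable possibilities.
    def variety_bits(n_states):
        return math.log2(n_states)

    # Hypothetical example: a cell with 16 possible functional states versus
    # a constrained (differentiated) cell restricted to 2 of them.
    free_cell = variety_bits(16)        # 4.0 bits
    constrained_cell = variety_bits(2)  # 1.0 bit
    print(dof_unconstrained, dof_chair, constraint, free_cell, constrained_cell)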
 
 
Let's try to look for more parallels between the second law of thermodynamics and probabilistic entropy... It is clear that, because of the 2nd law, any system arriving at a certain stability will eventually also minimize its entropy-related loss of energy, which is the system's response to the 2nd law rather than the process defined by it. This is yet another law opposing entropy - the law of energy loss minimization. Thus, thermodynamically, we have a duality of energy loss vs. energy optimization. I would not dare to claim that one of them is more important than the other; rather, I would think they have to be in equilibrium. When the 'weights' move to one side or the other because of changed constraints, the system becomes unstable and tries to re-stabilize itself, reaching that equilibrium again by arriving at a new state that is stable both internally and externally. However, if we take the life cycle of any system, we will see that there is a period when the system's entropy and variety do increase. In natural biological systems this is related to growth; in technology it is related to the production of a system from parts - making a chair or a microchip, for example. This is because the construction of any new system is based on increased consumption and destruction of other systems, which may be considered their total destabilization and which eventually leads to a temporary increase in both thermodynamic and probabilistic entropies. But in any case the system ends up reaching an equilibrium state with minimized variety, amount of information and both entropies. However, most living and technical systems eventually die, while the lifetimes of other systems increase the further they lie from the centre of the hierarchy of complexity levels. The lifetime of bodies is longer than that of cells; molecules and ecosystems live longer than bodies; that of atoms or stars/planets is much longer still; and the lifetime of elementary particles or galaxies is longer yet. But it is life that overcomes this lifetime problem through the process known as reproduction.
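 
As a toy illustration of a system relaxing toward a state of minimized variety (and hence minimized probabilistic entropy), consider a closed, deterministic system on a finite state set; the transition map below is invented purely for illustration and is not taken from Ashby:

    import math
    import random

    random.seed(1)

    # A closed, deterministic system: one fixed transition map on 32 states.
    N_STATES = 32
    step = {s: random.randrange(N_STATES) for s in range(N_STATES)}

    # Start with maximal uncertainty: every state is still possible.
    possible = set(range(N_STATES))

    for t in range(10):
        variety = math.log2(len(possible))   # variety in bits
        print(f"t={t}: {len(possible)} possible states, {variety:.2f} bits")
        # The image of a set under a fixed map can never be larger than
        # the set itself, so variety cannot increase (decay of variety).
        possible = {step[s] for s in possible}

Because the image of a set under a fixed deterministic map is never larger than the set itself, the count of still-possible states (and the variety in bits) can only shrink or stay constant - a bare-bones version of the decay of variety and of the settling toward equilibrium described above.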
 
Best regards
Viktoras
 
 
