Re: [Fis] Re: What is information ?

From: John Collier <[email protected]>
Date: Wed 14 Sep 2005 - 16:01:34 CEST
Well said. I have often given the same response when people challenge me in talks to give a definition of information.

At 03:25 PM 2005/09/14, Hans C. von Baeyer wrote:
Thank you, Marcin, for your caution against trying to find THE definition of
information!  I tend to think in analogies, and since I'm a physicist, they
often come from physics.  In this case I have already introduced the analogy
to energy, and I'd like to take it a step further.

Everyone knows what energy is, but nobody knows what energy REALLY IS.
There is no simple definition in any textbook!  Instead, there are two
things:

1) There are dozens of specific and well-understood formulas for energy in
different circumstances (e.g. kinetic energy, gravitational potential energy,
rest-mass energy, chemical energy, etc.). Similarly, there could be many
specific measures, or even definitions, of information in different
circumstances.

2) There is a law of conservation of energy, which every one of the formulas
mentioned above fits into.  In the case of information, there seems to be a
hint of some kind of second law (e.g. Kahre's law of diminishing information,
the second law of thermodynamics...), but nothing that I know of that's
crisp and universal.  So besides looking for definitions of information, we
ought also to be looking for laws that this information obeys, laws that can
be usefully employed to make predictions or at least to clarify processes.

What we need more than a common law is a common currency. Any sort
of energy can be converted into any other (sometimes with a large cost).

Here is a nested categorization from my paper in Entropy
http://www.mdpi.org/entropy/papers/e5020100.pdf
The most liberal and inclusive view is the "It From Bit" view. It has originated independently from so many people that it is pointless to attribute an origin, though it probably goes back to Leibniz's view that the world has a logical structure in terms of sensations based in the ability to discriminate. The term is due to John Wheeler, and the view has recently been powerfully if controversially championed by Stephen Wolfram. On this view, any causally grounded distinction makes a difference. It might be called a God's eye perspective, or the view from nowhere. On this view information is objective, and there is nothing else.
The negentropy view of information is a restriction on the It From Bit view. Only those Its that are capable of doing work (organizing and using energy, or sorting things) count as information. The rest is disorder. This view is due to Schrödinger [41], though the groundwork was done by Szilard. The motivation for this view is that work is required for control, and the information in microstates beyond that in macrostates is hidden from view. Negentropy measures the capacity for control (in bits, the number of discriminations that a system can make).
The next view is a restriction of the negentropic approach to particular levels of a physical hierarchy, so that information is relativized to a cohesive level of an object, such as an organism or a species. The view is due to Brooks and Wiley [4, 46], Collier [11] and Smith [43]. The idea is that not all negentropy is expressed at a given level, and the "Its" available are level relative. This information is a measure of the constraints on the objects within the level; because of their connection to biological and cognitive form, Collier [13] calls this expressed information enformation to distinguish it from other forms of negentropy (for example, disordered information due to nonequilibrium conditions is sometimes called intropy). Lyla Gatlin [28] called this information stored information, but this name is somewhat misleading, as it does not reflect its dynamical and often active nature. This sort of information will be the focus of this paper.
Restricting further we have functional information, which is the expressed information that is functional. I will address this sort of information in this paper, but I will have little to say about it. Some functional information is meaningful. The nature of meaning is the great object of desire for information theory. Within the scope of meaningful, or semantic information, is intentional information, or cognitive content. At the next level of restriction is social information, though some authors hold that cognitive content depends on language, which is a social activity. I will not discuss these levels further here, which is not to say that they are unimportant, or are in some sense reducible to the information forms that I do discuss.
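The negentropy view above lends itself to a simple quantitative illustration. The sketch below is my own minimal example, not drawn from the post or the Entropy paper (the function names are hypothetical): it measures negentropy as the gap, in bits, between a distribution's Shannon entropy and the maximum entropy of the uniform distribution over the same states — the number of binary discriminations the system's order makes available.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy(probs):
    """Negentropy in bits: distance from maximum (uniform) entropy.

    On the Schroedinger/Szilard reading sketched above, this is the
    capacity for control — the discriminations the system can make.
    """
    h_max = math.log2(len(probs))          # entropy of the uniform case
    return h_max - shannon_entropy(probs)

# A maximally disordered four-state system carries no negentropy;
# a fully ordered one carries the full 2 bits.
print(negentropy([0.25, 0.25, 0.25, 0.25]))  # 0.0
print(negentropy([1.0, 0.0, 0.0, 0.0]))      # 2.0
```

Nothing in this toy captures the level-relative restriction (enformation) discussed above; it only illustrates the base negentropy measure that the later, narrower categories restrict.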

In general, each nest is harder to produce from the previous, but is produced from it and other resources. I talk only of information in its bound form, in which the complexions are of physical states. Of course there are many discussions of information that ignore this sort of discipline (and for which it is in many cases premature). However, I think it should be a goal; otherwise unification is not going to be possible except perhaps by mathematical formalism, which is no unification at all.

So rather than a single definition of information, I suggest we work towards a unification of the theory of information; otherwise there will be no science of information as such.

This discussion has taken an interesting turn.

John

Professor John Collier                                     [email protected]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292       F: +27 (31) 260 3031
http://www.nu.ac.za/undphil/collier/index.html
