Re: [Fis] Re: What is information?
From: John Collier <collierj@ukzn.ac.za>
Date: Wed 14 Sep 2005 - 16:01:34 CEST

Well said. I have often given the same response when people challenge me in talks to give a definition of information.

At 03:25 PM 2005/09/14, Hans C. von Baeyer wrote: Thank you, Marcin, for your caution against trying to find THE definition of information.

What we need more than a common law is a common currency. Any sort of energy can be converted into any other (sometimes with a large cost). Here is a nested categorization from my paper in Entropy http://www.mdpi.org/entropy/papers/e5020100.pdf

The most liberal and inclusive view is the "It From Bit" view. It has originated independently with so many people that it is pointless to attribute an origin, though it probably goes back to Leibniz's view that the world has a logical structure in terms of sensations based in the ability to discriminate. The term is due to John Wheeler, and the view has recently been powerfully, if controversially, championed by Stephen Wolfram. On this view, any causally grounded distinction makes a difference. It might be called a God's eye perspective, or the view from nowhere. On this view information is objective, and there is nothing else.

The negentropy view of information is a restriction of the It From Bit view. Only those Its that are capable of doing work (organizing and using energy, or sorting things) count as information; the rest is disorder. This view is due to Schrödinger [41], though the groundwork was done by Szilard. The motivation for this view is that work is required for control, and the information in microstates beyond that in macrostates is hidden from view. Negentropy measures the capacity for control (in bits, the number of discriminations that a system can make).

The next view is a restriction of the negentropic approach to particular levels of a physical hierarchy, so that information is relativized to a cohesive level of an object, such as an organism or a species. The view is due to Brooks and Wiley [4, 46], Collier [11] and Smith [43]. The idea is that not all negentropy is expressed at a given level, and the "Its" available are level-relative. This information is a measure of the constraints on the objects within the level; because of their connection to biological and cognitive form, Collier [13] calls this expressed information enformation to distinguish it from other forms of negentropy (for example, disordered information due to nonequilibrium conditions is sometimes called intropy). Lila Gatlin [28] called this information stored information, but that name is somewhat misleading, as it does not reflect its dynamical and often active nature. This sort of information will be the focus of this paper.

Restricting further, we have functional information, which is the expressed information that is functional. I will address this sort of information in this paper, but I will have little to say about it. Some functional information is meaningful. The nature of meaning is the great object of desire for information theory. Within the scope of meaningful, or semantic, information is intentional information, or cognitive content. At the next level of restriction is social information, though some authors hold that cognitive content depends on language, which is a social activity. I will not discuss these levels further here, which is not to say that they are unimportant, or that they are in some sense reducible to the information forms that I do discuss. In general, each nest is harder to produce from the previous, but is produced from it and other resources.
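A minimal sketch (not from the original post) of how the negentropy / stored-information measure mentioned above can be put in numbers, assuming the standard Shannon formulation in which stored information is H_max minus observed entropy, in bits. The function names and example distributions below are illustrative only:

# Illustrative sketch, assuming Gatlin-style stored information = H_max - H_observed (bits).
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum p*log2(p), in bits (terms with p = 0 are dropped)."""
    return -sum(p * log2(p) for p in probs if p > 0)

def stored_information(probs):
    """Stored information (negentropy relative to the uniform distribution), in bits.

    H_max = log2(n) for n possible states. A larger value means more constraint
    expressed in the distribution, i.e. more capacity for discrimination/control.
    """
    h_max = log2(len(probs))
    return h_max - shannon_entropy(probs)

if __name__ == "__main__":
    uniform = [0.25, 0.25, 0.25, 0.25]   # maximally disordered: no stored information
    biased = [0.70, 0.10, 0.10, 0.10]    # constrained: some negentropy is expressed
    print(stored_information(uniform))   # ~0.00 bits
    print(stored_information(biased))    # ~0.64 bits

The same quantity can be read as the number of binary discriminations the constraints make available beyond what a maximally disordered system of the same size would allow.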
I talk only of information in its bound form, in which the complexions are of physical states. Of course there are lots of discussions of information that ignore this sort of discipline (and for which it is in many cases premature). However, I think it should be a goal, or else unification is not going to be possible except perhaps by mathematical formalism, which is no unification at all. So rather than a single definition of information, I suggest we work more towards a unification of the theory of information; otherwise there will be no science of information as such.

This discussion has taken an interesting turn.

John

Professor John Collier
collierj@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041
South Africa
T: +27 (31) 260 3248 / 260 2292
F: +27 (31) 260 3031
http://www.nu.ac.za/undphil/collier/index.html