Re: neuro: and what else

From: John Collier <[email protected]>
Date: Mon 10 Apr 2000 - 16:57:19 CEST

At 01:54 PM 10/04/00 +0200, you wrote:

>Dear Peter,
>
>Many thanks for your very well crafted questions. In what follows I have
>tried to produce some limited answers:
>
 >
 >I went through the issues of Trends in
 >Neurosciences 1998-1999 and checked the titles of the papers where the
 >word "INFORMATION" was explicitly used:
 >
>
>In general, these titles use the INFO term in the conventional (artificial)
>sense of information-processing machines. In most cases this is almost
>correct, but overall it becomes a big mistake (I agree with Allan's comments
>in his posting). One opens the door to the "functionalist" scheme and forgets
>the distinctive characteristics of the living neuron: I would particularly
>emphasize the unceasing self-production and self-degradation activities at
>the molecular level (in every synapse too), unknown to stable artificial
>machines.

In 1988 I did a search on Biological Abstracts for information, entropy,
and information and entropy. I got several hundred hits for the first, about 70
for the second, and 40 or so for the combination. I looked at the abstracts,
and in all but two cases the use of "information" or "entropy" was gratuitous,
i.e., the abstract would not have lost any content had those terms been deleted.

Things may have changed in the more than ten years since, but I somehow
doubt it. Restricting information to information processing, especially
by machines, suggests that the machines are being used as tools, but not
that information figures in the theory itself. Where the usage is theoretical,
I agree the tendency is disturbing: it tends towards mechanical models.

Cheers,
John

John Collier pljdc@alinga.newcastle.edu.au
Department of Philosophy http://bcollier.newcastle.edu.au
University of Newcastle, NSW 2308 AUSTRALIA
http://www.newcastle.edu.au/department/pl/Staff/JohnCollier/collier.html
Received on Tue Apr 11 07:59:05 2000
