RE: [Fis] Data, observations and distributions / was: Re: Probabilistic Entropy

From: Loet Leydesdorff <loet@leydesdorff.net>
Date: Mon 19 Apr 2004 - 14:56:50 CEST

Dear Michel,
 
I appreciate your position, but it seems to me that "observed data" can
only be considered as "experimental data" if they can be specified with
reference to "expectations". Otherwise, the interpretation would be
naturalistic instead of reflexive, and one would no longer be able to
discuss the quality of the data. But I don't wish to deny the special
epistemological status of data as codifiers of the communication. After
the experiment one cannot deny the value of the measurements any longer.
 
The basis of experimental data, however, is not given, but constructed
in scientific reasoning. I am not denying that in the context of
discovery, one may experience this the other way round. However, upon
reflection the data never speak for themselves, but appear within a
context of expectations as validations.
 
The issue itself is an old one in the philosophy of science, but it was
somewhat obscured by positivism and empiricism. Perhaps, it leads us too
far away from information theory. (We don't have to agree on
philosophical issues!)
 
Let me provide a quote from Huygens about the issue:
 
"Against Cartesius's dogma that the nature or notion of a body should
consist in extension alone, I have a notion of space that differs from
the notion of a body: space is what may be occupied by a body."
 
Ultimately, the data (given in the Revelation) remain unknown from a
scientific perspective. What we have are measurement results, which
gain meaning only from a critical distance (embedded in discourse). We
can hypothesize (Peirce's) "firstness" or (Leibniz's) "vis viva" from
this perspective. I agree that entertaining this hypothesis can be most
useful for scientific understanding.
 
Reflexivity about the status of the data, however, is most important
when we study meaning-processing systems. These systems generate
probabilistic entropy to a much larger extent than thermodynamic
entropy. Thus, their "existence" can no longer be based on data: the
measurements are then knowledge-based. With hindsight, this reflection
has epistemological implications for other scientific discourses
(such as physics) as well. I have noted that some physicists nowadays
write "natural laws" between quotation marks.
 
With kind regards,
 
 
Loet
  _____

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20-525 6598; fax: +31-20-525 3681
loet@leydesdorff.net; http://www.leydesdorff.net/

 
The Challenge of Scientometrics: http://www.upublish.com/books/leydesdorff-sci.htm
The Self-Organization of the Knowledge-Based Society: http://www.upublish.com/books/leydesdorff.htm

> -----Original Message-----
> From: fis-bounces@listas.unizar.es
> [mailto:fis-bounces@listas.unizar.es] On Behalf Of Michel Petitjean
> Sent: Monday, April 19, 2004 12:58 PM
> To: fis@listas.unizar.es
> Subject: [Fis] Data, observations and distributions / was:
> Re: Probabilistic Entropy
>
>
> To: <fis@listas.unizar.es>
> Subj: Data, observations and distributions / was: Re:
> Probabilistic Entropy
>
> Dear Loet,
>
> I see what you mean when you write:
> > The data are
> > measurement results which can be considered as probability
> > distributions. Thus, the expected information content of these
> > distributions and the meaning which these are given in (highly
> > codified) discourses provide the basis of science.
>
> But I do not agree. Would any probabilist confuse the
> empirical distribution defined from a sample with the sampled
> values themselves? (Are there probabilists among FISers?)
> Would you confuse a distribution with its observations for a
> continuous distribution (e.g. a Gaussian)? Surely not. The
> same holds for a Poisson distribution (infinite discrete).
> The confusion arises in the finite discrete case.
>
> In probability theory, the distribution associated with a
> random variable X exists regardless of whether observations
> are made or not. For a finite discrete r.v. taking
> equiprobable values, such that P(X=xi)=1/N, we still have no
> observations unless we perform the "experiment"; and even if
> we obtain a sample of size N of X, we are not guaranteed to
> observe each value xi exactly once (maybe x1 appears twice,
> maybe x2 is not observed, ...). The empirical distribution
> based on data is a special case, in which the N observed
> values (I assume the usual Euclidean case) are input into the
> definition of a probability law and its distribution, for
> which each value has probability 1/N. And even here, we
> cannot say that the N data "are" a sample of the empirical
> law, even if observing such a sample has a certain
> probability of occurring. The data are measured first; then
> the empirical distribution is built. The distributions exist,
> in general, outside any experiment (in the probabilistic
> sense). Once the experiment is done, we have observations
> (which together merit the name "sample" under some
> conditions).
>
> Now, returning to science, suppose we have N data xi (e.g.
> points in R^d): they are not observations of any probability
> law. Building a r.v. X with a distribution such that
> P(X=xi)=1/N is just a step in assigning a mathematical model
> to the physical phenomenon from which the N data were
> measured. And most of the time, the empirical distribution
> has little interest in itself: the modeller will check
> whether the N data, considered as if they were a sample of
> some parent population, let him learn something about the
> parent distribution (maybe Gaussian, maybe anything else) and
> its associated parameters. Information is attached to a
> distribution, with or without any probabilistic or physical
> experiment. Measurements are just physical data. The
> relations between data and distributions exist in the mind of
> the modeller. I would say that the probabilistic entropy H,
> when it exists, is just a parameter of a distribution, like
> the mean, the median or the extreme values.
>
> Michel Petitjean                 Email: petitjean@itodys.jussieu.fr
> Editor-in-Chief of Entropy              entropy@mdpi.org
> ITODYS (CNRS, UMR 7086)                 ptitjean@ccr.jussieu.fr
> 1 rue Guy de la Brosse           Phone: +33 (0)1 44 27 48 57
> 75005 Paris, France.             FAX:   +33 (0)1 44 27 68 14
> http://www.mdpi.net http://www.mdpi.org
> http://petitjeanmichel.free.fr/itoweb.petitjean.html
> http://petitjeanmichel.free.fr/itoweb.petitjean.freeware.html
> _______________________________________________
> fis mailing list
> fis@listas.unizar.es http://webmail.unizar.es/mailman/listinfo/fis
>
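
As a minimal sketch of Michel's finite discrete case (Python, standard
library only; the measurement values below are hypothetical), one can
check that a size-N sample of an equiprobable r.v. need not contain each
value exactly once, and that the entropy H of an empirical distribution
is computed like any other distribution parameter:

    # Sketch of the finite discrete case: P(X = xi) = 1/N.
    import math
    import random
    from collections import Counter
    from statistics import mean, median

    random.seed(1)

    # Draw a sample of size N from an equiprobable r.v. over N values.
    N = 10
    values = list(range(N))
    sample = [random.choice(values) for _ in range(N)]
    counts = Counter(sample)
    print("not observed:", [x for x in values if x not in counts])
    print("observed more than once:", [x for x, c in counts.items() if c > 1])

    # The empirical distribution is built only after the data are measured:
    # each measured value gets probability (multiplicity / N).
    data = [1.2, 3.4, 3.4, 0.7, 2.1]  # hypothetical measurement results
    emp = {x: c / len(data) for x, c in Counter(data).items()}

    # H is just a parameter of this distribution, like the mean or median.
    H = -sum(p * math.log2(p) for p in emp.values())
    print("mean:", mean(data), "median:", median(data), "H (bits):", round(H, 3))

Nothing in the computation of H distinguishes it from the other
parameters; the modelling step is the decision to treat the measured
data as a sample at all.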
Received on Mon Apr 19 15:08:11 2004

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET