RE: SV: [Fis] Re: What is the definition of information ?

From: Julio Stern <[email protected]>
Date: Wed 31 Aug 2005 - 21:48:11 CEST

I think I CAN agree with this one:

>information is 'a difference that makes a difference'.
(e-mail from: Søren Brier)

In fact, using the notation below,
the amount of information in observation X,
relative to the prior p_0, is the prior-posterior
``distance'' D(p_0, p_n).

There are several choices for the ``distance'' D( , ).
A good one is the relative entropy
(or Kullback-Leibler divergence):
I(p_n, p_0) = \int_\Theta \log(p_n / p_0) \, p_n \, d\theta

OBS1: I(p_n, p_0) = 0 IF AND ONLY IF p_n = p_0 (almost everywhere)
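
As a concrete illustration (not from the original post: the Beta(1,1)
prior and the Beta(3,2) posterior below are made-up choices), a minimal
Python sketch that evaluates I(p_n, p_0) on a grid:

  import numpy as np
  from scipy.stats import beta

  # grid over the parameter space \Theta = (0, 1)
  theta = np.linspace(1e-6, 1 - 1e-6, 100_000)
  p0 = beta.pdf(theta, 1, 1)   # prior density p_0 (flat)
  pn = beta.pdf(theta, 3, 2)   # posterior density p_n

  # I(p_n, p_0) = \int log(p_n/p_0) p_n d\theta, via a Riemann sum;
  # it is zero exactly when p_n = p_0, and positive otherwise
  kl = np.sum(pn * np.log(pn / p0)) * (theta[1] - theta[0])
  print(kl)   # ~0.235 nats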

OBS2: One common critique of this point of view
is that this measure of information in the observation X
is RELATIVE to the prior, p_0.
As a good Bayesian, however,
I think that is exactly as it should be...

*****************************************

I work with Bayesian Statistics, where the main operation is of the form

  p_n(\theta) \propto p_{0}(\theta) L(\theta | X) ,

where:
- \theta is the (vector) parameter of the statistical model,
- p_0 and p_n are the prior and posterior distributions
  of the parameter \theta, and
- L(\theta | X) is the likelihood function of \theta,
  given the observed data X = [x_1, x_2, ..., x_n].
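
As a concrete numerical sketch of this update (my own illustration,
assuming Bernoulli 0/1 data with a conjugate Beta prior; none of these
modeling choices appear in the post), in Python:

  import numpy as np

  def posterior(a0, b0, X):
      # With a Beta(a0, b0) prior and a Bernoulli likelihood L(theta | X),
      # p_n \propto p_0 * L has the closed form
      # Beta(a0 + #successes, b0 + #failures).
      s = int(np.sum(X))
      return a0 + s, b0 + len(X) - s

  X = np.array([1, 0, 1, 1, 0])      # observed data X = [x_1, ..., x_n]
  a_n, b_n = posterior(1.0, 1.0, X)  # flat Beta(1,1) prior
  print(a_n, b_n)                    # posterior is Beta(4.0, 3.0)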

Now: it is possible that the observation X is such that
the uncertainty* of p_n is greater than that of p_0
(* entropy, variance, or whatever measure you prefer).
Still, the information in X is statistically relevant!
(A small numerical sketch of this point follows.)
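
A minimal sketch of that point (the confident Beta(0.5, 10) prior and the
single ``surprising'' success below are my illustrative assumptions, not
from the post): the posterior is MORE uncertain than the prior by the
variance criterion, yet I(p_n, p_0) > 0, so X is still informative.

  import numpy as np
  from scipy.stats import beta

  a0, b0 = 0.5, 10.0         # confident prior: success deemed very unlikely
  a_n, b_n = a0 + 1, b0      # posterior after observing a single success

  print(beta.var(a0, b0))    # prior variance     ~0.0039
  print(beta.var(a_n, b_n))  # posterior variance ~0.0091 -- it grew!

  theta = np.linspace(1e-6, 1 - 1e-6, 100_000)
  p0 = beta.pdf(theta, a0, b0)
  pn = beta.pdf(theta, a_n, b_n)
  kl = np.sum(pn * np.log(pn / p0)) * (theta[1] - theta[0])
  print(kl)                  # ~0.68 nats: strictly positive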

-- Julio Stern
