RE: [Fis] cell signaling: COMMUNICATION

From: Loet Leydesdorff <[email protected]>
Date: Tue 07 Oct 2003 - 09:23:23 CEST

> -----Original Message-----
> From: JLRChandler <JLRChand@pop.erols.com>
> Sent: Saturday, October 04, 2003 10:53 PM
> To: Loet Leydesdorff (by way of "Pedro C. Marijuán"
> <marijuan@posta.unizar.es>)
> Subject: RE: [Fis] cell signaling: COMMUNICATION
>
>
>
> Loet:
>
> You write:
>
> "The specification of the system of reference may then be orthogonal
> to the "natural" one as the above example illustrates. I can work
> this out for chemical systems, but I haven't done it for (molecular)
> biological ones."
>
> Please explicate what you mean. This could be a very novel
> contribution to theoretical chemistry.
> It would be helpful if you could illustrate your example with a
> specific chemical reaction so that your meaning is clear.
>
> Cheers
>
> Jerry LR Chandler
>

Dear Jerry and colleagues (cc to FIS),

I take the liberty of answering your question at the level of the list,
because it may also be of interest to others involved in this
discussion.

Whenever a distribution changes, a probabilistic entropy is generated,
and a chemical reaction changes the distribution of atoms among
molecules. For example, in the reaction:

    NaOH + HCl → NaCl + H2O

Na first co-occurs with oxygen and hydrogen, but on the right-hand side
of the equation it co-occurs with Cl. Thus, the distributions at the
atomic level are changed. In this case, four elements are involved: Na,
O, H, and Cl. A probabilistic entropy is thus generated in these four
dimensions and in terms of their interactions.

The probability distribution of each of the four elements provides us
with the margin totals of a four-dimensional array of probability
distributions. For example, the probability of Na and Cl co-occurring in
the two dimensions of Na and Cl can be formulated as follows:

        p(NaCl) = [NaCl]/([NaOH] + [NaCl]) + [NaCl]/([HCl] + [NaCl])
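This co-occurrence probability can be sketched in a few lines of Python
(not part of the original message; the concentrations below are
arbitrary, hypothetical values chosen only for illustration):

```python
def p_cooccur(nacl, naoh, hcl):
    """Co-occurrence probability of Na and Cl, following the formula
    above: the fraction of Na bound in NaCl plus the fraction of Cl
    bound in NaCl. Arguments are (hypothetical) concentrations."""
    return nacl / (naoh + nacl) + nacl / (hcl + nacl)

# Example: equal amounts of all species (arbitrary numbers).
print(p_cooccur(nacl=0.5, naoh=0.5, hcl=0.5))  # 1.0
```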

One can input these probabilities into the Shannon formulas and thus
compute the expected information content of the distributions and the
transmissions among the four dimensions. This provides statistics very
different from those of normal chemistry, because it is an
information-theoretical representation of the chemistry under study.
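A minimal sketch of such a computation in Python (again not part of the
original message; the 2x2 joint distribution is an invented example, not
chemical data): the transmission between two dimensions is H(x) + H(y) -
H(x,y), which Shannon's formulas guarantee to be non-negative.

```python
import math

def H(probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical 2x2 joint distribution over two dimensions
# (e.g. presence of Na vs. presence of Cl in a molecule).
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]           # margin totals, dimension x
py = [sum(col) for col in zip(*joint)]     # margin totals, dimension y
Hxy = H([p for row in joint for p in row]) # joint entropy

# Transmission (mutual information) between the two dimensions.
T = H(px) + H(py) - Hxy
print(round(T, 4))  # 0.2781
```

In two dimensions T is always >= 0; the sign question only arises for
three or more dimensions, as discussed below.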

I would be interested if you could further illuminate me (and us) on the
relevance of these ideas for (theoretical) chemistry, because that goes
beyond my knowledge. Particularly interesting may be the transmission in
more than two dimensions, because this indicator can sometimes be
negative. In that case a negative entropy is generated at the next-order
network level. I have elaborated on this possibility for sociologically
relevant distributions in my paper "The Mutual Information of
University-Industry-Government Relations: An Indicator of the Triple
Helix Dynamics," Scientometrics 58(2) (2003), 445-467
<http://www.leydesdorff.net/th4>; pdf-version:
<http://www.leydesdorff.net/th4/T(uig).pdf>.
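That the three-dimensional transmission can indeed be negative is easy
to verify with a small Python sketch (my addition, not from the original
message). The standard textbook example is the XOR distribution, where
any two dimensions are independent but the three together are fully
determined:

```python
import math
from itertools import product

def H(probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# XOR distribution: x and y are fair independent bits, z = x XOR y.
# Each of the four possible (x, y) pairs has probability 1/4.
pxyz = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

def marginal(dims):
    """Marginal distribution over a subset of the three dimensions."""
    m = {}
    for key, p in pxyz.items():
        sub = tuple(key[d] for d in dims)
        m[sub] = m.get(sub, 0) + p
    return list(m.values())

Hx, Hy, Hz = (H(marginal([d])) for d in range(3))
Hxy, Hxz, Hyz = H(marginal([0, 1])), H(marginal([0, 2])), H(marginal([1, 2]))
Hxyz = H(list(pxyz.values()))

# Three-dimensional transmission; unlike the two-dimensional case,
# this quantity can be negative, as it is here.
T3 = Hx + Hy + Hz - Hxy - Hxz - Hyz + Hxyz
print(T3)  # -1.0
```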

With kind regards,

Loet

  _____

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
loet@leydesdorff.net ; http://www.leydesdorff.net/

 
The Challenge of Scientometrics <http://www.upublish.com/books/leydesdorff-sci.htm> ;
The Self-Organization of the Knowledge-Based Society <http://www.upublish.com/books/leydesdorff.htm>

 
Received on Tue Oct 7 09:25:41 2003
