[Fis] the religion of probability

From: Karl Javorszky <[email protected]>
Date: Tue 20 Apr 2004 - 16:14:50 CEST

Michel,

thank you for the invitation. I hope Pedro will not censor this too
heavily, as I already gave my suggestions on the most probable state of a
set less than a week ago.

I look at a homeostatic system. It is neither dead (not changing its
parameters at all) nor mechanical (each and every step predictable). It
has a momentary state (a temporal cross-section describing it) and a
sequence of changes (a temporally longitudinal description) running until
it reaches again more or less the same state it started from.
Two questions arise:
a) how does the momentary state correspond to the sequence of steps until
the cycle or period is closed again;
b) how do states t1 (at the beginning of our description) and t2 (at the
end) match relative to each other?
ad b) In this phase of the FIS discussion I understand the main interest
to lie in the identity between two states of a system - how far they are
alike, resemble each other, or are identical. Take the two extreme states
of breathing (in and out; lungs filled or empty). Point b) asks whether
someone who has breathed out again is in the same state as at the beginning
of the cycle. In this approach we disregard that he has become older, wiser,
more tired and hungry - or refreshed and satisfied, if he happens to sleep
or to eat during our observation.
The terms of reference need to be agreed on so that we can compare the
continuity, similarity, homogeneity or identity of two states. Both states
deviate in minor respects from the idealised, average ("entropic") state of
being completely full or empty of air. Each individual moment deviates in
some respects from the idealised state, just as any living family deviates
from the average, statistical family (which e.g. has 1.83 children, earns
1216.5 Euros a month and lives in a flat of 66.3 m2). No one fulfills these
expected values to the dot.

ad a) The concept of an idealised (most probable, entropic) state reappears
in the temporal journey between the beginning and the end of a period in
the life of a system, too. By analogy, one would suppose the existence of
an ideally filled lung, of the ideal moment between breathing in and
breathing out, when the organism is ideally supplied. One sees that this
concept is a-biologic. Biology is the story of changes that lead back to an
identical state.
Now, looking at a set whose elements undergo changes that cause the state
of the set to deviate from and then re-approach the collection of
measurement values we started out with, one idealises, simplifies and
formalises until one feels solid ground by landing on the natural numbers.
The most common-sense, neutral and non-controversial set is the set N. If
we cannot agree on 1, 2, 3, ..., then we cannot agree on anything.
So we discuss natural numbers and their properties as descriptions of the
fragmentational state of a set. We use additions: an addition is a
description of the fragmentational state of a set.
This is a very basic approach. We break things up and look into how many
pieces the thing can be broken and how alike the pieces are.
We observe that the number of pieces a set most usually falls into (the
number of summands in an addition) and the size of the set (the result of
the addition) are of course interdependent (the bigger something is, the
more pieces it can be broken into), but the relation is by far not as
linear as one would have thought. Following this chain of thought one
arrives at "types" or "archetypes" of logical constants, out of which one
can easily build highly complex systems.
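The interdependence between the size of a set and the number of pieces it
breaks into can be made concrete with integer partitions: an addition
n = a1 + a2 + ... is exactly one "fragmentational state" of a set of n
elements. A minimal sketch in Python (the function names are mine, purely
illustrative, not taken from any of the articles mentioned below):

```python
from collections import Counter

def partitions(n, max_part=None):
    """Yield every partition of n as a non-increasing tuple of summands."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def parts_distribution(n):
    """Count how many partitions of n use exactly k summands."""
    return Counter(len(p) for p in partitions(n))

# The growth of the partition count with n is famously non-linear:
for n in (5, 10, 15):
    print(n, sum(parts_distribution(n).values()))
# prints:
# 5 7
# 10 42
# 15 176
```

For n = 10, for instance, the distribution peaks at partitions with 4
summands (there are 9 of them), illustrating that the "most usual" number
of pieces grows far more slowly than the size of the set itself.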

Why is this number-theoretical excursion relevant? Because the
fragmentational state of a set is a temporal cross-section: it happens in
the moment. This we can compare to the sequence of steps it takes to arrive
back at the state we started out with. Then we understand the concepts of
"consistency", "concludent", "resulting", "dependent", "predictable",
"probable" and "certain".
You see that we use terminology of probability theory to describe basic
characteristics of a scientific argument or thesis: consistent, concludent,
resulting, dependent, predictable, probable and certain.

Our thinking classifies our thoughts in probabilistic terms: "this is a very
unlikely event" is a polite way of saying "nonsense". "This is against all
experience" puts all its faith in the consistency (predictability) of a
world outside and an observer inside, which together generate and collect
matches between expectations and realisations (I know what happens if I do
this). Normally, we expect our expectations to come true. We call something
chaos if we have no expectation and cannot predict what happens next. We
call something boring, monotonous, anancastic or stereotypical if it is too
predictable, if we know at the onset what shall happen and when.

Entropy is fascinating because it has so many forms. We ourselves are so
much embedded in rhythmic, periodic patterns that we overlook the periodic,
cyclic nature of the system oscillating around or playing with entropy. I
believe that the key to understanding entropy lies between the aspects "how
is it now" (contemporal description) and "when shall it be like this again"
(sequential description).

I do like to discourse (pontificate) at great length and in detail about
natural numbers and their relations among each other. They are a
surprisingly vivid lot. If you want to know more about how sets behave if
(while) they are linearised, do visit the homepage www.enumeration.net - it
is in shambles, but one can download 3 articles on the subject.

Thanks for your interest.
Karl
-----Original Message-----
From: fis-bounces@listas.unizar.es [mailto:fis-bounces@listas.unizar.es] On
Behalf Of Michel Petitjean
Sent: Tuesday, 20 April 2004 13:27
To: fis@listas.unizar.es
Subject: [Fis] Re: Data, observations and distributions


Dear Karl,

"Karl Javorszky" <javorszky@eunet.at> wrote:
> Yes, I believe I may count under your term "probabilist".
> Yes, arriba probability!
> Karl

Great! Your opinion could enlighten our discussions. What do you think?

Michel Petitjean Email: petitjean@itodys.jussieu.fr
Editor-in-Chief of Entropy entropy@mdpi.org
ITODYS (CNRS, UMR 7086) ptitjean@ccr.jussieu.fr
1 rue Guy de la Brosse Phone: +33 (0)1 44 27 48 57
75005 Paris, France. FAX : +33 (0)1 44 27 68 14
http://www.mdpi.net http://www.mdpi.org
http://petitjeanmichel.free.fr/itoweb.petitjean.html
http://petitjeanmichel.free.fr/itoweb.petitjean.freeware.html
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis

Received on Tue Apr 20 16:24:57 2004

This archive was generated by hypermail 2.1.8 : Mon 07 Mar 2005 - 10:24:46 CET