At 16:39 06/05/04 -0400, Stephen Paul King wrote:
Dear Bruno and George,

    At the risk of being massively naive, does this idea seem to be related
to the infamous problem of Boltzmann's Stosszahlansatz?

http://www.lns.cornell.edu/spr/1999-02/msg0014388.html
http://philsci-archive.pitt.edu/archive/00001244/01/Winsberg_laws_and__statmech.doc

    My reasoning is that in order to figure out how to define a universal
prior (or probability measure for the initial conditions that led
inevitably to our common world of experience) we need to understand how to
define a ratio of worlds like ours to all possible worlds, or the
computational equivalent: the algorithms that generate worlds like ours as a
subset of the collection of all possible algorithms.
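As a side note, the "ratio of algorithms" idea can be made concrete with a toy sketch in the Solomonoff style: weight each program p by 2^-|p|, so shorter programs dominate, and take an output's prior mass to be the normalized sum of weights of the programs producing it. Everything below is illustrative only, not anyone's actual proposal: `toy_run` is a deliberately trivial stand-in for a universal machine, and the enumeration is over all bitstrings up to a cutoff rather than a prefix-free program set.

```python
from itertools import product

def toy_run(program):
    # Toy stand-in for a universal machine: a bitstring "generates world 1"
    # if it is constant (all 0s or all 1s), "world 0" otherwise.
    # Purely illustrative; a real universal machine would run the string as code.
    return 1 if len(set(program)) == 1 else 0

def toy_prior(output, max_len=12):
    # Approximate the prior of `output`: sum the weight 2^-|p| over every
    # program p (bitstring) of length 1..max_len whose run yields `output`,
    # then normalize by the total weight so the priors sum to 1.
    mass = 0.0
    total = 0.0
    for n in range(1, max_len + 1):
        for p in product((0, 1), repeat=n):
            w = 2.0 ** (-n)
            total += w
            if toy_run(p) == output:
                mass += w
    return mass / total

print(toy_prior(1), toy_prior(0))
```

The point of the sketch is only that such a weighted count is computable for a fixed cutoff, and that the length weighting makes the "simple" worlds (here, the constant strings) carry far more mass per program than their raw count would suggest.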


I do believe that worlds generated by an algorithm have a null measure (it is
a reason for not believing in "Classical mechanics" or any singular reality).
What I would call "worlds" are emergent psychological constructs linked to an
infinite set of running algorithms.
I would not use the expression "universal prior" in this context (unless
you really mean a Schmidhuber-like prior, but then I refer you to an older post,
where I show that if such a prior exists it should be derived from comp, not
imposed at the start).
Boltzmann's Stosszahlansatz? I don't know yet. A priori I would say that the
classical form of indeterminacy (like deterministic chaos) is based (with comp)
on algorithmic complexity. Quantum indeterminacy is based on
consistent self-multiplication. They are quite different forms of uncertainty.
And you know my (pedagogical) problem: to say more we should go through
that logical barrier just to interpret correctly what the machine (G) and its guardian
angel (G*) *can* tell us ...


Bruno



http://iridia.ulb.ac.be/~marchal/


