Re: why not high complexity?

2001-05-30 Thread juergen


 ??? - There is no way of assigning equal nonvanishing probability
 to infinitely many mathematical structures, each being represented
 by a finite set of axioms.

 okay - strictly speaking, you are correct. but a common trick is
 to compute equal probabilities on finite subsets of the infinite
 set, and then take the limit as those subsets grow to the size of
 the infinite set.

 the growing here is important - very often the order in which you
 add members to the set changes how the series converges. but for
 the case of expected complexity, it does not.

but in the limit uniform probabilities vanish. Maybe you'd like to
write down formally what you mean. 
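
To make the obstacle explicit (a minimal sketch of the standard
argument, in LaTeX notation): if the $n$-th finite subset contains $n$
structures, each with equal probability $1/n$, then for any fixed
structure $x$

    \lim_{n \to \infty} P_n(x) = \lim_{n \to \infty} \frac{1}{n} = 0,

so the limit assigns every single structure probability zero, and the
limit values cannot sum to 1. Any nonvanishing measure on a countable
set must be non-uniform, e.g. $P(x_k) = 2^{-k}$ over an enumeration
$x_1, x_2, \ldots$ of the structures.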




Re: why not high complexity?

2001-05-30 Thread scerir

[EMAIL PROTECTED] wrote:

 Even if He completely ignores runtime, He still cannot assign high
 probability to irregular universes with long minimal descriptions.

Lee Smolin wrote about some Darwinian super-selection rule among
trees of universes. Do you think there is a possible connection?

He also wrote: "we will have to extend the Darwinian idea that
the structure of a system [universe] must be formed from within
by natural processes of self-organization - to the properties
of space and time themselves."

"The key step in this somewhat Darwinian argument is that the
parameters determining the shape of the new cosmos must resemble
those of its parent, rather than being picked at random.
Granted this, after a while most universes in the mega-cosmos
will cluster around certain apparently arbitrary values - the
kind that produce observers like us," wrote Damien Broderick.

- Scerir



Re: why not high complexity?

2001-05-30 Thread juergen



 From: Karl Stiefvater [EMAIL PROTECTED]
 Date: Mon, 28 May 2001 00:11:33 -0500
 
 Max Tegmark suggests that .. all mathematical structures are a
 priori given equal statistical weight and Jurgen Schmidhuber
 counters that there is no way of assigning nonvanishing
 probability to all (infinitely many) mathematical structures

Not quite - the text in Algorithmic Theories of Everything says
... of assigning nonvanishing _equal_ probability ...

 and he then goes on (i think) to assign a weighting based upon
 time-complexity.

Most of the weightings discussed in Algorithmic Theories of Everything
completely ignore time, except for one: the speed prior S derived from
the assumption that the universe-generating process is not only computable
but also optimally efficient.
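
For intuition, here is a sketch of the resource allocation behind such
a speed-based prior (my rough paraphrase of a Levin-search-style phase
scheme, not the paper's exact definition): phase $i$ grants $2^{i-|p|}$
instructions to every program prefix $p$ with $|p| \le i$. A program
$p$ that needs $t$ steps to output $x$ therefore first succeeds around
phase $i \approx |p| + \log_2 t$, contributing on the order of

    2^{-|p|} / t

to the measure of $x$. Slow programs are penalized multiplicatively by
their runtime, whereas the time-independent weightings charge only
$2^{-|p|}$.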

Concerning time-independent weightings: Different computing devices
(traditional Turing Machines, Enumerable Output Machines, General Turing
Machines) reflect different degrees of computability (traditional monotone
computability, enumerability, computability in the limit). This causes
various weightings. All favor short descriptions, given the device.
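
A toy illustration of the two weaker degrees (hypothetical Python, not
from the paper; the function names are mine):

    def enumerable(n_steps):
        """Enumerability: the true value (here 1.0) is approximated
        from below, monotonically, without ever being announced."""
        s = 0.0
        for k in range(1, n_steps + 1):
            s += 2.0 ** -k          # partial sums rise toward 1
        return s

    def limit_computable(n_steps):
        """Computability in the limit: the guesses converge (here to
        1/3) but may oscillate, and no step certifies convergence."""
        s = 0.0
        for k in range(1, n_steps + 1):
            s += (-1) ** (k + 1) * 2.0 ** -k   # alternating sum -> 1/3
        return s

    print(enumerable(30))        # approaches 1.0 from below
    print(limit_computable(30))  # oscillates toward 1/3 ~ 0.3333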

 i have to say i find Tegmark's argument more persuasive - i can't
 see why the great programmer should be worried about runtime.

Even if He completely ignores runtime, He still cannot assign high
probability to irregular universes with long minimal descriptions.
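
The counting argument behind this, sketched: under any prior that
favors short descriptions, say $P(x)$ of order $2^{-K(x)}$ where
$K(x)$ is the minimal description length, an irregular universe
history $x$ of size $n$ with $K(x) \ge n - c$ receives weight at most
about $2^{-(n-c)}$, simply because fewer than $2^{n-c}$ programs
shorter than $n - c$ exist to describe it. Long-minimal-description
universes are individually exponentially unlikely, whether or not
runtime is charged.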

 furthermore, i feel intuitively that the universes ought to have
 equal weight.

Some intuitively feel the sun revolves around the earth.

 such a sort of probability can be defined, of course, by taking
 the limit as finite subsets approach the full infinite set. as
 long as we get the same answer regardless of the order in which
 we grow the subset, the limit can be said to be defined.

??? - There is no way of assigning equal nonvanishing probability to
infinitely many mathematical structures, each being represented by a
finite set of axioms.

Maybe the intention is to assign measure 2^-n to all histories of size n.
That would imply our environment will dissolve into randomness right now,
because almost all continuations of its rather regular history so far
are random. But instead the universe keeps following the nonrandom
traditional laws of physics, thus offering evidence against this measure.
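
The incompressibility count behind this (a standard sketch): given
our regular history $h$ of length $n$, this measure makes all $2^k$
continuations of length $k$ equally likely. But fewer than $2^{k-c}$
strings of length $k$ have descriptions shorter than $k - c$, so all
but a fraction $2^{-c}$ of the continuations are essentially random;
a lawful continuation, which we in fact keep observing, would be an
event of vanishing probability under this measure.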

 the problem is - such a view predicts that we live in a universe
 of high Kolmogorov complexity - not low complexity.

 but i don't see why this is such a surprise - living in such a
 universe, we ought to see events occur which we cannot effectively
 predict. but that is exactly what we do see.

Practical unpredictability due to deterministic chaos, Heisenberg's
uncertainty principle, etc. is very different from true
unpredictability. For instance, despite chaos and the uncertainty
principle, my computer probably will not disappear within the next
hour. But in most possible futures it won't even see the next instant -
most are maximally random and unpredictable. Any irregular future,
however, must have small measure, given the rather harmless assumption
of formal describability or computability in the limit.
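
One can see the gap in a toy experiment (illustrative only; zlib
compression is a crude stand-in for minimal description length, and
all names below are mine):

    import os, zlib

    history = b"01" * 5000                  # a regular history so far

    lawful_future = b"01" * 500             # the pattern just continues
    random_future = os.urandom(1000)        # a typical continuation

    def compressed_size(s):
        # length after zlib: a rough upper bound on description length
        return len(zlib.compress(s, 9))

    print(compressed_size(history))                   # tiny
    print(compressed_size(history + lawful_future))   # barely grows
    print(compressed_size(history + random_future))   # ~1000 bytes more

    # Almost all possible 1000-byte continuations behave like the random
    # one: they add ~incompressible bits, so any measure favoring short
    # descriptions must give each such irregular future tiny weight.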

http://www.idsia.ch/~juergen/everything/html.html
http://www.idsia.ch/~juergen/toesv2/