We generalize Kolmogorov complexity and study the fastest way of
computing all computable objects, with consequences for inductive
inference, Occam's razor, and the big picture.

Algorithmic Theories of Everything
Juergen Schmidhuber                        http://www.idsia.ch/~juergen

The probability distribution P from which the history of our universe is
sampled represents a theory of everything or TOE. We assume P is formally
describable. Since most (uncountably many) distributions are not, this
imposes a strong inductive bias. We show that P(x) is small for any
universe x lacking a short description, and study the spectrum of TOEs
spanned by two Ps: P1 reflects the most compact constructive descriptions,
P2 the fastest way of computing all computable objects. P1 requires
generalizations of traditional computability, Solomonoff's algorithmic
probability, and Kolmogorov complexity; it leads to describable objects
more random than Chaitin's "number of wisdom" Omega. P2 derives from
Levin's universal search in program space and a natural resource-oriented
postulate: the cumulative prior probability of all x incomputable within
time t by this optimal algorithm should be 1/t. Between P1 and P2 we find
a universal cumulatively enumerable measure (CEM) that dominates traditional
enumerable measures; any CEM must assign low probability to any
universe lacking a short enumerating program. We derive P-specific
consequences for observers evolving in computable universes, inductive
reasoning, quantum physics, philosophy, and the expected duration of
our universe.
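The resource allocation behind Levin's universal search, on which P2 builds, can be sketched in a few lines: in phase i, every program p of length l(p) <= i is run for up to 2^(i - l(p)) steps, so the total time granted to p grows in proportion to 2^(-l(p)). The toy interpreter below (`run_program`, which just decodes a bit string as an integer n and "halts" after n+1 steps) is an illustrative assumption standing in for a real universal machine, not the paper's formal construction.

```python
# Illustrative sketch of Levin's universal search (toy machine, not the
# paper's formal setup). Phase i gives each program p of length l(p) <= i
# a budget of 2^(i - l(p)) steps, so short programs get most of the time.
from itertools import product

def run_program(bits, budget):
    """Toy 'machine': reads the bit tuple as a binary number n and
    produces n as output after n + 1 steps, if the budget allows."""
    n = int("".join(map(str, bits)), 2) if bits else 0
    steps_needed = n + 1
    return n if budget >= steps_needed else None  # None = out of time

def levin_search(is_solution, max_phase=12):
    """Return (program_bits, phase) for the first program whose output
    satisfies is_solution, interleaving time shares ~ 2^-l(p)."""
    for phase in range(1, max_phase + 1):
        for length in range(0, phase + 1):
            budget = 2 ** (phase - length)  # time share proportional to 2^-l(p)
            for bits in product((0, 1), repeat=length):
                out = run_program(bits, budget)
                if out is not None and is_solution(out):
                    return bits, phase
    return None

# Find a program computing the value 5: the 3-bit program "101" needs
# 6 steps, which fits its budget 2^(6-3) = 8 in phase 6.
result = levin_search(lambda out: out == 5)  # returns ((1, 0, 1), 6)
```

A program of length l(p) that halts in t steps is found no later than the phase where 2^(phase - l(p)) >= t, i.e. in total time on the order of t * 2^l(p) — the optimality property the resource-oriented postulate for P2 refers to.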
                                         10 theorems, 50 pages, 100 refs

TR IDSIA-20-00, Version 2.0, 20 Dec 2000; quant-ph/0011122
Various formats (html, ps, dvi, pdf, tex) can be found at
http://www.idsia.ch/~juergen/toesv2/

PS: we have job openings for two postdocs and one PhD student; please see
http://www.idsia.ch/jobs.html
