Wei Dai wrote:
>
> Thanks for clarifying the provability issue. I think I understand and
> agree with you.
>
> On Tue, Nov 13, 2001 at 12:05:22PM +0100, Juergen Schmidhuber wrote:
> > What about exploitation? Once you suspect you found the PRG you can use
> > it
> > to predict the future. Unfortunately the prediction will take enormous
> > time to stabilize, and you never can be sure it's finished.
> > So it's not very practical.
>
> By exploiting the fact that we're in an oracle universe I didn't mean
> using TMs to predict the oracle outputs. That is certainly impractical.
>
> There are a couple of things you could do though. One is to use some
> oracle outputs to predict other oracle outputs when the relationship
> between them is computable. The other, much more important, is to quickly
> solve arbitrarily hard computational problems using the oracles.
>
> > I prefer the additional resource assumptions reflected
> > by the Speed Prior.  They make the oracle universes very unlikely, and
> > yield computable predictions.
>
> Why do you prefer the Speed Prior? Under the Speed Prior, oracle universes
> are not just very unlikely, they have probability 0, right? Suppose one
> day we actually find an oracle for the halting problem, or even just find
> out that there is more computing power in our universe than is needed to
> explain our intelligence. Would you then (1) give up the Speed Prior and
> adopt a more dominant prior, or (2) would you say that you've encountered
> an extremely unlikely event (i.e. more likely you're hallucinating)?
>
> If you answer (1) then why not adopt the more dominant prior now?

You are right in the sense that under the Speed Prior any complete
_infinite_ oracle universe has probability 0. On the other hand,
any _finite_ beginning of an oracle universe has nonvanishing
probability. Why? The fastest algorithm for computing all universes
computes _all_ finite beginnings of all universes. Now oracle universes
occasionally require effectively random bits.  History beginnings that
require n such bits come out very slowly: even the fastest algorithm
needs more than 2^n steps to compute the n-th oracle bit.
Therefore, under the Speed Prior the probabilities of oracle-based
prefixes quickly converge towards zero with growing prefix size, but in
the finite case some tiny epsilon always remains.
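The decay can be sketched numerically. The snippet below is only an
illustration, not the actual FAST algorithm: it assumes a Levin-style
time discount in which a program of length l that needs t steps to
produce a prefix contributes on the order of 2^-l / t to that prefix's
probability. If the n-th oracle bit costs more than 2^n steps, the
weight of an n-bit oracle prefix shrinks geometrically with n, yet
stays strictly positive for every finite n.

```python
def speed_weight(program_length: int, steps: int) -> float:
    """Time-discounted weight of one program, Levin-style: 2^-l / t.
    Illustrative assumption, not the exact Speed Prior definition."""
    return 2.0 ** -program_length / steps


def oracle_prefix_weight(n: int, program_length: int = 10) -> float:
    """Weight of a history prefix whose n-th effectively random
    (oracle) bit requires more than 2^n computation steps."""
    return speed_weight(program_length, steps=2 ** n)


if __name__ == "__main__":
    for n in (1, 5, 10, 20):
        w = oracle_prefix_weight(n)
        print(f"n = {n:2d}: weight ~ {w:.3e}")
        # converges towards zero, but a tiny epsilon always remains
        assert w > 0.0
```

Running it shows the weights falling off like 2^-n while never reaching
zero, matching the point above: infinite oracle histories get measure
zero, finite beginnings keep a vanishingly small but positive one.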

Why not adopt a more dominant prior now? I just go for the simplest
explanation consistent with available data, where my simplicity measure
reflects what computer scientists find simple: things that are easy
to compute.


Juergen Schmidhuber

http://www.idsia.ch/~juergen/
http://www.idsia.ch/~juergen/everything/html.html
http://www.idsia.ch/~juergen/toesv2/
