On Wed, Dec 20, 2000 at 04:32:47PM +0100, [EMAIL PROTECTED] wrote:
> It's all in Section 6. Please read 6.1 to get the basic idea, read 6.2
> to understand why Levin Search and FAST are optimal. FAST computes the
> n-th bit of each universe x as quickly as the fastest algorithm for x
> (save for a constant factor). Your alternative does not.
I don't see how that can be true. If x is a random string, the fastest
algorithm is essentially "print x" and computes it in about l(x) steps,
whereas FAST needs on the order of 2^l(x) steps, so the "constant" factor
grows with x. If you have a more precise definition, please point me to
it, because I couldn't find one in any of those sections.
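For concreteness, here is a minimal Python sketch of the Levin/FAST
scheduling idea as I understand it from Section 6: in phase i, every
program p with l(p) <= i is run for 2^(i - l(p)) steps. The function and
parameter names are mine, not from the paper, and `run` stands in for
whatever simulator executes a program for a bounded number of steps.

```python
def fast_search(run, programs, max_phase=20):
    """Levin-search-style scheduling sketch.

    run(p, steps) -> the output of program p, or None if p has not
    halted within `steps` steps.
    programs: dict mapping each program p to its length l(p) in bits.
    Returns (p, output, phase) for the first program found to halt.
    """
    for i in range(max_phase):
        for p, length in programs.items():
            if length <= i:
                # Phase i gives p a time budget of 2^(i - l(p)) steps,
                # so shorter programs get exponentially more time.
                out = run(p, 2 ** (i - length))
                if out is not None:
                    return p, out, i
    return None
```

Under this schedule, a program of length l that halts in t steps is found
by roughly phase l + log2(t), after total work on the order of 2^l * t.
That 2^l overhead is exactly the "constant" factor at issue: for a random
x it is about 2^l(x).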
> Just like the "infinitely accelerating computer" discussed here earlier.
> But you might have noticed that the paper does not assume the existence
> of TMs more powerful than general TMs (quantum physics does not require
> those). Duplicating forever leads to uncountable sets, which we reject
> for lack of evidence they are necessary to explain the world - they make
> the explanation much more complicated than necessary.
Unlike the infinitely accelerating computer, the duplicating computer
cannot solve the halting problem. It's no more powerful than general TMs,
only faster (and only faster for problems that can be solved in parallel).
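To make the speedup claim concrete, here is a toy arithmetic sketch (my
own illustration, not a construction from the paper): if every machine
splits in two each generation, there are 2^g machines after g generations,
so a brute-force search over 2^n candidates, one candidate per machine,
finishes in only n generations of wall-clock time. The total number of
machine-steps is unchanged, which is why this buys speed but not the power
to decide the halting problem.

```python
def generations_needed(num_candidates):
    """Smallest g with 2**g >= num_candidates, i.e. the wall-clock
    generations a doubling population of machines needs to assign one
    machine to each candidate in a brute-force search."""
    g = 0
    while 2 ** g < num_candidates:
        g += 1
    return g
```

So searching 2^40 candidates takes 40 generations of wall-clock time,
while the total work performed across all machines is still 2^40 steps.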
If you don't like the duplicating computer, you still have to choose
between a classical TM and a quantum TM. The latter may be able to solve
some problems (for example, factoring) exponentially faster. Choosing a
classical TM would imply that building a practical quantum computer is
impossible in our universe, so the choice has real consequences, yet I
don't see how it can be made on a priori grounds.