On 28 Aug 2012, at 21:57, benjayk wrote:

It seems that the Church-Turing thesis, which states that a universal Turing machine can compute everything that is intuitively computable, has near-universal acceptance among computer scientists.

Yes indeed. I think there are two strong arguments for this.

The empirical one: all attempts to define the set of computable functions have led to the same class of functions, and this despite the quite independent paths leading to the definitions (from Church's lambda terms, Post production systems, von Neumann machines, billiard-ball computers, combinators, cellular automata ... up to modular functors, quantum topologies, quantum computers, etc.).

The conceptual one: the class of computable functions is closed under the most transcendental operation in math: diagonalization. This is not the case for the notions of definability, provability, cardinality, etc.
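
To see that closure concretely, here is the standard diagonal sketch, writing \varphi_n for the n-th partial computable function in some fixed effective enumeration:

    Define d by

        d(n) = \varphi_n(n) + 1.

    Then d is itself partial computable: a universal machine computes
    d(n) by running machine n on input n and adding 1 if that run halts.
    If d = \varphi_k for some k, there is no contradiction, because
    \varphi_k(k) can simply be undefined. Diagonalizing against any
    effective list of total functions, by contrast, always yields a
    function outside the list, which is why notions like provably-total
    or definable functions are not closed under diagonalization.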

I really wonder why this is so, given that there are simple cases where we can compute something that an arbitrary Turing machine cannot compute, using a notion of computation that is not extraordinary at all (and quite relevant in reality).
For example, suppose you have a universal Turing machine A that uses the alphabet {1,0} and a universal Turing machine B that uses the alphabet {-1,0,1}.
Now it is quite clear that machine A cannot directly answer any question that relates to -1. For example, it cannot directly compute -1*-1=1. Machine A can only take an encoded input value and an encoded description of machine B, and give an output that is correct given the right decoding scheme.
But for me this already makes clear that machine A is less computationally
powerful than machine B.
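A toy Python version of the encoding step in question, using one arbitrary (hypothetical) sign-bit scheme; the arbitrariness of the scheme is the point:

    # Machine A only ever sees {0,1} strings. We *choose* a sign-bit
    # scheme: a leading '1' means negative, a leading '0' means positive.
    def encode(n):                    # integer -> {0,1} string
        return ('1' if n < 0 else '0') + format(abs(n), 'b')

    def decode(s):                    # {0,1} string -> integer
        magnitude = int(s[1:], 2)
        return -magnitude if s[0] == '1' else magnitude

    def machine_A_multiply(x, y):     # stands in for A shuffling bits
        return encode(decode(x) * decode(y))

    print(machine_A_multiply(encode(-1), encode(-1)))   # prints '01'
    # The string '01' only *means* 1 under the decoding we fixed above.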

Church's thesis concerns only the class of computable functions. The alphabet used by the Turing machine, whether it has one, two, or enumerably many letters, does not change the class. If you dovetail on the work of the one-letter Turing machines, you will unavoidably emulate all Turing machines on all finite and enumerable alphabets. This can be proved. Nor does the number of tapes, or parallelism, change that class. Of course, some machines can be very inefficient, but this, by definition, does not concern Church's thesis.
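
For instance, a small Python sketch (the names and the block code are mine, chosen for illustration) of how a {0,1} machine can carry out any computation over a three-letter alphabet through a fixed-width block code:

    # Each letter of {-1,0,1} gets a fixed 2-bit code, so a {0,1} machine
    # can carry out any step of the three-letter machine.
    CODE = {-1: '00', 0: '01', 1: '10'}
    DECODE = {bits: letter for letter, bits in CODE.items()}

    def encode_tape(tape):            # e.g. [-1, 0, 1] -> '000110'
        return ''.join(CODE[letter] for letter in tape)

    def run_binary(step, bits):
        # Decode each 2-bit block, apply the three-letter step, re-encode.
        tape = [DECODE[bits[i:i+2]] for i in range(0, len(bits), 2)]
        return encode_tape([step(letter) for letter in tape])

    negate = lambda letter: -letter   # a sample three-letter operation
    print(run_binary(negate, encode_tape([-1, 0, 1])))   # '100100'

The composite encode-run-decode computes the same function as the three-letter machine; the difference is only coding overhead, which is a matter of efficiency, not computability.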

There was a thesis, often attributed to Cook (but I met him and he claims it is not his thesis), that all Turing machines can emulate each other in polynomial time. This will plausibly be refuted by the existence of quantum computers (unless P = NP, or something like that). It is an open problem, but most scientists believe that in general a classical computer cannot emulate an arbitrary quantum computer in polynomial time. But I insist: quantum computers have not violated the Church-Turing-Post-Markov thesis.

Its input and output when emulating B only make sense with respect to what machine B does if we already know what machine B does, and if we know how we chose to reflect this in the input of machine A (and in the interpretation of its output). Otherwise we have no way of even saying whether it emulates something, or whether it is just doing a particular computation on the alphabet {1,0}.
I realize that it all comes down to the notion of computation. But why do most choose to use such a weak notion of computation? By what reasonable standard does machine B not compute something that A doesn't?
Saying that A can compute what B computes is like saying that "orange" can express the same thing as the word "apple", because we can encode the word "apple" as "orange". It is true in a very limited sense, but it seems mad to treat this as the foundation of what it means for words to express something (and the same goes for computation).
If we use such trivial notions of computation, why not say that the program "return input" emulates all Turing machines, because given the right input it gives the right output (we just give it the solution as input)?
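Made explicit in Python (a caricature, with names I have made up for illustration), next to a genuinely universal scheme where one fixed interpreter does the work:

    # The "return input" machine: given the right input it gives the
    # right output, because the solution was supplied as the input.
    def return_input(x):
        return x

    def fake_emulate(f, x):
        return return_input(f(x))     # all the work happens before the call

    # Contrast: one fixed interpreter that does the work itself for any
    # program handed to it as data.
    def universal(program, x):
        return eval(program)(x)       # program supplied as source text

    print(fake_emulate(lambda n: n * n, 7))   # 49, computed by f, not by us
    print(universal("lambda n: n * n", 7))    # 49, computed by the interpreter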
I get that we can simply use the Church-Turing thesis as the definition of what computation means. But why is it (mostly) treated as the one and only correct notion of computation (especially in a computer science context)? The only explanation I have is that it is dogma. To question it would change too much and would be too "complicated" and uncomfortable. It would make computation an irreducibly complex and relative notion or, heaven forbid, even an inherently subjective notion (computation from which perspective?).

That was what everybody believed before the rise of the universal machine and the lambda calculus. Gödel called the closure of the computable functions under diagonalization a "miracle", and he took time before accepting it. See:

Davis, M. (1982). Why Gödel Didn't Have Church's Thesis. Information and Control, 54, 3-24.

