On Tue, Aug 28, 2012 at 2:57 PM, benjayk <benjamin.jaku...@googlemail.com> wrote:

>
> It seems that the Church-Turing thesis, which states that a universal
> Turing machine can compute everything that is intuitively computable, has
> near-universal acceptance among computer scientists.
>
> I really wonder why this is so, given that there are simple cases where we
> can compute something that an arbitrary Turing machine cannot compute,
> using a notion of computation that is not extraordinary at all (and quite
> relevant in reality).
> For example, say you have a universal Turing machine A that uses the
> alphabet {1,0} and a universal Turing machine B that uses the alphabet
> {-1,0,1}.
> Now it is quite clear that machine A cannot directly answer any question
> that relates to -1. For example, it cannot directly compute -1*-1=1.
> Machine A can only take an encoded input value and an encoded description
> of machine B, and give an output that is correct given the right decoding
> scheme.
>

1's or 0's, X's or O's: what the symbols are has no bearing on what the
machines can compute.

Consider: no physical computer today uses 1's or 0's; they use voltages,
collections of more or fewer electrons.

This doesn't mean that our computers can only directly compute what
electrons do.
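
To make that concrete, here is a toy sketch in Python (the two-bit encoding
is my own, made up purely for illustration): the multiplication table for
{-1,0,1} is stored as strings over {1,0}, the machinery never touches a -1
at run time, and yet it answers -1*-1 correctly once we decode.

    # Made-up encoding of the alphabet {-1, 0, 1} into two bits each.
    ENCODE = {-1: '10', 0: '00', 1: '01'}
    DECODE = {bits: n for n, bits in ENCODE.items()}

    # Multiplication table written purely over the binary codes; the
    # symbol -1 never appears in the machinery below.
    MUL = {('10', '10'): '01', ('10', '00'): '00', ('10', '01'): '10',
           ('00', '10'): '00', ('00', '00'): '00', ('00', '01'): '00',
           ('01', '10'): '10', ('01', '00'): '00', ('01', '01'): '01'}

    def binary_multiply(a_bits, b_bits):
        # Only ever sees and returns strings over {1, 0}.
        return MUL[(a_bits, b_bits)]

    # -1 * -1 = 1, computed by a machine that only shuffles 0s and 1s:
    print(DECODE[binary_multiply(ENCODE[-1], ENCODE[-1])])   # prints 1

Whether the answer "means" -1*-1=1 depends on the decoding scheme we chose,
but the same is true of the voltages inside any physical computer.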

> But for me this already makes clear that machine A is less computationally
> powerful than machine B. Its input and output when emulating B only make
> sense with respect to what machine B does if we already know what machine
> B does, and if we know how we chose to reflect this in the input of
> machine A (and in the interpretation of its output). Otherwise we have no
> way of even saying whether it emulates something, or whether it is just
> doing a particular computation on the alphabet {1,0}.
>

These are rather convincing:
http://en.wikipedia.org/wiki/Video_game_console_emulator

There is software that emulates the unique architectures of Atari,
Nintendo, Super Nintendo, PlayStation, etc. systems.  These emulators can
also run on any computer, whether it's Intel x86, x86_64, PowerPC, etc.  You
will have a convincing experience of playing an old Atari game like Space
Invaders, even though the original creators of that program never intended
it to run on a computer architecture that wouldn't be invented for another
30 years, and the original programmers didn't have to be called in to
rewrite their program to do so.
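
If it helps to see the trick stripped to its bones, here is a minimal
sketch (the two-register guest machine is entirely made up for
illustration): a Python program stepping through the instruction set of a
"guest" machine whose architecture the host shares nothing with.

    # A made-up guest machine with three instructions: INC, DEC, and
    # JNZ (jump to a target if a register is non-zero), plus HALT.
    def run(program, regs):
        pc = 0  # the guest machine's program counter
        while program[pc][0] != 'HALT':
            op, *args = program[pc]
            if op == 'INC':
                regs[args[0]] += 1
                pc += 1
            elif op == 'DEC':
                regs[args[0]] -= 1
                pc += 1
            elif op == 'JNZ':
                pc = args[1] if regs[args[0]] else pc + 1
        return regs

    # Guest program: add r1 into r0 (assumes r1 >= 0).
    prog = [
        ('JNZ', 'r1', 2),   # 0: if r1 != 0, enter the loop
        ('HALT',),          # 1: nothing to do
        ('DEC', 'r1'),      # 2: r1 -= 1
        ('INC', 'r0'),      # 3: r0 += 1
        ('JNZ', 'r1', 2),   # 4: more to do? loop again
        ('HALT',),          # 5: finished
    ]
    print(run(prog, {'r0': 3, 'r1': 4}))   # {'r0': 7, 'r1': 0}

Python knows nothing about this instruction set, and the guest program
knows nothing about Python, yet the emulation is exact.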


> I realize that it all comes down to the notion of computation. But why do
> most choose to use such a weak notion of computation? How does machine B
> not compute something that A doesn't, by any reasonable standard?
> Saying that A can compute what B computes is like saying that "orange" can
> express the same as the word "apple", because we can encode the word
> "apple" as "orange".


System A (using its own language of representation) can predict exactly
all future states of another system B (and vice versa).  A and B have
different symbols, states, instructions, etc., so perhaps this is why you
think system A can't perfectly emulate system B, but this is a little like
saying there are things that only Spanish speakers can describe that no
other language (French, English, etc.) could describe.  Sure, a translation
needs to occur to communicate a Spanish idea into an English one, but just
because Spanish and English speakers use different languages doesn't mean
there are problems only speakers of one language can solve.


> It is true in a very limited sense, but it seems mad to treat it as the
> foundation of what it means for words to express something (and the same
> goes for computation).
> If we use such trivial notions of computation, why not say that the
> program "return input" emulates all Turing machines, because given the
> right input it gives the right output (we just give it the solution as
> input)?
>

Many programs have no input and/or no output, but they can still
rightfully be said to perform different computations.
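
A throwaway sketch of the contrast (the function names are mine, purely
illustrative): the first two functions take no input and produce no output,
yet plainly perform different computations; the third is your "return
input" program, which does no computational work at all.

    def program_a():
        total = 0
        for i in range(1, 1001):
            total += i          # computes 500500, invisibly

    def program_b():
        product = 1
        for i in range(1, 21):
            product *= i        # computes 20!, invisibly

    def fake_emulator(solution):
        return solution         # "emulates" B only if you already
                                # solved B's problem yourself

The difference between a real emulator and fake_emulator is exactly that
the emulator produces the answer from the program description, rather than
being handed it.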


> I get that we can simply use the Church-Turing thesis as the definition of
> what computation means. But why is it (mostly) treated as being the one
> and only correct notion of computation (especially in a computer science
> context)?


I think it comes into play more in the definition of a universal machine
than in the definition of computation.

It is useful because it makes universality easy to prove.  All you need to
do is show how some machine can be used to emulate another machine already
known to be Turing universal.
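
For instance, the standard way to argue that a language is Turing complete
is to write a general Turing machine simulator in it.  A bare-bones sketch
(the rule format is my own toy convention):

    # Rules map (state, symbol) -> (new state, symbol to write, head
    # move 'L' or 'R').  The tape is sparse; unvisited cells are blank.
    def run_tm(rules, tape, state='start', halt='halt', blank='_'):
        cells = dict(enumerate(tape))
        head = 0
        while state != halt:
            symbol = cells.get(head, blank)
            state, write, move = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == 'R' else -1
        return ''.join(cells[i] for i in sorted(cells))

    # Example machine: invert a binary string, halt at the first blank.
    rules = {
        ('start', '0'): ('start', '1', 'R'),
        ('start', '1'): ('start', '0', 'R'),
        ('start', '_'): ('halt',  '_', 'R'),
    }
    print(run_tm(rules, '10110'))   # prints 01001_

Once you have this, any Turing machine at all can be handed to run_tm as a
rule table, which is the whole content of the universality claim.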


> The only explanation I have is that it is dogma. To question it would
> change too much and would be too "complicated" and uncomfortable. It would
> make computation an irreducibly complex and relative notion or - heaven
> forbid - even an inherently subjective notion (computation from which
> perspective?).


Reverse engineering machine language code is very difficult, but there are
automated programs for doing this that can produce much more readable
program code.  Code, too, can be difficult to grasp (see
http://www.ioccc.org/ for example), but in other cases code is easy to
understand.  I often prefer a snippet of example code to a notation-heavy
mathematical formula.
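
For instance, these two definitions (both written by me, just to make the
point) perform the same computation, and only one of them is pleasant to
read:

    # IOCCC-flavored one-liner:
    f = lambda n: n < 2 or n * f(n - 1)   # works, but why?

    # The same computation, written to be read:
    def factorial(n):
        result = 1
        for k in range(2, n + 1):
            result *= k
        return result

    assert f(10) == factorial(10) == 3628800

Readability is a fact about us, not about what the code computes.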

Jason
