Re: Why the Church-Turing thesis?

2012-09-12 Thread benjayk


Quentin Anciaux-2 wrote:
 
 What's your point? How would the simulated hardware fail? It's
 impossible, so until you're clearer (your point is totally fuzzy), I
 stick to "you must be wrong."
 
 So either you assume some kind of oracle device, in which case the
 thing you describe is no longer a program but a program plus an oracle
 (and the oracle is obviously not simulable on a turing machine), or an
 infinite regress of levels.
 
 
The simulated hardware can't fail in the model, just like a turing machine
can't fail. Of course in reality it can fail, but that is beside the point.

You are right, my explanation is not that clear, because it is quite a
subtle thing.

Maybe I shouldn't have used the word "hardware". The point is just that we
can define (meta-)programs that have access to some aspects of the programs
below them in the program hierarchy (normal programs), aspects that can't
be accessed by those programs themselves. They can't be emulated in
general, because sometimes the emulating program will necessarily emulate
the wrong level (it can't correctly emulate that the meta-program is
accessing what it is *actually* doing at the most fundamental level).
They still are programs in the most fundamental sense.

They don't require oracles or anything else that is hard to actually use;
they just have to know something about the hierarchy and the programs
involved (which programs, or kinds of programs, are above or below them)
and have access to the states of other programs. Both are perfectly
implementable on a normal computer. They can even be implemented on a
turing machine, though not in general. They can also be simulated on
turing machines, just not necessarily correctly (the turing machine may
incorrectly ignore which level it is operating on relative to the
meta-program).

We can still argue that these aren't programs in every sense, but I think
what is executable on a normal computer can rightfully be called a program.
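The two-level hierarchy described above can be sketched as a toy model. This is a minimal hypothetical illustration (all names are invented for the sketch), not a proof of anything about Turing machines:

```python
# Toy two-level program hierarchy (hypothetical sketch).
# The meta-program interprets an ordinary program step by step and can
# inspect that program's state; the ordinary program sees only its own
# state and cannot tell how many layers of interpretation sit above it.

def ordinary_program(state):
    """A normal program: it transforms its own state, nothing more."""
    return {"counter": state["counter"] + 1}

def meta_program(program, state, steps=3):
    """Runs `program` step by step while observing its full state -
    access that the lower-level program does not have to itself."""
    observations = []
    for step in range(steps):
        state = program(state)
        observations.append((step, dict(state)))
    return observations

trace = meta_program(ordinary_program, {"counter": 0})
print(trace)  # [(0, {'counter': 1}), (1, {'counter': 2}), (2, {'counter': 3})]
```

The sketch shows the asymmetry being argued over: `meta_program` can read every intermediate state of `ordinary_program`, while `ordinary_program` has no handle on the layer running it.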

benjayk
-- 
View this message in context: 
http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34423089.html
Sent from the Everything List mailing list archive at Nabble.com.

-- 
You received this message 

Re: Why the Church-Turing thesis?

2012-09-12 Thread Quentin Anciaux
2012/9/12 benjayk benjamin.jaku...@googlemail.com



 The simulated hardware can't fail in the model, just like a turing
 machine can't fail. Of course in reality it can fail, but that is
 beside the point.
 
 Maybe I shouldn't have used the word "hardware". The point is just
 that we can define (meta-)programs that have access to some aspects of
 the programs below them in the program hierarchy, aspects that can't
 be accessed by those programs themselves. They can't be emulated in
 general, because sometimes the emulating program will necessarily
 emulate the wrong level. They still are programs in the most
 fundamental sense.
 
 They don't require oracles; they just have to know something about the
 hierarchy and the programs involved and have access to the states of
 other programs. Both are perfectly implementable on a normal computer.
 They can even be implemented on a turing machine, but not in general.


I still don't see why. What you describe is wishful thinking, or you're
wrong, or you can't explain it correctly; what I understand from what you
write is that you 


Re: Why the Church-Turing thesis?

2012-09-11 Thread benjayk


Quentin Anciaux-2 wrote:
 
 2012/9/10 benjayk benjamin.jaku...@googlemail.com
 
     No program can determine its hardware.  This is a consequence of
     the Church-Turing thesis.  The particular machine at the lowest
     level has no bearing (from the program's perspective).
 
    If that is true, we can show that CT must be false, because we
    *can* define a meta-program that has access to (part of) its own
    hardware (which still is intuitively computable - we can even
    implement it on a computer).
 
   It's false, the program *can't* know that the hardware it has access
   to is the *real* hardware and not a simulated hardware. The program
   has only access to hardware through IO, and it can't tell (not ever)
   from that interface whether what's outside is the *real* outside or
   a simulated outside.
 
  Yes, that is true. If anything it is true because the hardware is not
  even clearly determined at the base level (quantum uncertainty).
  I should have expressed myself more accurately and written "hardware"
  or relative 'hardware'. We can define (meta-)programs that have
  access to their "hardware" in the sense of knowing what they are
  running on relative to some notion of hardware. They cannot be
  emulated using universal turing machines.
 
 Then it's not a program if it can't run on a universal turing machine.
 
The funny thing is, it *can* run on a universal turing machine. Just that it
may lose relative correctness if we do that. We can still use a turing
machine to run it and interpret what the result means.

So for all intents and purposes it is quite like a program. Maybe not a
program as such, OK, but it certainly can be used precisely in a
step-by-step manner, and I think this is what the CT thesis means by
"algorithmically computable".
Maybe not, but in that case CT is just a statement about specific forms of
algorithms.

-- 
View this message in context: 
http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34417440.html
Sent from the Everything List mailing list archive at Nabble.com.

-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Why the Church-Turing thesis?

2012-09-11 Thread Quentin Anciaux
2012/9/11 benjayk benjamin.jaku...@googlemail.com



 Quentin Anciaux-2 wrote:
 
  Then it's not a program if it can't run on a universal turing
  machine.
 
 The funny thing is, it *can* run on a universal turing machine. Just
 that it may lose relative correctness if we do that.


Then you must be wrong... I don't understand your point. If it's a program
it has access to the outside through IO, hence it is impossible for a
program to differentiate the real outside from a simulated outside... It's
a simple fact, so either you're wrong or what you're describing is not a
program, not an algorithm, and not a computation.
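The IO argument here can be illustrated with a minimal sketch (a toy model with hypothetical names): a program whose only view of the outside is an IO interface gets identical answers whether that interface is backed by "real" hardware or by a faithful simulation of it.

```python
# Toy illustration: the program only sees the IO interface, so a faithful
# simulation of that interface is indistinguishable from the real thing.

class RealOutside:
    def read_sensor(self):
        return 42  # stands in for a value read from actual hardware

class SimulatedOutside:
    def read_sensor(self):
        return 42  # an emulator reproducing the same interface behaviour

def program(io):
    # Every fact the program can gather about its hardware arrives here.
    return f"sensor says {io.read_sensor()}"

print(program(RealOutside()))       # sensor says 42
print(program(SimulatedOutside()))  # sensor says 42
```

By construction the two runs are identical; no test the program performs through `io` can separate them.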

Quentin






-- 
All those moments will be lost in time, like tears in rain.




Re: Why the Church-Turing thesis?

2012-09-11 Thread benjayk


Quentin Anciaux-2 wrote:
 
 Then you must be wrong... I don't understand your point. If it's a
 program it has access to the outside through IO, hence it is
 impossible for a program to differentiate real outside from simulated
 outside... It's a simple fact, so either you're wrong or what you're
 describing is not a program, not an algorithm and not a computation.
 
OK, it depends on what you mean by "program". If you presume that a program
can't access its hardware, then what I am describing is indeed not a
program.

But most definitions don't preclude that. Carrying out instructions
precisely and step-by-step can be done with or without access to your
hardware.

Anyway, meta-programs can be instantiated using a real computer (a program
can, in principle, know and utilize part of a more basic computational
layer if programmed correctly), so we at least know that real computers are
beyond turing machines.
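A program can certainly *query* the layers beneath it; whether those answers could ever be verified is exactly what is in dispute in this thread. A minimal sketch using Python's standard introspection modules (illustrative only, no claim about what the answers prove):

```python
import platform
import sys

# A program asking the computational layers beneath it what it runs on.
# The answers arrive through an interface (the interpreter / the OS), so -
# per the objection above - they could themselves be simulated; the program
# has no independent way to verify them.
print("interpreter:", sys.implementation.name)  # e.g. cpython
print("reported machine:", platform.machine())  # e.g. x86_64
```

Note that this only shows a program receiving *reports* about its substrate, not determining it; under an emulator the same calls would simply report the emulated layer.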

benjayk

-- 
View this message in context: 
http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34417676.html
Sent from the Everything List mailing list archive at Nabble.com.




Re: Why the Church-Turing thesis?

2012-09-11 Thread Quentin Anciaux
2012/9/11 benjayk benjamin.jaku...@googlemail.com



 Quentin Anciaux-2 wrote:
 
  Then you must be wrong... I don't understand your point. If it's a
  program it has access to the outside through IO, hence it is
  impossible for a program to differentiate real outside from simulated
  outside... It's a simple fact, so either you're wrong or what you're
  describing is not a program, not an algorithm and not a computation.
 
 OK, it depends on what you mean by "program". If you presume that a
 program can't access its hardware,


I *do not presume it*... it's a *fact*.

Quentin






-- 
All those moments will be lost in time, like tears in rain.




Re: Why the Church-Turing thesis?

2012-09-11 Thread benjayk


Quentin Anciaux-2 wrote:
 
  OK, it depends on what you mean by "program". If you presume that a
  program can't access its hardware,
 
 I *do not presume it*... it's a *fact*.
 
 
Well, I presented a model of a program that can do that (on some level, not
on the level of physical hardware), and it is a program in the most
fundamental way (doing step-by-step execution of instructions).
All you need is a program hierarchy where some programs have access to
programs that are below them in the hierarchy (which are the "hardware",
though not the *hardware*).

So apparently it is not so much a fact about programs in a common-sense
way, but about a narrow conception of what programs can be.

benjayk
-- 
View this message in context: 
http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34417762.html
Sent from the Everything List mailing list archive at Nabble.com.




Re: Why the Church-Turing thesis?

2012-09-11 Thread Quentin Anciaux
2012/9/11 benjayk benjamin.jaku...@googlemail.com



 Quentin Anciaux-2 wrote:
 
  2012/9/11 benjayk benjamin.jaku...@googlemail.com
 
 
 
  Quentin Anciaux-2 wrote:
  
   2012/9/11 benjayk benjamin.jaku...@googlemail.com
  
  
  
   Quentin Anciaux-2 wrote:
   
2012/9/10 benjayk benjamin.jaku...@googlemail.com
   
   
   
  No program can determine its hardware.  This is a consequence
  of
   the
  Church
  Turing thesis.  The particular machine at the lowest level has
  no
 bearing
  (from the program's perspective).
 If that is true, we can show that CT must be false, because we
  *can*
 define
 a meta-program that has access to (part of) its own hardware
   (which
 still
 is intuitively computable - we can even implement it on a
  computer).

   
It's false, the program *can't* know that the hardware it has
  access
   to
is
the *real* hardware and not a simulated hardware. The program has
  only
access to hardware through IO, and it can't tell (as never ever)
  from
that
interface if what's outside is the *real* outside or simulated
   outside.
\quote
Yes that is true. If anything it is true because the hardware is
  not
   even
clearly determined at the base level (quantum uncertainty).
I should have expressed myself more accurately and written 
   hardware
   
or
relative 'hardware'. We can define a (meta-)programs that have
   access
to
their hardware in the sense of knowing what they are running on
relative
to some notion of hardware. They cannot be emulated using
  universal
turing
machines
   
   
Then it's not a program if it can't run on a universal turing
  machine.
   
   The funny thing is, it *can* run on a universal turing machine. Just
  that
   it
   may lose relative correctness if we do that.
  
  
   Then you must be wrong... I don't understand your point. If it's a
  program
   it has access to the outside through IO, hence it is impossible for
 a
   program to differentiate real outside from simulated outside... It's
  a
   simple fact, so either you're wrong or what you're describing is not a
   program, not an algorithm and not a computation.
  OK, it depends on what you mean by program. If you presume that a
  program
  can't access its hardware,
 
 
  I *do not presume it*... it's a *fact*.
 
 
 Well, I presented a model of a program that can do that (on some level, not
 on the level of physical hardware), and it is a program in the most
 fundamental way (doing step-by-step execution of instructions).
 All you need is a program hierarchy where some programs have access to
 the programs below them in the hierarchy (which are the "hardware",
 though not the *hardware*).
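
One way to picture the hierarchy described above is a stack of interpreters, each exposing its own identity to the programs it runs. The following is a minimal sketch under that reading; all names and the interface are hypothetical, invented for illustration, not anyone's actual proposal in the thread.

```python
# Toy sketch (all names hypothetical): a stack of interpreters, each of
# which exposes its own identity to the program it runs. A program
# "knows its hardware" only relative to the level directly below it.

class Interpreter:
    def __init__(self, name, host=None):
        self.name = name   # identity of this level's "hardware"
        self.host = host   # the interpreter this one runs on, if any

    def run(self, program):
        # The program sees only this level's identity, nothing deeper.
        return program(self.name)

def meta_program(visible_hardware):
    # Reports its hardware relative to one level down.
    return f"I run on: {visible_hardware}"

bare_metal = Interpreter("bare-metal")
simulated = Interpreter("simulated-cpu", host=bare_metal)

print(bare_metal.run(meta_program))  # I run on: bare-metal
print(simulated.run(meta_program))   # I run on: simulated-cpu
```

Note that the program's answer is correct only relative to the level that runs it, which seems to be the intended sense of "the hardware though not the *hardware*".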


What's your point? How would the simulated hardware fail? It's
impossible, so until you're clearer (your point is totally fuzzy), I stick
to "you must be wrong".


 So apparently it is not so much a fact about programs in the common-sense
 way, but about a narrow conception of what programs can be.

 benjayk
 --
 View this message in context:
 http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34417762.html
 Sent from the Everything List mailing list archive at Nabble.com.





-- 
All those moments will be lost in time, like tears in rain.




Re: Why the Church-Turing thesis?

2012-09-11 Thread Quentin Anciaux
2012/9/11 Quentin Anciaux allco...@gmail.com



 2012/9/11 benjayk benjamin.jaku...@googlemail.com



 Quentin Anciaux-2 wrote:
 
  2012/9/11 benjayk benjamin.jaku...@googlemail.com
 
 
 
  Quentin Anciaux-2 wrote:
  
   2012/9/11 benjayk benjamin.jaku...@googlemail.com
  
  
  
   Quentin Anciaux-2 wrote:
   
2012/9/10 benjayk benjamin.jaku...@googlemail.com
   
   
   
  No program can determine its hardware.  This is a consequence
  of
   the
  Church
  Turing thesis.  The particular machine at the lowest level
 has
  no
 bearing
  (from the program's perspective).
 If that is true, we can show that CT must be false, because we
  *can*
 define
 a meta-program that has access to (part of) its own hardware
   (which
 still
 is intuitively computable - we can even implement it on a
  computer).

   
It's false, the program *can't* know that the hardware it has
  access
   to
is
the *real* hardware and not a simulated hardware. The program has
  only
access to hardware through IO, and it can't tell (as never ever)
  from
that
interface if what's outside is the *real* outside or simulated
   outside.
\quote
Yes that is true. If anything it is true because the hardware is
  not
   even
clearly determined at the base level (quantum uncertainty).
I should have expressed myself more accurately and written 
   hardware
   
or
relative 'hardware'. We can define a (meta-)programs that have
   access
to
their hardware in the sense of knowing what they are running on
relative
to some notion of hardware. They cannot be emulated using
  universal
turing
machines
   
   
Then it's not a program if it can't run on a universal turing
  machine.
   
   The funny thing is, it *can* run on a universal turing machine. Just
  that
   it
   may lose relative correctness if we do that.
  
  
   Then you must be wrong... I don't understand your point. If it's a
  program
   it has access to the outside through IO, hence it is impossible
 for a
   program to differentiate real outside from simulated outside...
 It's
  a
   simple fact, so either you're wrong or what you're describing is not
 a
   program, not an algorithm and not a computation.
  OK, it depends on what you mean by program. If you presume that a
  program
  can't access its hardware,
 
 
  I *do not presume it*... it's a *fact*.
 
 
 Well, I presented a model of a program that can do that (on some level,
 not
 on the level of physical hardware), and is a program in the most
 fundamental
 way (doing step-by-step execution of instructions).
 All you need is a program hierarchy where some programs have access to
 programs that are below them in the hierarchy (which are the hardware
 though not the *hardware*).


 What's your point ? How the simulated hardware would fail ? It's
 impossible, so until you're clearer (your point is totally fuzzy), I stick
 to you must be wrong.


So either you assume some kind of oracle device (in which case the thing
you describe is no longer a program but a program + an oracle, and the
oracle obviously is not simulable on a Turing machine), or an infinite
regress of levels.

The halting problem is not new; I still don't see your point or anything
new here.

Quentin
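
The indistinguishability claim that runs through this exchange (a program reaches the outside only through IO) can be made concrete with a small sketch. The interface and class names below are hypothetical, chosen only to fix ideas: if a simulated outside answers every query exactly as the real one would, no program confined to that interface can tell them apart.

```python
# Minimal sketch (hypothetical names): a program reaches the outside
# world only through a query interface. A simulated outside that
# forwards every query faithfully is indistinguishable from inside.

class RealOutside:
    def query(self, q):
        return f"answer to {q}"

class SimulatedOutside:
    def __init__(self):
        self.real = RealOutside()

    def query(self, q):
        # Forwards every query; adds nothing the program could observe.
        return self.real.query(q)

def program(io):
    # Everything the program can ever learn goes through io.query.
    return [io.query(q) for q in ("who am I?", "what runs me?")]

# Same observable behaviour in both worlds:
assert program(RealOutside()) == program(SimulatedOutside())
```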


 So apparently it is not so much a fact about programs in a common sense
 way,
 but about a narrow conception of what programs can be.

 benjayk
 --
 View this message in context:
 http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34417762.html
 Sent from the Everything List mailing list archive at Nabble.com.









Re: Why the Church-Turing thesis?

2012-09-10 Thread benjayk


  No program can determine its hardware.  This is a consequence of the
  Church
  Turing thesis.  The particular machine at the lowest level has no
 bearing
  (from the program's perspective).
 If that is true, we can show that CT must be false, because we *can*
 define
 a meta-program that has access to (part of) its own hardware (which
 still
 is intuitively computable - we can even implement it on a computer).


It's false: the program *can't* know that the hardware it has access to is
the *real* hardware and not simulated hardware. The program has access to
hardware only through IO, and it can't ever tell from that interface
whether what's outside is the *real* outside or a simulated outside.
\quote
Yes, that is true. If anything, it is true because the hardware is not even
clearly determined at the base level (quantum uncertainty).
I should have expressed myself more accurately and written "hardware" or
relative 'hardware'. We can define (meta-)programs that have access to
their hardware in the sense of knowing what they are running on relative
to some notion of hardware. They cannot be emulated using universal Turing
machines (in general; in specific instances, where the hardware is fixed at
the right level, they might be). They can be simulated, though, but in that
case the simulation may be incorrect in the given context, and we have to
put it into the right context to see what it is actually emulating (not the
meta-program itself, just its behaviour relative to some other context).
 
We can also define an infinite hierarchy of meta-meta-...-programs (n
metas) to show that there is no universal notion of computation at all.
There is always a notion of computation that is more powerful than the
current one, because it can reflect more deeply upon its own hardware.

See my post concerning meta-programs for further details.
-- 
View this message in context: 
http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34413719.html
Sent from the Everything List mailing list archive at Nabble.com.




Re: Why the Church-Turing thesis?

2012-09-10 Thread Quentin Anciaux
2012/9/10 benjayk benjamin.jaku...@googlemail.com



   No program can determine its hardware.  This is a consequence of the
   Church
   Turing thesis.  The particular machine at the lowest level has no
  bearing
   (from the program's perspective).
  If that is true, we can show that CT must be false, because we *can*
  define
  a meta-program that has access to (part of) its own hardware (which
  still
  is intuitively computable - we can even implement it on a computer).
 

 It's false, the program *can't* know that the hardware it has access to is
 the *real* hardware and not a simulated hardware. The program has only
 access to hardware through IO, and it can't tell (as never ever) from that
 interface if what's outside is the *real* outside or simulated outside.
 \quote
 Yes that is true. If anything it is true because the hardware is not even
 clearly determined at the base level (quantum uncertainty).
 I should have expressed myself more accurately and written  hardware 
 or
 relative 'hardware'. We can define a (meta-)programs that have access to
 their hardware in the sense of knowing what they are running on relative
 to some notion of hardware. They cannot be emulated using universal
 turing
 machines


Then it's not a program if it can't run on a universal turing machine.


 (in general - in specific instances, where the hardware is fixed on
 the right level, they might be). They can be simulated, though, but in this
 case the simulation may be incorrect in the given context and we have to
 put
 it into the right context to see what it is actually emulating (not the
 meta-program itself, just its behaviour relative to some other context).

 We can also define an infinite hierarchy of meta-meta--programs (n
 metas) to show that there is no universal notion of computation at all.
 There is always a notion of computation that is more powerful than the
 current one, because it can reflect more deeply upon its own hardware.

 See my post concerning meta-programs for further details.
 --
 View this message in context:
 http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34413719.html
 Sent from the Everything List mailing list archive at Nabble.com.









Re: Why the Church-Turing thesis?

2012-09-10 Thread Stephen P. King

On 9/10/2012 11:40 AM, benjayk wrote:



No program can determine its hardware.  This is a consequence of the
Church
Turing thesis.  The particular machine at the lowest level has no

bearing

(from the program's perspective).

If that is true, we can show that CT must be false, because we *can*
define
a meta-program that has access to (part of) its own hardware (which
still
is intuitively computable - we can even implement it on a computer).


It's false, the program *can't* know that the hardware it has access to is
the *real* hardware and not a simulated hardware. The program has only
access to hardware through IO, and it can't tell (as never ever) from that
interface if what's outside is the *real* outside or simulated outside.
\quote
Yes that is true. If anything it is true because the hardware is not even
clearly determined at the base level (quantum uncertainty).
I should have expressed myself more accurately and written  hardware  or
relative 'hardware'. We can define a (meta-)programs that have access to
their hardware in the sense of knowing what they are running on relative
to some notion of hardware. They cannot be emulated using universal turing
machines (in general - in specific instances, where the hardware is fixed on
the right level, they might be). They can be simulated, though, but in this
case the simulation may be incorrect in the given context and we have to put
it into the right context to see what it is actually emulating (not the
meta-program itself, just its behaviour relative to some other context).
  
We can also define an infinite hierarchy of meta-meta--programs (n

metas) to show that there is no universal notion of computation at all.
There is always a notion of computation that is more powerful than the
current one, because it can reflect more deeply upon its own hardware.

See my post concerning meta-programs for further details.

Dear benjayk,

Is there any means by which the resource requirements play a role
for a single program? No, because of this indeterminacy (the 1p
indeterminacy), as Bruno has explained well. But while this is true, if
you consider multiple computations that are accessing shared resources,
things are not so clear. I wish that some thought might be given to the
problem of concurrency.


--
Onward!

Stephen

http://webpages.charter.net/stephenk1/Outlaw/Outlaw.html





Re: Why the Church-Turing thesis?

2012-09-08 Thread Bruno Marchal


On 07 Sep 2012, at 12:24, benjayk wrote:




Bruno Marchal wrote:



On 28 Aug 2012, at 21:57, benjayk wrote:



It seems that the Church-Turing thesis, which states that a universal
Turing machine can compute everything that is intuitively computable, has
near-universal acceptance among computer scientists.


Yes indeed. I think there are two strong arguments for this.

The empirical one: all attempts to define the set of computable
functions have led to the same class of functions, and this despite
the quite independent path leading to the definitions (from Church
lambda terms, Post production systems, von Neumann machine, billiard
ball, combinators, cellular automata ... up to modular functor,
quantum topologies, quantum computers, etc.).

OK, now I understand it better. Apparently if we express a computation in
terms of a computable function, we can always arrive at the same computable
function using a different computation on an arbitrary Turing-universal
machine. That seems right to me.

But in this case I don't get why it is often claimed that the CT thesis
claims that all computations can be done by a universal Turing machine, not
merely that they lead to the same class of computable functions (if
converted appropriately).


This is due to a theorem applying to all universal programming languages,
or universal systems: they all contain a universal machine. This makes CT
equivalent to the thesis that there is a universal machine with respect to
(intuitive) computability.


It entails also an intensional Church thesis. Not only can all universal
systems compute what each other can compute, but they can compute the
functions in the same manner. This comes from the fact that a Game of Life
pattern (say) can exist and compute the universal function of some other
universal system, like a Lisp interpreter, for example. This makes all
results on computations work also for the notions of simulation and
emulation.







The latter is a far weaker statement, since computable functions abstract
from many relevant things about the machine.

And even this weaker statement doesn't seem true with regard to more
powerful models like super-recursive functions, as computable functions
just give finite results, while super-recursive machines can give
infinite/unlimited results.


Computability concerns finite or infinitely generable things.
Then you can weaken comp, indeed, and many results remain valid, but they
take longer to prove. I use comp and numbers as it is easier.








Bruno Marchal wrote:


The conceptual one: the class of computable functions is closed under
the most transcendental operation in math: diagonalization. This is
not the case for the notions of definability, provability,
cardinality, etc.

I don't really know what this means. Do you mean that there are just
countably many computations? If yes, what does this have to do with
whether all universal Turing machines are equivalent?


It means that the notion of computability, despite being epistemic, is
well defined in math. It is the only such notion. Diagonalization cannot
be applied to find a new computable function not already in the class of
the ones given by any universal machine.
This makes comp far more explanatorily closed than any concept in math
and physics.
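
The closure claim above can be illustrated informally in a few lines: given an enumeration of programs and a universal machine to run them, the diagonal function g(n) = f_n(n) + 1 is itself built from that same universal machine, so it stays inside the class rather than escaping it. In this sketch a toy finite list of lambdas stands in for a genuine Gödel numbering; the names are mine, purely illustrative.

```python
# Illustrative sketch: diagonalization over an enumeration of computable
# functions yields a function that is itself computable, because it can
# be written using the same universal interpreter. A toy finite list
# stands in for a genuine Goedel numbering of all programs.

programs = [
    lambda n: n + 1,   # f_0
    lambda n: 2 * n,   # f_1
    lambda n: n * n,   # f_2
]

def universal(i, n):
    # Toy universal machine: run program number i on input n.
    return programs[i](n)

def diag(n):
    # g(n) = f_n(n) + 1, definable from the universal machine itself,
    # hence still inside the class of computable functions.
    return universal(n, n) + 1

print([diag(n) for n in range(3)])  # [2, 3, 5]
```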









Bruno Marchal wrote:




I really wonder why this is so, given that there are simple cases where we
can compute something that an arbitrary Turing machine cannot compute,
using a notion of computation that is not extraordinary at all (and quite
relevant in reality).
For example, given you have a universal turing machine A that uses  
the
alphabet {1,0} and a universal turing machine B that uses the  
alphabet

{-1,0,1}.
Now it is quite clear that the machine A cannot directly answer any
questions that relates to -1. For example it cannot directly compute
-1*-1=1. Machine A can only be used to use an encoded input value  
and

encoded description of machine B, and give an output that is correct
given
the right decoding scheme.
But for me this already makes clear that machine A is less
computationally
powerful than machine B.


Church thesis concerns only the class of computable functions.
Hm, maybe the Wikipedia article is a bad one, since it mentioned
computable functions just as a means of explaining the thesis, not as
part of its definition.



Bruno Marchal wrote:


The alphabet used by the Turing machine (having 1, 2, or an enumerable
alphabet) does not change the class. If you dovetail on the works of
1-letter Turing machines, you will unavoidably emulate all Turing
machines over all finite and enumerable alphabets. This can be proved.
Nor does the number of tapes and/or parallelism change that class.
Of course, some machines can be very inefficient, but this, by
definition, does not concern the Church thesis.
Even so, the CT thesis makes a claim about the equivalence of machines,
not of emulability.


It does, by the intensional CT, which follows 

Re: Why the Church-Turing thesis?

2012-09-08 Thread benjayk

I just respond to some parts of your posts, because I'd rather discuss the
main points than get sidetracked with issues that are less fundamental.


Jason Resch-2 wrote:
 

 I admit that for numbers this is not so relevant because number relations
 can be quite clearly expressed using numerous symbols (they have very few
 and simple relations), but it is much more relevant for more complex
 relations.

 
 Complex relation can be expressed in terms of a series of interrelated
 simpler relations (addition, multiplication, comparison, etc.).  You are
 focused on the very lowest level and it is no wonder you cannot see the
 higher-level possibilities for meaning, relations, intelligence,
 consciousness, etc. that a machine can create.
The complex relations can often only be expressed as simple relations on a
meta-level (which is a very big step of abstraction). You can express
rational numbers using natural numbers, but only using an additional layer
of interpretation (which is a *huge* abstraction; it's the difference
between a description and what is being described).

The natural numbers themselves don't lead to the rational numbers (except
by adding additional relations, like the inverse of multiplication).
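
The "additional layer of interpretation" above can be shown directly: code rationals as pairs of naturals, then add the relations (normalization, a multiplication rule, equality by cross-multiplication) that the naturals alone do not carry. A sketch with hypothetical helper names:

```python
# Sketch (hypothetical helpers): rationals coded as pairs of naturals,
# plus an interpretation layer. The pairs alone are just numbers; the
# relations defined below are the extra layer of interpretation.

from math import gcd

def make_rat(num, den):
    # Normal form, so that 2/4 and 1/2 get the same representation.
    g = gcd(num, den)
    return (num // g, den // g)

def rat_mul(a, b):
    # (a0/a1) * (b0/b1) = (a0*b0) / (a1*b1)
    return make_rat(a[0] * b[0], a[1] * b[1])

def rat_eq(a, b):
    # Equality by cross-multiplication: a relation *about* the pairs,
    # not present in the natural numbers themselves.
    return a[0] * b[1] == b[0] * a[1]

print(make_rat(2, 4))  # (1, 2)
print(rat_eq(rat_mul(make_rat(1, 2), make_rat(2, 3)), make_rat(1, 3)))  # True
```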


Jason Resch-2 wrote:
 
 The relation of hot vs. cold as experienced by you is also the
 production of a long series of multiplications, additions, comparisons,
 and
 other operations. 
You assume reductionism or emergentism here. Of course you can defend the
CT thesis if you assume that the lowest level can magically lead to higher
levels (or that the higher levels are not real in the first place).
The problem is that this magic would be precisely the higher levels that
you wanted to derive.


Jason Resch-2 wrote:
 


 Jason Resch-2 wrote:
 
  For example it cannot directly compute
   -1*-1=1. Machine A can only be used to use an encoded input value
 and
   encoded description of machine B, and give an output that is
 correct
   given
   the right decoding scheme.
  
  
   1's or 0's, X's or O's, what the symbols are don't have any bearing
 on
   what
   they can compute.
  
   That's just an assertion of the belief I am trying to question here.
   In reality, it *does* matter which symbols/things we use to compute. A
   computer that only uses one symbol (for example, a computer that adds
   using marbles) would be pretty useless.
   It does matter in many different ways: speed of computation, efficiency
   of computation, amount of memory, efficiency of memory, ease of
   programming, size of programs, ease of interpreting the result, the
   number of layers of programming needed to interpret the result and to
   program efficiently, ease of introspecting the state of a computer...
 
 
  Practically they might matter but not theoretically.
 In the right theoretical model, it does matter. I am precisely doubting
 the
 value of adhering to our simplistic theoretical model of computation as
 the
 essence of what computation means.


 What model do you propose to replace it?
 
 The Church-Turing thesis plays a similar role in computer science as the
 fundamental theorem of arithmetic does in number theory.
None. There is no one correct model of computation. There are infinitely
many models that express different facets of what computation is.
Different Turing machines express different things, super-recursive Turing
machines express another thing, etc...
I think computer scientists just don't want to accept it, because it takes
their bible away. We like to have an easy answer, even if it is the wrong
one.


Jason Resch-2 wrote:
 

 Jason Resch-2 wrote:
 
 
   Why would we abstract from all that and then reduce computation to our
   one very abstract and incomplete model of computation?
   If we do this, we could as well abstract from the process of computation
   and say every string can be used to emulate any machine, because if you
   know what program it expresses, you know what it would compute (if
   correctly interpreted). There's no fundamental difference. Strings need
   to be interpreted to make sense as a program, and a Turing machine
   without negative numbers needs to be interpreted to make sense as a
   program computing the result of an equation using negative numbers.
 
 
  I agree, strings need to be interpreted.  This is what the Turing
 machine
  does.  The symbols on the tape become interrelated in the context of
 the
  machine that interprets the symbols and it is these relations that
 become
  equivalent.
 That is like postulating some magic in the turing machine. It just
 manipulates symbols.

 
 No, it is not magic.  It is equivalent to saying the laws of physics
 interrelate every electron and quark to each other.
It is more like saying that the laws of physics show how to create humans
from atoms.
This is not the case. Nothing in the laws of nature says that some atoms
form a human. Still it is evidently the case that there are humans, meaning
that the laws of nature just don't describe the higher 

Re: Why the Church-Turing thesis?

2012-09-08 Thread benjayk

As far as I see, we mostly agree on content.

I just can't make sense of reducing computation to emulability.
For me the intuitive sense of computation is much richer than this.

But still, as I think about it, we can also create a model of computation
(in the sense of being intuitively computational and being implementable on
a computer) where there are computations that can't be emulated by a
universal Turing machine, using level-breaking languages (which explicitly
refer to what is being computed on the base level). I'll write another post
on this.
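
A toy version of such a level-breaking language, entirely hypothetical and invented here just to fix ideas: one instruction, HOST, pushes the name of the interpreter executing it, so the same program produces different output under direct execution and under emulation. Its behaviour is correct only relative to the level that runs it.

```python
# Toy "level-breaking" instruction (hypothetical): HOST pushes the name
# of the interpreter executing it. The same program gives different
# answers under direct execution and under emulation, so its behaviour
# is only correct relative to the level that runs it.

def run(program, host_name):
    stack = []
    for op in program:
        if op == "HOST":
            stack.append(host_name)  # the level-breaking step
        else:
            stack.append(op)         # ordinary data is just pushed
    return stack

prog = ["HOST", "done"]

print(run(prog, "bare-metal"))              # ['bare-metal', 'done']
print(run(prog, "emulator-on-bare-metal"))  # ['emulator-on-bare-metal', 'done']
```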

benjayk
-- 
View this message in context: 
http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34406986.html
Sent from the Everything List mailing list archive at Nabble.com.




Re: Why the Church-Turing thesis?

2012-09-08 Thread Quentin Anciaux
2012/9/8 benjayk benjamin.jaku...@googlemail.com


 I just respond to some parts of your posts, because I'd rather discuss the
 main points than get sidetracked with issues that are less fundamental.


 Jason Resch-2 wrote:
 
 
  I admit that for numbers this is not so relevant because number
 relations
  can be quite clearly expressed using numerous symbols (they have very
 few
  and simple relations), but it is much more relevant for more complex
  relations.
 
 
  Complex relation can be expressed in terms of a series of interrelated
  simpler relations (addition, multiplication, comparison, etc.).  You are
  focused on the very lowest level and it is no wonder you cannot see the
  higher-level possibilities for meaning, relations, intelligence,
  consciousness, etc. that a machine can create.
 The complex relations can often only be expressed as simple relations on a
 meta-level (which is a very big step of abstraction). You can express
 rational numbers using natural numbers, but only using an additional layer
 of interpretation (which is a *huge* abstraction - it's the difference
 between description and what is being described).

 The natural numbers itself don't lead to the rational numbers (except by
 adding additional relations, like the inverse of multiplication).


 Jason Resch-2 wrote:
 
  The relation of hot vs. cold as experienced by you is also the
  production of a long series of multiplications, additions, comparisons,
  and
  other operations.
 You assume reductionism or emergentism here. Of course you can defend the
 CT
 thesis if you assume that the lowest level can magically lead to higher
 levels (or the higher levels are not real in the first place).
 The problem is that this magic would precisely be the higher levels that
 you
 wanted to derive.


 Jason Resch-2 wrote:
 
 
 
  Jason Resch-2 wrote:
  
   For example it cannot directly compute
-1*-1=1. Machine A can only be used to use an encoded input value
  and
encoded description of machine B, and give an output that is
  correct
given
the right decoding scheme.
   
   
1's or 0's, X's or O's, what the symbols are don't have any bearing
  on
what
they can compute.
   
   That's just an assertion of the belief I am trying to question here.
   In reality, it *does* matter which symbols/things we use to compute.
 A
   computer that only uses one symbol (for example a computer that adds
   using
   marbles) would be pretty useless.
   It does matter in many different ways: Speed of computations,
  effciency
   of
   computation, amount of memory, efficiency of memory, ease of
  programming,
   size of programs, ease of interpreting the result, amount of layers
 of
   programming to interpret the result and to program efficiently, ease
  of
   introspecting into the state of a computer...
  
  
   Practically they might matter but not theoretically.
  In the right theoretical model, it does matter. I am precisely doubting
  the
  value of adhering to our simplistic theoretical model of computation as
  the
  essence of what computation means.
 
 
  What model do you propose to replace it?
 
  The Church-Turing thesis plays a similar role in computer science as the
  fundamental theorem of arithmetic does in number theory.
 None. There is no one correct model of computations. There are infinite
 models that express different facets of what computation is. Different
 turing machines express different things, super-recursive turing machines
 express another thing, etc...
 I think computer scientists just don't want to accept it, because it takes
 their bible away. We like to have an easy answer, even if it is the wrong
 one.


 Jason Resch-2 wrote:
 
 
  Jason Resch-2 wrote:
  
  
   Why would we abstract from all that and then reduce computation to
 our
   one
   very abstract and imcomplete model of computation?
   If we do this we could as well abstract from the process of
  computation
   and
   say every string can be used to emulate any machine, because if you
  know
   what program it expresses, you know what it would compute (if
  correctly
   interpreted). There's no fundamental difference. Strings need to be
   interpreted to make sense as a program, and a turing machine without
   negative numbers needs to be interpreted to make sense as a program
   computing the result of an equation using negative numbers.
  
  
   I agree, strings need to be interpreted.  This is what the Turing
  machine
   does.  The symbols on the tape become interrelated in the context of
  the
   machine that interprets the symbols and it is these relations that
  become
   equivalent.
  That is like postulating some magic in the turing machine. It just
  manipulates symbols.
 
 
  No, it is not magic.  It is equivalent to saying the laws of physics
  interrelate every electron and quark to each other.
 It is more like saying that the laws of physics show how to create humans
 from atoms.
 This is not the case. Nothing in 

Re: Why the Church-Turing thesis?

2012-09-07 Thread Bruno Marchal


On 28 Aug 2012, at 21:57, benjayk wrote:



It seems that the Church-Turing thesis, which states that a universal
Turing machine can compute everything that is intuitively computable, has
near-universal acceptance among computer scientists.


Yes indeed. I think there are two strong arguments for this.

The empirical one: all attempts to define the set of computable
functions have led to the same class of functions, and this despite
the quite independent paths leading to the definitions (from Church's
lambda terms, Post production systems, von Neumann machines, billiard
balls, combinators, cellular automata ... up to modular functors,
quantum topologies, quantum computers, etc.).


The conceptual one: the class of computable functions is closed for  
the most transcendental operation in math: diagonalization. This is  
not the case for the notions of definability, provability,  
cardinality, etc.
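Bruno's closure point can be stated compactly with the standard diagonal construction (the notation is mine, a textbook sketch, not from the thread):

```latex
% Effective enumeration \varphi_0, \varphi_1, \ldots of the partial
% computable functions; the diagonal function
d(n) = \varphi_n(n) + 1
% is itself partial computable (run \varphi_n on input n; add 1 if it
% halts), so diagonalization does not lead outside the class. For the
% *total* computable functions, the same d would differ from every
% \varphi_n at n, so that class admits no effective enumeration; this
% is the sense in which computability is closed under diagonalization
% while definability and provability are not.
```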






I really wonder why this is so, given that there are simple cases
where we can compute something that an arbitrary turing machine can
not compute, using a notion of computation that is not extraordinary
at all (and quite relevant in reality).
For example, given you have a universal turing machine A that uses the
alphabet {1,0} and a universal turing machine B that uses the alphabet
{-1,0,1}.
Now it is quite clear that machine A cannot directly answer any
questions that relate to -1. For example it cannot directly compute
-1*-1=1. Machine A can only be given an encoded input value and an
encoded description of machine B, and give an output that is correct
given the right decoding scheme.
But for me this already makes clear that machine A is less
computationally powerful than machine B.


Church thesis concerns only the class of computable functions. The
alphabet used by the Turing machine (having 1, 2, or enumerably many
symbols) does not change the class. If you dovetail on the work of
one-letter Turing machines, you will unavoidably emulate all Turing
machines over all finite and enumerable alphabets. This can be proved.
Nor does the number of tapes and/or parallelism change that class.
Of course, some machines can be very inefficient, but this, by
definition, does not concern Church thesis.
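A minimal sketch of the encoding Bruno appeals to (the codewords and helper names here are illustrative, not from the thread): a {0,1} machine can carry a {-1,0,1} computation through a fixed encode/compute/decode cycle.

```python
# Illustrative fixed-width binary code for the ternary alphabet {-1, 0, 1}.
ENCODE = {-1: "10", 0: "00", 1: "01"}          # arbitrary 2-bit codewords
DECODE = {v: k for k, v in ENCODE.items()}

def encode(symbols):
    """Map a list of ternary symbols to one binary string."""
    return "".join(ENCODE[s] for s in symbols)

def decode(bits):
    """Map a binary string back to the list of ternary symbols."""
    return [DECODE[bits[i:i + 2]] for i in range(0, len(bits), 2)]

# The 'multiplication machine' is a finite table over codewords only:
# it never sees a minus sign, just bit strings.
MUL = {(ENCODE[a], ENCODE[b]): ENCODE[a * b]
       for a in (-1, 0, 1) for b in (-1, 0, 1)}

def binary_multiply(x, y):
    """Multiply two encoded symbols purely at the {0,1} level."""
    return MUL[(x, y)]

# Under the decoding, the bit-level machine computes -1 * -1 = 1:
assert decode(binary_multiply(ENCODE[-1], ENCODE[-1])) == [1]
```

Whether the decoding step "counts" as part of the computation is, of course, exactly the point benjayk is contesting; the sketch only shows that the class of functions reached is the same.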


There was a thesis, often attributed to Cook (but I met him and he
claims it is not his thesis), that all Turing machines can emulate
each other in polynomial time. This will plausibly be refuted by the
existence of quantum computers (unless P = NP, or things like that).
It is an open problem, but most scientists believe that in general a
classical computer cannot emulate an arbitrary quantum computer in
polynomial time. But I insist, quantum computers have not violated the
Church Turing Post Markov thesis.






Its input and output when emulating B only make sense with respect to
what machine B does if we already know what machine B does, and if it
is known how we chose to reflect this in the input of machine A (and
the interpretation of its output). Otherwise we have no way of even
saying whether it emulates something, or whether it is just doing a
particular computation on the alphabet {1,0}.
I realize that it all comes down to the notion of computation. But why
do most choose to use such a weak notion of computation? How does
machine B not compute something that A doesn't, by any reasonable
standard?
Saying that A can compute what B computes is like saying that "orange"
can express the same as the word "apple", because we can encode the
word "apple" as "orange". It is true in a very limited sense, but it
seems mad to treat it as the foundation of what it means for words to
express something (and the same goes for computation).
If we use such trivial notions of computation, why not say that the
program "return input" emulates all turing machines, because given the
right input it gives the right output (we just give it the solution as
input).
I get that we can simply use the Church-Turing thesis as the
definition of what computation means. But why is it (mostly) treated
as being the one and only correct notion of computation (especially in
a computer science context)?
The only explanation I have is that it is dogma. To question it would
change too much and would be too complicated and uncomfortable. It
would make computation an irreducibly complex and relative notion or -
heaven forbid - even an inherently subjective notion (computation from
which perspective?).


That was what everybody believed before the rise of the universal  
machine and lambda calculus. Gödel called the closure of the  
computable functions for diagonalization a miracle, and he took time  
before assessing it. See:


Davis, M., 1982, Why Gödel Didn't Have Church's Thesis, Information
and Control, 54, pp. 3-24.


http://iridia.ulb.ac.be/~marchal/



--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this 

Re: Why the Church-Turing thesis?

2012-09-07 Thread benjayk


Bruno Marchal wrote:
 
 
 On 28 Aug 2012, at 21:57, benjayk wrote:
 

 It seems that the Church-Turing thesis, which states that a universal
 turing machine can compute everything that is intuitively computable,
 has near universal acceptance among computer scientists.
 
 Yes indeed. I think there are two strong arguments for this.
 
 The empirical one: all attempts to define the set of computable  
 functions have led to the same class of functions, and this despite  
 the quite independent path leading to the definitions (from Church  
 lambda terms, Post production systems, von Neumann machine, billiard  
 ball, combinators, cellular automata ... up to modular functor,  
 quantum topologies, quantum computers, etc.).
 
OK, now I understand it better. Apparently if we express a computation in
terms of a computable function, we can always arrive at the same computable
function using a different computation of an arbitrary turing universal
machine. That seems right to me.

But in this case I don't get why it is often claimed that the CT thesis
claims that all computations can be done by a universal turing machine, not
merely that they lead to the same class of computable functions (if
converted appropriately).
The latter is a far weaker statement, since computable functions abstract
from many relevant things about the machine.

And even this weaker statement doesn't seem true with regard to more
powerful models like super-recursive machines, as computable functions just
give finite results, while super-recursive machines can give
infinite/unlimited results.

 

Bruno Marchal wrote:
 
 The conceptual one: the class of computable functions is closed for  
 the most transcendental operation in math: diagonalization. This is  
 not the case for the notions of definability, provability,  
 cardinality, etc.
I don't really know what this means. Do you mean that there are just
countably many computations? If yes, what does this have to do with whether
all universal turing machines are equivalent?



Bruno Marchal wrote:
 

 I really wonder why this is so, given that there are simple cases  
 where we
 can compute something that an arbitrary turing machine can not  
 compute using
 a notion of computation that is not extraordinary at all (and quite  
 relevant
 in reality).
 For example, given you have a universal turing machine A that uses the
 alphabet {1,0} and a universal turing machine B that uses the alphabet
 {-1,0,1}.
 Now it is quite clear that the machine A cannot directly answer any
 questions that relates to -1. For example it cannot directly compute
  -1*-1=1. Machine A can only be given an encoded input value and
 encoded description of machine B, and give an output that is correct  
 given
 the right decoding scheme.
 But for me this already makes clear that machine A is less  
 computationally
 powerful than machine B.
 
 Church thesis concerns only the class of computable functions.
Hm, maybe the wikipedia article is a bad one, since it mentioned computable
functions just as a means of explaining the thesis, not as part of its
definition.


Bruno Marchal wrote:
 
  The  alphabet used by the Turing machine, having 1, 2, or enumerable  
 alphabet does not change the class. If you dovetail on the works of 1  
 letter Turing machine, you will unavoidably emulate all Turing  
 machines on all finite and enumerable letters alphabets. This can be  
 proved. Nor does the number of tapes, and/or  parallelism change that  
 class.
 Of course, some machine can be very inefficient, but this, by  
 definition, does not concern Church thesis.
Even so, the CT thesis makes a claim about the equivalence of machines, not
about emulability.
Why are two machines that can be used to emulate each other regarded as
equivalent?
In my view, there is a big difference between computing the same thing and
being able to emulate each other. Most importantly, emulation only makes
sense relative to another machine that is being emulated, and a correct
interpretation.

benjayk

-- 
View this message in context: 
http://old.nabble.com/Why-the-Church-Turing-thesis--tp34348236p34401986.html
Sent from the Everything List mailing list archive at Nabble.com.




Re: Why the Church-Turing thesis?

2012-09-07 Thread benjayk


Jason Resch-2 wrote:
 
 On Thu, Sep 6, 2012 at 12:47 PM, benjayk
 benjamin.jaku...@googlemail.comwrote:
 


 Jason Resch-2 wrote:
 
  On Tue, Aug 28, 2012 at 2:57 PM, benjayk
  benjamin.jaku...@googlemail.comwrote:
 
 
  It seems that the Church-Turing thesis, which states that a universal
  turing machine can compute everything that is intuitively computable,
  has near universal acceptance among computer scientists.
 
  I really wonder why this is so, given that there are simple cases
 where
  we
  can compute something that an arbitrary turing machine can not compute
  using
  a notion of computation that is not extraordinary at all (and quite
  relevant
  in reality).
  For example, given you have a universal turing machine A that uses the
  alphabet {1,0} and a universal turing machine B that uses the alphabet
  {-1,0,1}.
  Now it is quite clear that the machine A cannot directly answer any
  questions that relates to -1.

 
 I don't see this as being the case at all.  What is the symbol for -1
 supposed to look like?  Do you agree that a turing machine that used A, B,
 and C as symbols could work the same as one that used -1, 0, and 1?
Well, the symbol for -1 could be -1?
To answer your latter question: no, not necessarily. I take the symbols not
to be mere symbols but to contain meaning (which they do), and so it matters
what symbols the machine uses, because that changes the meaning of its
computation. Often the meaning of the symbols also constrains the possible
relations (for example, -1 * -1 normally needs to be 1, while A * A could be
A, B or C).

The CT thesis wants to abstract from things like meaning, but I don't
really see the great value in acting as if this were necessarily the correct
theoretical way of thinking about computations. It is only valuable as one
possible, very strongly abstracted, limited and representational model of
computation with respect to emulability.


Jason Resch-2 wrote:
 
 Everything is a representation, but what is important is that the Turing
 machine preserves the relationships.  E.g., if ABBBABAA is greater than
 AAABBAAB then 01110100 is greater than 00011001, and all the other
 properties can hold, irrespective of what symbols are used.
The problem is that relationships don't make sense apart from symbols. We
can theoretically express the natural numbers using an infinite number of
unique symbols for both numbers and operations (like A or B or C or X for
10, ´ or ? or [ or ° for +), but in this case it won't be clear that we are
expressing natural numbers at all (without a lengthy explanation of what the
symbols mean).
Or if we are using binary numbers to express the natural numbers, it will
also not be very clear that we mean numbers, because binary expressions
often mean something else entirely. If we then add 1 to this number, it
will not be clear whether we actually added one or just flipped a bit.

I admit that for numbers this is not so relevant because number relations
can be quite clearly expressed using numerous symbols (they have very few
and simple relations), but it is much more relevant for more complex
relations.


Jason Resch-2 wrote:
 
 For example it cannot directly compute
  -1*-1=1. Machine A can only be given an encoded input value and
  encoded description of machine B, and give an output that is correct
  given
  the right decoding scheme.
 
 
  1's or 0's, X's or O's, what the symbols are don't have any bearing on
  what
  they can compute.
 
 That's just an assertion of the belief I am trying to question here.
 In reality, it *does* matter which symbols/things we use to compute. A
 computer that only uses one symbol (for example a computer that adds
 using
 marbles) would be pretty useless.
  It does matter in many different ways: Speed of computations, efficiency
 of
 computation, amount of memory, efficiency of memory, ease of programming,
 size of programs, ease of interpreting the result, amount of layers of
 programming to interpret the result and to program efficiently, ease of
 introspecting into the state of a computer...

 
 Practically they might matter but not theoretically.
In the right theoretical model, it does matter. I am precisely doubting the
value of adhering to our simplistic theoretical model of computation as the
essence of what computation means.


Jason Resch-2 wrote:
 

 Why would we abstract from all that and then reduce computation to our
 one
  very abstract and incomplete model of computation?
 If we do this we could as well abstract from the process of computation
 and
 say every string can be used to emulate any machine, because if you know
 what program it expresses, you know what it would compute (if correctly
 interpreted). There's no fundamental difference. Strings need to be
 interpreted to make sense as a program, and a turing machine without
 negative numbers needs to be interpreted to make sense as a program
 computing the result of an equation using negative numbers.

 
 I agree, 

Re: Why the Church-Turing thesis?

2012-09-07 Thread Stephen P. King

On 9/7/2012 6:24 AM, benjayk wrote:

Why are two machines that can be used to emulate each other regarded to be
equivalent?
In my view, there is a big difference between computing the same and being
able to emulate each other. Most importantly, emulation only makes sense
relative to another machine that is being emulated, and a correct
interpretation.

Dear benjayk,

This is what is discussed under the header of Bisimilarity and 
bisimulation equivalence iff simulation = emulation.


http://en.wikipedia.org/wiki/Bisimulation

a bisimulation is a binary relation between state transition systems, 
associating systems which behave in the same way in the sense that one 
system simulates the other and vice-versa.


My own use of the term seeks a more generalized version that does
not assume that the relation is necessarily binary or strictly
monotonic. The key is that a pair of machines can have an image of
each other and that they are capable of acting on that image.
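A minimal sketch of the standard (binary) notion from the Wikipedia definition above, in illustrative Python; the state and label names are made up, and the naive greatest-fixpoint loop is chosen for clarity over efficiency:

```python
# Naive greatest-fixpoint check of (strong) bisimilarity between two small
# labelled transition systems. Transitions map (state, label) -> successors.
def bisimilar(trans_p, trans_q, p0, q0):
    states_p = {s for s, _ in trans_p} | {t for v in trans_p.values() for t in v}
    states_q = {s for s, _ in trans_q} | {t for v in trans_q.values() for t in v}
    labels = {l for _, l in trans_p} | {l for _, l in trans_q}
    # Start from the full relation and discard pairs that fail the
    # back-and-forth condition until nothing changes.
    rel = {(p, q) for p in states_p for q in states_q}
    changed = True
    while changed:
        changed = False
        for (p, q) in set(rel):
            ok = all(
                all(any((p2, q2) in rel for q2 in trans_q.get((q, l), set()))
                    for p2 in trans_p.get((p, l), set()))
                and
                all(any((p2, q2) in rel for p2 in trans_p.get((p, l), set()))
                    for q2 in trans_q.get((q, l), set()))
                for l in labels)
            if not ok:
                rel.discard((p, q))
                changed = True
    return (p0, q0) in rel

# Two machines with different state names but the same behaviour:
A = {("a0", "tick"): {"a1"}, ("a1", "tick"): {"a0"}}
B = {("b0", "tick"): {"b1"}, ("b1", "tick"): {"b0"}}
assert bisimilar(A, B, "a0", "b0")
```

Each system simulates the other, so the two are behaviourally indistinguishable despite having no symbols in common; this is the equivalence-by-mutual-simulation that the thread is arguing over.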


--
Onward!

Stephen

http://webpages.charter.net/stephenk1/Outlaw/Outlaw.html





Re: Why the Church-Turing thesis?

2012-09-06 Thread benjayk


Jason Resch-2 wrote:
 
 On Tue, Aug 28, 2012 at 2:57 PM, benjayk
 benjamin.jaku...@googlemail.comwrote:
 

 It seems that the Church-Turing thesis, which states that a universal
 turing machine can compute everything that is intuitively computable, has
 near universal acceptance among computer scientists.

 I really wonder why this is so, given that there are simple cases where
 we
 can compute something that an arbitrary turing machine can not compute
 using
 a notion of computation that is not extraordinary at all (and quite
 relevant
 in reality).
 For example, given you have a universal turing machine A that uses the
 alphabet {1,0} and a universal turing machine B that uses the alphabet
 {-1,0,1}.
 Now it is quite clear that the machine A cannot directly answer any
 questions that relates to -1. For example it cannot directly compute
 -1*-1=1. Machine A can only be given an encoded input value and
 encoded description of machine B, and give an output that is correct
 given
 the right decoding scheme.

 
 1's or 0's, X's or O's, what the symbols are don't have any bearing on
 what
 they can compute.
 
That's just an assertion of the belief I am trying to question here.
In reality, it *does* matter which symbols/things we use to compute. A
computer that only uses one symbol (for example a computer that adds using
marbles) would be pretty useless.
It does matter in many different ways: Speed of computations, efficiency of
computation, amount of memory, efficiency of memory, ease of programming,
size of programs, ease of interpreting the result, amount of layers of
programming to interpret the result and to program efficiently, ease of
introspecting into the state of a computer...

Why would we abstract from all that and then reduce computation to our one
very abstract and incomplete model of computation?
If we do this we could as well abstract from the process of computation and
say every string can be used to emulate any machine, because if you know
what program it expresses, you know what it would compute (if correctly
interpreted). There's no fundamental difference. Strings need to be
interpreted to make sense as a program, and a turing machine without
negative numbers needs to be interpreted to make sense as a program
computing the result of an equation using negative numbers.


Jason Resch-2 wrote:
 
 Consider: No physical computer today uses 1's or 0's, they use voltages,
 collections of more or fewer electrons.
OK, but in this case abstraction makes sense for computer scientists,
because programmers don't have access to that level. You are right, though,
that a chip engineer shouldn't abstract from that level if he actually wants
to build a computer.


Jason Resch-2 wrote:
 
 This doesn't mean that our computers can only directly compute what
 electrons do.
In fact they do much more.  Electrons express strictly more than just 0 and
1. So it's not a good analogy, because 0 and 1 express *less* than 0, 1 and
-1.


Jason Resch-2 wrote:
 
 But for me this already makes clear that machine A is less computationally
 powerful than machine B. Its input and output when emulating B do only
 make
 sense with respect to what the machine B does if we already know what
 machine B does, and if it is known how we chose to reflect this in the
 input
 of machine A (and the interpretation of its output). Otherwise we have no
 way of even saying whether it emulates something, or whether it is just
 doing a particular computation on the alphabet {1,0}.

 
 These are rather convincing:
 http://en.wikipedia.org/wiki/Video_game_console_emulator
 
 There is software that emulates the unique architectures of an Atari,
 Nintendo, Supernintendo, PlayStation, etc. systems.  These emulators can
  also run on any computer, whether it's Intel x86, x86_64, PowerPC, etc. 
 You
 will have a convincing experience of playing an old Atari game like space
 invaders, even though the original creators of that program never intended
 it to run on a computer architecture that wouldn't be invented for another
 30 years, and the original programmers didn't have to be called in to
 re-write their program to do so.
Yes, I use them as well. They are indeed convincing. But this doesn't
really relate to the question very much.
First, our modern computers are pretty much strictly more computationally
powerful in every practical and theoretical way. It would be more of an
argument if you could simulate Windows on a Nintendo (but you can't). I am
not saying that a turing machine using 0, 1 and -1 can't emulate a machine
using only 0 and 1.
Secondly, even these emulations are only correct as far as our playing
experience goes (well, at least if you are not nostalgic about hardware).
The actual process going on in the computer is very different, and thus it
makes sense to say that it computes something else. Its computations just
have similar results in terms of experience, but they need vastly more
resources and compute something more (all the 

Re: Why the Church-Turing thesis?

2012-09-06 Thread Jason Resch
On Thu, Sep 6, 2012 at 12:47 PM, benjayk benjamin.jaku...@googlemail.comwrote:



 Jason Resch-2 wrote:
 
  On Tue, Aug 28, 2012 at 2:57 PM, benjayk
  benjamin.jaku...@googlemail.comwrote:
 
 
  It seems that the Church-Turing thesis, which states that a universal
  turing machine can compute everything that is intuitively computable,
  has near universal acceptance among computer scientists.
 
  I really wonder why this is so, given that there are simple cases where
  we
  can compute something that an arbitrary turing machine can not compute
  using
  a notion of computation that is not extraordinary at all (and quite
  relevant
  in reality).
  For example, given you have a universal turing machine A that uses the
  alphabet {1,0} and a universal turing machine B that uses the alphabet
  {-1,0,1}.
  Now it is quite clear that the machine A cannot directly answer any
  questions that relates to -1.


I don't see this as being the case at all.  What is the symbol for -1
supposed to look like?  Do you agree that a turing machine that used A, B,
and C as symbols could work the same as one that used -1, 0, and 1?
Everything is a representation, but what is important is that the Turing
machine preserves the relationships.  E.g., if ABBBABAA is greater than
AAABBAAB then 01110100 is greater than 00011001, and all the other
properties can hold, irrespective of what symbols are used.
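Jason's order-preservation claim can be checked mechanically; this sketch (the symbols and mapping are chosen for illustration, not taken from the thread) renames A/B to 0/1 and compares:

```python
# An order-preserving renaming of the alphabet preserves every comparison
# between strings, which is the sense in which the relations "become
# equivalent" regardless of which symbols are used.
RENAME = {"A": "0", "B": "1"}    # A < B and "0" < "1": order-preserving

def rename(s):
    """Rewrite a string over {A, B} as a string over {0, 1}."""
    return "".join(RENAME[c] for c in s)

x, y = "ABBBABAA", "AAABBAAB"
# Lexicographic comparison gives the same answer before and after renaming,
assert (x > y) == (rename(x) > rename(y))
# and, read as binary numerals, the numeric comparison agrees as well.
assert int(rename(x), 2) > int(rename(y), 2)
```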


 For example it cannot directly compute
  -1*-1=1. Machine A can only be given an encoded input value and
  encoded description of machine B, and give an output that is correct
  given
  the right decoding scheme.
 
 
  1's or 0's, X's or O's, what the symbols are don't have any bearing on
  what
  they can compute.
 
 That's just an assertion of the belief I am trying to question here.
 In reality, it *does* matter which symbols/things we use to compute. A
 computer that only uses one symbol (for example a computer that adds using
 marbles) would be pretty useless.
 It does matter in many different ways: Speed of computations, efficiency of
 computation, amount of memory, efficiency of memory, ease of programming,
 size of programs, ease of interpreting the result, amount of layers of
 programming to interpret the result and to program efficiently, ease of
 introspecting into the state of a computer...


Practically they might matter but not theoretically.



 Why would we abstract from all that and then reduce computation to our one
 very abstract and incomplete model of computation?
 If we do this we could as well abstract from the process of computation and
 say every string can be used to emulate any machine, because if you know
 what program it expresses, you know what it would compute (if correctly
 interpreted). There's no fundamental difference. Strings need to be
 interpreted to make sense as a program, and a turing machine without
 negative numbers needs to be interpreted to make sense as a program
 computing the result of an equation using negative numbers.


I agree, strings need to be interpreted.  This is what the Turing machine
does.  The symbols on the tape become interrelated in the context of the
machine that interprets the symbols and it is these relations that become
equivalent.




 Jason Resch-2 wrote:
 
  Consider: No physical computer today uses 1's or 0's, they use voltages,
  collections of more or fewer electrons.
 OK, but in this case abstraction makes sense for computer scientist because
 programmers don't have access to that level. You are right, though, that a
 chip
 engineer shouldn't abstract from that level if he actually wants to build a
 computer.


 Jason Resch-2 wrote:
 
  This doesn't mean that our computers can only directly compute what
  electrons do.
 In fact they do much more.  Electrons express strictly more than just 0 and
 1. So it's not a good analogy, because 0 and 1 express *less* than 0, 1 and
 -1.


 Jason Resch-2 wrote:
 
  But for me this already makes clear that machine A is less
 computationally
  powerful than machine B. Its input and output when emulating B do only
  make
  sense with respect to what the machine B does if we already know what
  machine B does, and if it is known how we chose to reflect this in the
  input
  of machine A (and the interpretation of its output). Otherwise we have
 no
  way of even saying whether it emulates something, or whether it is just
  doing a particular computation on the alphabet {1,0}.
 
 
  These are rather convincing:
  http://en.wikipedia.org/wiki/Video_game_console_emulator
 
  There is software that emulates the unique architectures of an Atari,
  Nintendo, Supernintendo, PlayStation, etc. systems.  These emulators can
  also run on any computer, whether it's Intel x86, x86_64, PowerPC, etc.
  You
  will have a convincing experience of playing an old Atari game like space
  invaders, even though the original creators of that program never
 intended
  it to run on a computer architecture that wouldn't be invented for
 another
  

Re: Why the Church-Turing thesis?

2012-08-28 Thread Jason Resch
On Tue, Aug 28, 2012 at 2:57 PM, benjayk benjamin.jaku...@googlemail.comwrote:


 It seems that the Church-Turing thesis, which states that a universal
 turing machine can compute everything that is intuitively computable, has
 near universal acceptance among computer scientists.

 I really wonder why this is so, given that there are simple cases where we
 can compute something that an arbitrary turing machine can not compute using
 a notion of computation that is not extraordinary at all (and quite
 relevant
 in reality).
 For example, given you have a universal turing machine A that uses the
 alphabet {1,0} and a universal turing machine B that uses the alphabet
 {-1,0,1}.
 Now it is quite clear that the machine A cannot directly answer any
 questions that relates to -1. For example it cannot directly compute
 -1*-1=1. Machine A can only be given an encoded input value and
 encoded description of machine B, and give an output that is correct given
 the right decoding scheme.


1's or 0's, X's or O's, what the symbols are don't have any bearing on what
they can compute.

Consider: No physical computer today uses 1's or 0's, they use voltages,
collections of more or fewer electrons.

This doesn't mean that our computers can only directly compute what
electrons do.
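One concrete illustration of this (my addition, not Jason's; the 8-bit width is an arbitrary choice): ordinary {0,1} hardware computes -1 * -1 = 1 all the time, via the two's-complement convention for reading bit patterns as signed integers.

```python
# Two's-complement sketch: the machine only manipulates unsigned bit
# patterns, yet under the signed reading it multiplies negative numbers.
BITS = 8

def to_twos_complement(n, bits=BITS):
    """Encode a signed integer as an unsigned bit pattern."""
    return n & ((1 << bits) - 1)

def from_twos_complement(u, bits=BITS):
    """Decode a bit pattern back to a signed integer."""
    return u - (1 << bits) if u >= (1 << (bits - 1)) else u

a = to_twos_complement(-1)              # bit pattern 0b11111111
b = to_twos_complement(-1)
product = (a * b) & ((1 << BITS) - 1)   # multiply, keep the low 8 bits
assert from_twos_complement(product) == 1
```

As with the alphabet encoding discussed earlier in the thread, whether the signed *reading* of the bits is part of the computation or part of its interpretation is exactly the question at issue.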

But for me this already makes clear that machine A is less computationally
 powerful than machine B. Its input and output when emulating B do only make
 sense with respect to what the machine B does if we already know what
 machine B does, and if it is known how we chose to reflect this in the
 input
 of machine A (and the interpretation of its output). Otherwise we have no
 way of even saying whether it emulates something, or whether it is just
 doing a particular computation on the alphabet {1,0}.


These are rather convincing:
http://en.wikipedia.org/wiki/Video_game_console_emulator

There is software that emulates the unique architectures of an Atari,
Nintendo, Supernintendo, PlayStation, etc. systems.  These emulators can
also run on any computer, whether it's Intel x86, x86_64, PowerPC, etc.  You
will have a convincing experience of playing an old Atari game like space
invaders, even though the original creators of that program never intended
it to run on a computer architecture that wouldn't be invented for another
30 years, and the original programmers didn't have to be called in to
re-write their program to do so.


 I realize that it all comes down to the notion of computation. But why do
 most choose to use such a weak notion of computation? How does machine B
 not
 compute something that A doesn't by any reasonable standard?
 Saying that A can compute what B computes is like saying that orange can
 express the same as the word apple, because we can encode the word
 apple
 as orange.


System A (using its own language of representation for system A), can
predict exactly all future states of another system B (and vice versa).  A
and B have different symbols, states, instructions, etc., so perhaps this
is why you think system A can't perfectly emulate system B, but this is a
little like saying there are things that can only be described by Spanish
speakers that no other language (French, English, etc.) could describe.
Sure, a translation needs to occur to communicate a Spanish idea into an
English one, but just because Spanish and English speakers use different
languages doesn't mean there are problems only speakers of one language can
solve.


 It is true in a very limited sense, but it seems mad to treat
 it as the foundation of what it means for words to express something (and
 the same goes for computation).
 If we use such trivial notions of computation, why not say that the program
 return input emulates all turing-machines because given the right input
 it
 gives the right output (we just give it the solution as input).


Many programs have no input and/or no output, but they still can be
rightfully said to perform different computations.


 I get that we can simply use the Church-turing as the definition of
 computation means. But why is it (mostly) treated as being the one and only
 correct notion of computation (especially in a computer science context)?


I think it comes into play more in the definition of a universal machine
than in the definition of computation.

It is useful because it makes universality easy to prove.  All you need to
do is show how some machine can be used to emulate any other known Turing
universal machine.




The only explanation I have is that it is dogma. To question it would change
 too much and would be too complicated and uncomfortable. It would make
 computation an irreducibly complex and relative notion or - heaven forbid -
 even an inherently subjective notion (computation from which perspective?).


Reverse engineering machine language code is very difficult, but there are
automated programs for doing this that can provide much more readable
program code.  Code, too, can be difficult to