Quentin Anciaux-2 wrote:
> 
> 2012/9/11 Quentin Anciaux <allco...@gmail.com>
> 
>>
>>
>> 2012/9/11 benjayk <benjamin.jaku...@googlemail.com>
>>
>>>
>>>
>>> Quentin Anciaux-2 wrote:
>>> >
>>> > 2012/9/11 benjayk <benjamin.jaku...@googlemail.com>
>>> >
>>> >>
>>> >>
>>> >> Quentin Anciaux-2 wrote:
>>> >> >
>>> >> > 2012/9/11 benjayk <benjamin.jaku...@googlemail.com>
>>> >> >
>>> >> >>
>>> >> >>
>>> >> >> Quentin Anciaux-2 wrote:
>>> >> >> >
>>> >> >> > 2012/9/10 benjayk <benjamin.jaku...@googlemail.com>
>>> >> >> >
>>> >> >> >>
>>> >> >> >>
>>> >> >> >> > > No program can determine its hardware.  This is a consequence
>>> >> >> >> > > of the Church Turing thesis.  The particular machine at the
>>> >> >> >> > > lowest level has no bearing (from the program's perspective).
>>> >> >> >> > If that is true, we can show that CT must be false, because we
>>> >> >> >> > *can* define a "meta-program" that has access to (part of) its
>>> >> >> >> > own hardware (which still is intuitively computable - we can
>>> >> >> >> > even implement it on a computer).
>>> >> >> >> >
>>> >> >> >>
>>> >> >> >> It's false, the program *can't* know that the hardware it has
>>> >> >> >> access to is the *real* hardware and not a simulated hardware.
>>> >> >> >> The program has only access to hardware through IO, and it can't
>>> >> >> >> tell (as never ever) from that interface if what's outside is the
>>> >> >> >> *real* outside or simulated outside.
>>> >> >> >> <\quote>
>>> >> >> >> Yes that is true. If anything it is true because the hardware is
>>> >> >> >> not even clearly determined at the base level (quantum
>>> >> >> >> uncertainty).
>>> >> >> >> I should have expressed myself more accurately and written
>>> >> >> >> " "hardware" " or "relative 'hardware'". We can define
>>> >> >> >> (meta-)programs that have access to their "hardware" in the sense
>>> >> >> >> of knowing what they are running on relative to some notion of
>>> >> >> >> "hardware". They cannot be emulated using universal turing
>>> >> >> >> machines
>>> >> >> >
>>> >> >> >
>>> >> >> > Then it's not a program if it can't run on a universal turing
>>> >> >> > machine.
>>> >> >> >
>>> >> >> The funny thing is, it *can* run on a universal turing machine. Just
>>> >> >> that it may lose relative correctness if we do that.
>>> >> >
>>> >> >
>>> >> > Then you must be wrong... I don't understand your point. If it's a
>>> >> > program it has access to the "outside" through IO, hence it is
>>> >> > impossible for a program to differentiate "real" outside from
>>> >> > simulated outside... It's a simple fact, so either you're wrong or
>>> >> > what you're describing is not a program, not an algorithm and not a
>>> >> > computation.
>>> >> OK, it depends on what you mean by "program". If you presume that a
>>> >> program
>>> >> can't access its "hardware",
>>> >
>>> >
>>> > I *do not presume it*... it's a *fact*.
>>> >
>>> >
>>> Well, I presented a model of a program that can do that (on some level,
>>> not on the level of physical hardware), and is a program in the most
>>> fundamental way (doing step-by-step execution of instructions).
>>> All you need is a program hierarchy where some programs have access to
>>> programs that are below them in the hierarchy (which are the "hardware"
>>> though not the *hardware*).
>>>
>>
>> What's your point ? How the simulated hardware would fail ? It's
>> impossible, so until you're clearer (your point is totally fuzzy), I stick
>> to "you must be wrong".
>>
> 
> So either you assume some kind of "oracle" device, in this case, the thing
> you describe is no more a program, but a program + an oracle, the oracle
> obviously is not simulable on a turing machine, or an infinite regress of
> level.
> 
> 
The simulated hardware can't fail in the model, just like a Turing machine
can't fail. Of course in reality it can fail, but that is beside the point.

You are right, my explanation is not that clear, because it is quite a subtle
thing.

Maybe I shouldn't have used the word "hardware". The point is just that we
can define (meta-)programs that have access to some aspect of the programs
that are below them in the program hierarchy (normal programs), an aspect
that can't be accessed by those programs themselves. They can't be emulated
in general, because sometimes the emulating program will necessarily emulate
the wrong level (because it can't correctly emulate that the meta-program is
accessing what it is *actually* doing on the most fundamental level).
They still are programs in the most fundamental sense.
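
To make this less fuzzy, here is a minimal sketch of the kind of hierarchy I
mean (just an illustration under my own assumptions: the toy instruction set,
the class and function names and the one-register machine are all made up for
the example). A normal program is stepped by an interpreter, and the
meta-program is handed read access to the state of the level below it, which
the normal program itself has no instruction for:

# A minimal, assumed sketch of a program hierarchy (Python).
# The toy instruction set and all names are illustrative only.

class Interpreter:
    """Runs a normal program (a list of instructions) step by step."""
    def __init__(self, instructions):
        self.instructions = instructions
        self.pc = 0           # program counter of the level below
        self.accumulator = 0  # the normal program's only register

    def step(self):
        op, arg = self.instructions[self.pc]
        if op == "set":
            self.accumulator = arg
        elif op == "add":
            self.accumulator += arg
        self.pc += 1
        return self.pc < len(self.instructions)

def meta_program(level_below):
    """The meta-program: it reads the state of the program below it in
    the hierarchy, something that program cannot read about itself."""
    return {"pc": level_below.pc, "accumulator": level_below.accumulator}

normal = Interpreter([("set", 1), ("add", 2), ("add", 3)])
while normal.step():
    print("meta-program sees:", meta_program(normal))

Everything in the sketch is ordinary step-by-step execution; the only special
thing is that meta_program is handed a reference to the level below it.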

They don't require oracles or anything else that is hard to actually use;
they just have to know something about the hierarchy and the programs
involved (which programs, or kinds of programs, are above or below them) and
have access to the states of other programs. Both are perfectly implementable
on a normal computer. They can even be implemented on a Turing machine, but
not in general. They can also be simulated on Turing machines, just not
necessarily correctly (the Turing machine may incorrectly ignore which level
it is operating on relative to the meta-program).
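
The level issue can be illustrated by continuing the sketch (again just my
assumed example, reusing Interpreter and meta_program from above): once the
whole stack is itself stepped by a further emulator, there are two candidate
"levels below" the meta-program, and which one it happens to be wired to
determines what it reports.

# Continuing the assumed sketch: an outer emulator steps the inner
# hierarchy and keeps its own host-level bookkeeping. Which level the
# meta-program is wired to changes what it sees.

class OuterEmulator:
    """Steps an inner Interpreter while keeping its own host-level state."""
    def __init__(self, inner):
        self.inner = inner
        self.host_steps = 0   # bookkeeping of the emulating level

    def step(self):
        self.host_steps += 1
        return self.inner.step()

inner = Interpreter([("set", 1), ("add", 2), ("add", 3)])
outer = OuterEmulator(inner)
while outer.step():
    print("wired to the inner level:", meta_program(inner))
    print("wired to the outer level:", {"host_steps": outer.host_steps})

Nothing inside the run decides which of the two is the level that is
*actually* doing the execution; that is the ambiguity I am pointing at.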

We can still argue that these aren't programs in every sense, but I think
whatever is executable on a normal computer can rightfully be called a program.

benjayk