On 12/1/06, J. Storrs Hall, PhD. <[EMAIL PROTECTED]> wrote:
> On Friday 01 December 2006 20:06, Philip Goetz wrote:
>
> > Thus, I don't think my ability to follow rules written on paper to
> > implement a Turing machine proves that the operations powering my
> > consciousness are Turing-complete.
>
> Actually, I think it does prove it, since your simulation of a Turing machine
> would consist of conscious operations.

But the simulation of a Chinese speaker, carried out by the man in Searle's
Chinese room, consists of conscious operations.

If I simulate a Turing machine in that way, then the system consisting of
me plus a rulebook and some slips of paper is Turing-complete (a rough
sketch of that setup follows below).
If you conclude that my conscious mind is thus Turing-complete,
you must be identifying my conscious mind with the consciousness
of the system consisting of me plus a rulebook and some slips of paper.
If you do that, then in the case of the Chinese room, you must likewise
identify my conscious mind with the consciousness of the system
consisting of me plus the Chinese rulebook and some slips of paper.
Then you arrive at Searle's conclusion:  Either I must be conscious of
speaking Chinese, or merely following an algorithm that results in
speaking Chinese does not entail consciousness, and hence
a simulation of consciousness might be perfect, but isn't
necessarily conscious.
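
To make the Turing-machine step concrete, here is a rough Python sketch
(my own illustration, not anything from this thread) of what the "me plus
a rulebook and some slips of paper" system amounts to: the rulebook is
just a transition table, the slips of paper are the tape, and the person
mechanically looks up and applies one rule per step. The particular
machine below (a bit-flipper) is made up purely for the example.

  def run_turing_machine(rulebook, tape, state="start", head=0, blank="_", max_steps=1000):
      """Follow the rulebook mechanically until the 'halt' state (or give up)."""
      tape = dict(enumerate(tape))                # the slips of paper, indexed by position
      for _ in range(max_steps):
          if state == "halt":
              break
          symbol = tape.get(head, blank)          # read the slip under the head
          state, new_symbol, move = rulebook[(state, symbol)]
          tape[head] = new_symbol                 # overwrite the slip
          head += 1 if move == "R" else -1        # step right or left
      return "".join(tape[i] for i in sorted(tape))

  # Hypothetical rulebook: flip every bit, halt at the first blank.
  rulebook = {
      ("start", "0"): ("start", "1", "R"),
      ("start", "1"): ("start", "0", "R"),
      ("start", "_"): ("halt",  "_", "R"),
  }
  print(run_turing_machine(rulebook, "1011"))     # prints 0100_

The person's role is only the lookup-and-apply loop; whatever the rulebook
encodes, the same mechanical steps suffice.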
