Stathis Papaioannou wrote:
2010/1/11 Brent Meeker <meeke...@dslextreme.com>:

No, I'm at this point assuming only that consciousness is produced by
the physical process. We can assume for simplicity that the two
machines M1 and M2 have similar architecture and similar operating
systems. Once the program is loaded into M2 from the disc, S2
proceeds exactly as it would have, had the computation been allowed
to continue running on M1. Therefore, at least after the first few
milliseconds, the subjective content of S2 must be the same as it
would have been on the one machine. Could the subjective content be
different at the transition between S1 and S2 if the computation is
split up? If there is a subjective difference it won't be something
the subject can notice because, later in the course of S2, he can have
no memory of it.
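
To make the transfer scenario concrete, here is a minimal sketch,
assuming a deterministic Python process whose entire state can be
serialized to disc (the file name and state fields are illustrative,
not part of anything specified above):

    import pickle

    def step(state):
        # One deterministic step: the successor state depends only
        # on the current state, never on the host machine.
        state["counter"] += 1
        state["trace"].append(state["counter"] ** 2)
        return state

    # On M1: run the first segment, S1, then suspend to disc.
    state = {"counter": 0, "trace": []}
    for _ in range(1000):
        state = step(state)
    with open("snapshot.pkl", "wb") as f:
        pickle.dump(state, f)

    # On M2: load the snapshot from the disc and run S2.
    with open("snapshot.pkl", "rb") as f:
        state = pickle.load(f)
    for _ in range(1000):
        state = step(state)

    # The final state is identical to an uninterrupted run of 2000
    # steps on M1, because step() consults nothing but its argument.
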
But if you're only assuming that consciousness is produced by the
physical process, then the process of downloading and uploading the
microstates and shifting the data into registers in the CPU and
memory could produce a difference in consciousness.  These are all
computations too, done by the operating system.  And why can't there
be memory of it, in the sense that it affects some later conscious
state?  There are traces of the transfer process left on the original
computer, the disc, and the second computer.  Some subsequent program
could retrieve these traces, as is done in forensic cases.  If
physical processes instantiate consciousness, why shouldn't these
make a difference?
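
To make the point about traces concrete: a sketch of a later
forensic check, assuming the snapshot file from the example above was
left behind (the file name is hypothetical):

    import os

    # A check run later on either machine: the transfer left
    # artifacts (the snapshot file and its timestamps) even though
    # the resumed program itself never read them.
    if os.path.exists("snapshot.pkl"):
        print("transfer occurred at", os.path.getmtime("snapshot.pkl"))
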

It's taken for granted even by unsophisticated end users of computers
that the hardware won't affect the computation. A calculator
application wouldn't be much use if it gave a different answer
depending on what brand of machine it was running on. It wouldn't be
difficult to write a program that takes input from the environment,
including information on what sort of hardware it's running on, and in
that case there could be a difference between running S1 and S2 on the
one machine and running them on separate machines. A real time clock,
for example, would alert the subject to the fact that there had been a
discontinuity, and S2 would then *not* proceed the same in both cases.
However, this would not happen automatically: it would have to be
specifically programmed, and the hardware would have to be capable of
feeding the appropriate input into the program.
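
A sketch of the kind of program just described, in which hardware
identity and a real-time clock feed into the state, so that a
suspend-and-resume on different hardware would leave a detectable
mark (the one-second threshold and the field names are illustrative):

    import platform
    import time

    def environment_sensitive_step(state):
        # Unlike a pure step function, this one samples its
        # environment, so the trajectory depends on wall-clock time
        # and on which machine is running it.
        now = time.time()
        last = state.get("last_tick")
        if last is not None and now - last > 1.0:  # arbitrary threshold
            state["noticed_discontinuity"] = True
        state["last_tick"] = now
        state["host"] = platform.node()  # hardware identity enters the state
        return state
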

It also can't be a difference that would disrupt the
completion of a task or thought that requires continuity of
consciousness spanning S1-S2, since again the subject cannot have any
evidence that such a disruption occurred.

Unless we have a theory of how consciousness is related to the
physical computation, I don't think we can conclude that.  We already
know that subliminal perceptions can affect conscious thoughts - so
why not subliminal memories?

The theory is that if the computation is the same then the
consciousness is the same, regardless of what hardware it is being
implemented on.
I know. I'm trying to see what exactly is being assumed about the
computation being "the same". Is it the same Platonic algorithm? Is
it one that has the same steps as described in FORTRAN, but not those
in LISP? Is it just one that has the same input-output? I think these
are questions that have been bypassed in the "yes doctor" scenario.

Saying "yes" to the doctor seems unproblematic when you think of
replacing a few neurons with artificial ones - all you care about is
the input-output. But then when you jump to replacing a whole brain,
maybe you care about the FORTRAN/LISP differences. Yet on this list
there seems to be an assumption that you can just jump to the
Platonic algorithm, or even to a Platonic computation that's
independent of the algorithm. Bruno pushes all this aside by
referring to "at the appropriate level" and by doing all possible
algorithms.

But I'm more interested in the question of what I would have to do to
make a conscious AI. Also, it is the assumption of a Platonic
computation that allows one to slice it discretely into OMs (observer
moments).
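
One way to make the FORTRAN/LISP question concrete, as an
illustration rather than an answer: two procedures with the same
input-output mapping but different intermediate steps. The question
is which of these counts as "the same computation":

    def factorial_iterative(n):
        result = 1
        for i in range(2, n + 1):  # a loop: successive multiplications
            result *= i
        return result

    def factorial_recursive(n):
        if n <= 1:                 # recursion: n stacked activation frames
            return 1
        return n * factorial_recursive(n - 1)

    # Identical input-output behaviour, different step structure.
    assert all(factorial_iterative(n) == factorial_recursive(n)
               for n in range(10))
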

If you don't accept this then you don't accept
computationalism,

I don't accept it.  I only entertain it.

Brent

for it is difficult to imagine a more drastic
hardware change than that involved in going from a biological brain to
a digital computer.


