Michel Jullian wrote:
> Stephen, why do you postulate there must be a line? Like intelligence,
> consciousness could be non-discrete, simply increasing mechanically
> with the complexity of the organized system. Can't you imagine
> elaborate robots in the future thinking "I'm conscious; I'm certain of
> that, by direct experience."?

Sure can, and it sure could be a continuum.  I picked the top and bottom
(humans and rocks) to be unambiguous, but everything in the middle is
at least somewhat unclear.

The point is, we don't know, and we don't know how to find out, and we
don't even have a good handle on how to properly phrase the question.
Right now, faced with your hypothetical robot which *asserts* that it is
experiencing consciousness, we'd have no way of testing that assertion.
I can write a program which will print "I am conscious", but that
doesn't make it true.
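To make that point concrete: here's a minimal sketch of such a program
(the function name is just my own illustration, not anything standard).
It asserts consciousness with total confidence, while obviously having
none, which is exactly why the robot's assertion alone tests nothing.

```python
# A trivial program that *claims* consciousness.  The claim is produced
# by a fixed string, so its truth value is independent of any inner
# experience -- printing it demonstrates nothing about the printer.
def claim_consciousness():
    return "I'm conscious; I'm certain of that, by direct experience."

print(claim_consciousness())
```

Any behavioral test the robot could pass, a sufficiently elaborate
version of this lookup-and-print scheme could in principle also pass,
which is the crux of the verification problem.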

Right now, it all boils down to opinions and "gut feel", which is kind
of remarkable given that it's pretty clearly a factual issue, and a
rather important one at that, at least as far as all discussions of
morality go.

Yesterday we rescued a stray cat which was trying to scratch out a
living under our front porch (and not doing too well at it).  I believe
the cat is a conscious being, but I sure can't prove that it is.  If
it's conscious, then morally, rescuing it could be argued on utilitarian
grounds to be a "good thing": we made a conscious entity happier.  But
if it's not conscious, then the act was about as morally insignificant
as "rescuing" a junk car, or a discarded paper clip.  It seems curious
to me that there's no way to prove conclusively which category the act
falls into.

By the way, it's also very common to confuse "intelligence" with
"consciousness".  The former can be measured with some precision, of
course, unlike the latter.

> 
> A line would definitely have to be drawn for the concept of soul
> (either you have it or not), but not for consciousness I don't think.
> 
> Michel
> 
