Joshua Bell wrote:

> It's a Turing Test situation, IMHO. Until you can define what emotions are
> and prove that you have them and that a Tamagotchi doesn't, I'm not sure I
> want to make that call.

A laudable sense of restraint, but please allow me to set the bar for
genuine uncertainty at something higher than a Tamagotchi.  I figure I
make "that call" every time I blow away Tank Jr. in a game of Quake 3, or
pretty much any other bot in a well-made shooter-style game. I'm pretty
sure most of these guys have more sophisticated AI than a Tamagotchi.
And I suspect that unless you've hooked your Tamagotchi up to an
uninterruptible power supply and devoted enough time to making sure the
annoying little bastard never dies except of old age (can they die of old
age? I've never had the patience to find out), you've probably made "that
call" as well.

> This is reminiscent of the "zombie" problem of consciousness. Is it sensical
> to talk about something with all of the higher mental capabilities of a
> human, capable of passing for a human and interacting with humans, but
> missing some very specific part X?

I think it very well may be, because the devil is in the details, and I
think the problem is more difficult and ambiguous than detractors of the
movie A.I. seem to want to give it credit for being. I'll have to rant
about that later, though....

Marvin Long
Austin, Texas

Nuke the straight consumerist wildebeests for Buddha!
