Jones Beene <jone...@pacbell.net> wrote:

> I think the problem is not whether computers "should be designed to be
> sentient," so much as "can they be restrained from it."
>

Maybe. I have read various articles and books about this. Some experts
believe that sentience is an emergent quality; others say it would have to
be programmed and will not happen on its own. I do not know enough about
artificial intelligence to judge which is right.

Arthur Clarke was very interested in this question. He asked the world's
leading experts. I think he got the same impression I have, which is that
they do not know yet.

Putting aside sentience, people and other animals have a number of
emotional qualities that I do not expect to see in artificially intelligent
computers unless we deliberately program them. They include: love, fear,
jealousy, the desire for self-preservation (that is, fear of death), the
urge to dominate other entities, the urge to accumulate power, status, and
material goods, and so on. Needless to say, I cannot begin to imagine a
machine that would want to become a human being or have sex with a human,
along the lines of the movie "Bicentennial Man." (I thought that movie
stank.)

I think it would be morally wrong to deliberately program such emotions. I
can't think of any advantage they would give us.

- Jed
