On 10/11/06, Chris Norwood <[EMAIL PROTECTED]> wrote:
How much of our "selves" are driven by biological
processes that an AI would not have to begin with, for
example...fear? I would think that the AI's self would
be fundamentally different to begin with due to this.
(...)

I think that, Darwinianly speaking, it would be very bad for an A.I.
to be an entity without fear. Fear exists in any living being with a
minimum of cognition for a good reason: by heavily weighting your
estimates of possible danger, it maximizes your chances of survival
(and reproduction). An A.I. would logically be subject to the same
selection pressures (although the potential dangers for an A.I. would
of course be very different from those afflicting humans), and an
A.I. without fear (or rather without its mathematical-algorithmic
equivalent) would systematically underestimate dangers and would
likely be destroyed.
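
To make the weighting idea concrete, here is a toy sketch in Python
(my own illustration, not anything from Chris's post or a specific
theory): "fear" modelled as a pessimism multiplier applied to an
agent's danger estimates before it chooses an action. The names
(Action, fear_weight, choose) and all the numbers are assumptions
made up for the example:

# A toy sketch: "fear" as a pessimism multiplier on danger estimates.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    reward: float        # payoff if nothing goes wrong
    p_danger_est: float  # the agent's (possibly too low) danger estimate
    harm: float          # cost if the danger actually materializes

def choose(actions, fear_weight=1.0):
    # fear_weight = 1.0: take danger estimates at face value ("no fear");
    # fear_weight > 1.0: overweight them, compensating for the systematic
    # underestimation worried about above.
    def value(a):
        p = min(1.0, a.p_danger_est * fear_weight)
        return (1 - p) * a.reward - p * a.harm
    return max(actions, key=value)

actions = [
    Action("forage near predators", reward=10.0, p_danger_est=0.05, harm=50.0),
    Action("forage in safe area",   reward=6.0,  p_danger_est=0.01, harm=50.0),
]

print(choose(actions, fear_weight=1.0).name)  # -> forage near predators
print(choose(actions, fear_weight=3.0).name)  # -> forage in safe area

If the true danger of the first option is, say, 0.15 rather than the
estimated 0.05, the fearless agent is the one that gets destroyed;
the multiplier is doing exactly the work that fear does.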

So, although I don't doubt that an A.I. without fear may eventually
exist, I don't think that it will be a particularly fit one. ;-)
