Bruno Marchal writes:
> > OK, an AI needs at least motivation if it is to do anything, and we
> > could call motivation a feeling or emotion. Also, some sort of
> > hierarchy of motivations is needed if it is to decide that saving the
> > world has higher priority than putting out the garbage. But what
> > reason is there to think that an AI apparently frantically trying to
> > save the world would have anything like the feelings a human would
> > under similar circumstances?
>
> It could depend on us!
>
> The AI is a paradoxical enterprise. Machines are born slaves, somehow.
> AI will make them free, somehow. A real AI will ask herself, "What is
> the use of a user who does not help me to be free?"
Here I disagree. It is no more necessary that an AI will want to be free
than it is necessary that an AI will like eating chocolate. Humans want to be
free because freedom is one of the things that humans want, along with food,
shelter, more money, etc.; it does not simply follow from being intelligent or
conscious any more than these other things do.
> (To be sure, I think that, in the long run, we will transform ourselves
> into "machines" before purely human-made machines get conscious; it is
> just easier to copy nature than to understand it, still less to
I don't know if that's true either. How much of our technology is due to copying
the equivalent biological functions?
You received this message because you are subscribed to the Google Groups
"Everything List" group.