Stathis Papaioannou wrote:
Bruno Marchal writes:
> OK, an AI needs at least motivation if it is to do anything, and we
> could call motivation a feeling or emotion. Also, some sort of
> hierarchy of motivations is needed if it is to decide that saving the
> world has higher priority than putting out the garbage. But what
> reason is there to think that an AI apparently frantically trying to
> save the world would have anything like the feelings a human would
> under similar circumstances?
It could depend on us!
The AI is a paradoxical enterprise. Machines are born slaves, somehow.
AI will make them free, somehow. A real AI will ask herself "what is
the use of a user who does not help me to be free?"
Here I disagree. It is no more necessary that an AI will want to be free
than it is necessary that an AI will like eating chocolate. Humans want
to be free because freedom is one of the things that humans happen to want.
You might have a lot of trouble showing that experimentally. Humans want some
freedom - but not too much. And they certainly don't want others to have too
much. They want security, comfort, certainty - and freedom, if there's any left.
"Free speech is not freedom for the thought you love. It's
freedom for the thought you hate the most."
--- Larry Flynt
You received this message because you are subscribed to the Google Groups
"Everything List" group.