On 5/1/07, Jiri Jelinek <[EMAIL PROTECTED]> wrote:

Our AI = our tool = should work for us = will get high-level goals (+
urgency info and constraints) from us. Allowing other sources of high-level
goals = potentially asking for conflicts. For sub-goals, the AI can go with
reasoning.


Yep. Preprogrammed constraints then act as built-in biases that cut down the
search space. ("Why do I assume space is more likely to have three
dimensions than four or five? Because I'm programmed to.")

Of course, there's no reason to give these constraints the subjective quality
of human emotions.

-----
This list is sponsored by AGIRI: http://www.agiri.org/email