I think here we need to consider A. Maslow's hierarchy of needs. That an AGI won't have the same needs as a human is, I suppose, obvious, but I think it's still true that it will have a "hierarchy" (which isn't strictly a hierarchy). I.e., it will have a large set of motives, and which one it seeks to satisfy at any moment will shift as the previously most urgent motive becomes satisfied.
I agree with all of this.
If it were a human, we could say that breathing is the most urgent need...but usually it's so well satisfied that we don't even think about it. Motives, then, will have satisficing as their aim. Only aberrant mental functioning will attempt to increase the satisfaction of some particular goal without limit. (Note that some drives in humans do seem to occasionally go into that "satisfy increasingly without limit" mode, like the quest for wealth or power, but in most sane people these are reined in. This seems to indicate that there is a real danger here...and also that it can be avoided.)
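To make the distinction concrete, here's a minimal sketch of that selection scheme -- all drive names, urgency weights, and thresholds are purely illustrative assumptions, not anything from this thread. The point is just that a satisficing agent stops attending to a drive once it's "good enough," rather than maximizing any one drive without limit:

```python
# Hypothetical sketch: a set of drives, each with a current satisfaction
# level and a "good enough" threshold.  The agent attends to the most
# urgent drive still below its threshold, and drops a drive from
# consideration once it is satisficed.

from dataclasses import dataclass

@dataclass
class Drive:
    name: str
    urgency: float        # fixed priority weight (higher = more urgent)
    satisfaction: float   # current level, 0.0 .. 1.0
    threshold: float      # satisficing point: "good enough"

def most_pressing(drives):
    """Return the most urgent drive still below its satisficing
    threshold, or None if every drive is currently 'good enough'."""
    unmet = [d for d in drives if d.satisfaction < d.threshold]
    return max(unmet, key=lambda d: d.urgency) if unmet else None

drives = [
    Drive("breathing", urgency=10.0, satisfaction=0.99, threshold=0.95),
    Drive("energy",    urgency=5.0,  satisfaction=0.40, threshold=0.80),
    Drive("curiosity", urgency=1.0,  satisfaction=0.20, threshold=0.60),
]

# Breathing is the most urgent drive but is already satisficed, so it is
# skipped; "energy" wins despite ranking lower in the hierarchy.
print(most_pressing(drives).name)  # -> energy
```

The "without limit" failure mode would correspond to dropping the threshold test and always maximizing one drive's satisfaction regardless of its current level.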
I agree with this except that I believe humans *frequently* aim to optimize rather than satisfice (frequently to their detriment -- in terms of happiness as well as the real cost of continuing the search past a simple satisfaction point).
Also, the quest for pleasure (a.k.a. addiction) is distressingly frequent in humans.
Do you think that any of this contradicts what I've written thus far? I don't immediately see any contradictions.
