Ben: If an intelligent system has a goal G which is time-consuming or difficult
to achieve, it may then synthesize another goal G1 which is easier to achieve.
We then have the uncertain syllogism:

Achieving G implies reward
G1 is similar to G
Therefore, achieving G1 implies reward
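
Concretely, one rough numerical reading of that syllogism - this is my own
sketch, and the similarity score and the discounting rule are made up, not
necessarily what Ben intends - would be:

# Sketch: reward expected from G1 is G's reward discounted by similarity.

def expected_reward(goal, known_goal, reward_of, similarity):
    """If achieving known_goal implies reward, estimate the reward of a
    similar goal by discounting with a similarity score in [0, 1]."""
    return similarity(goal, known_goal) * reward_of(known_goal)

# Toy instance: G = "prove the theorem", G1 = "prove a special case".
reward_of = lambda g: 1.0 if g == "prove the theorem" else 0.0
similarity = lambda a, b: 0.7  # made up: G1 judged 70% similar to G

print(expected_reward("prove a special case", "prove the theorem",
                      reward_of, similarity))  # -> 0.7

On that reading, everything hinges on where the similarity number comes from.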

Ben,

The be-all and end-all here, though, I presume, is "similarity". Is it a
logical concept? Finding similarities - rough likenesses as opposed to
rational, precise, logicomathematical commonalities - is actually, I would
argue, a process of imagination and (though I can't find a ready term)
physical/embodied improvisation. Hence rational, logical, computing approaches
have failed to produce any new (in the normal sense of "surprising") metaphors
or analogies, or to be creative.
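
By a "logical" concept of similarity I mean something like feature overlap -
say a Jaccard score over shared predicates, as in this toy sketch (the feature
sets are invented for illustration):

# Toy "logical" similarity: Jaccard overlap of feature sets.

def jaccard(a: set, b: set) -> float:
    """Shared features divided by total features."""
    return len(a & b) / len(a | b) if a | b else 0.0

juliet = {"young", "radiant", "rises", "beloved"}
sun    = {"radiant", "rises", "hot", "distant"}

print(jaccard(juliet, sun))  # -> 0.33...
# The score measures shared predicates once the pairing is given; inventing
# the Juliet/sun pairing in the first place is the imaginative step.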

Maybe you could give an example of what you mean by similarity?



