Mark Waser wrote:
 The critical point that most people miss -- and what is really
 important for this list (and why people shouldn't blindly dismiss
 Searle) is that it is *intentionality* that defines "understanding".
 If a system has goals/intentions and its actions are modified by the
 external world (i.e. it is grounded), then the extent to which
 its actions are *effectively* modified (as judged in relation to
 its intentions) is the extent to which it "understands".  The most
 important feature of an AGI is that it has goals and that it modifies
 its behavior (and learns) in order to reach them.  The Chinese Room
 is incapable of these behaviors since it has no desires.

I think this is an excellent point, so long as you're careful to define "intention" simply in terms of goals that the system is attempting to satisfy/maximize, and not in terms of conscious desires. As you point out, the former provides a context in which to define understanding and to measure it. The latter leads off into further undefined terms and concepts -- I mention this rather than just agreeing outright mainly because of your use of the word "desire" in the last sentence, which /could/ be interpreted anthropomorphically.
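To make the non-anthropomorphic reading concrete, here's a toy sketch (my own illustration, not anything from Mark's post) of "understanding" defined purely operationally: a grounded agent whose action is modified by feedback from the external world, scored by how effectively those modifications serve its goal. The function name and the scoring scheme are hypothetical choices for illustration only.

```python
def run_agent(target: float, steps: int = 20, lr: float = 0.5) -> float:
    """Toy agent whose goal is to make its output match `target`.

    Returns the fraction of the initial goal-error eliminated
    (0.0 = no effective modification, 1.0 = goal fully reached) --
    a crude stand-in for "understanding" in the operational sense.
    """
    action = 0.0
    initial_error = abs(target - action)
    for _ in range(steps):
        feedback = target - action   # grounding: a signal from the world
        action += lr * feedback      # behavior modified in light of the goal
    final_error = abs(target - action)
    return 1.0 - final_error / initial_error

grounded_score = run_agent(10.0)            # feedback used: score near 1.0
ungrounded_score = run_agent(10.0, lr=0.0)  # feedback ignored: score 0.0
```

The `lr=0.0` case is the Chinese-Room analogue on this definition: the system produces behavior but never modifies it in relation to any goal, so its "understanding" score is zero regardless of how sophisticated its outputs look.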



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now