On Tue, Aug 26, 2008 at 8:09 AM, Terren Suydam <[EMAIL PROTECTED]> wrote:
> I know we've gotten a little off-track here from play, but the really
> interesting question I would pose to you non-embodied advocates is:
> how in the world will you motivate your creation?  I suppose that you
> won't. You'll just tell it what to do (specify its goals) and it will do it,
> because it has no autonomy at all. Am I guilty of anthropomorphizing
> if I say autonomy is important to intelligence?
>

This is fuzzy, mysterious, and frustrating. Unless you explain *functionally*
what you mean by autonomy and embodiment, the conversation degrades into
the kind of meaningless philosophy that occupied some smart people for
thousands of years without producing any results.

-- 
Vladimir Nesov
[EMAIL PROTECTED]
http://causalityrelay.wordpress.com/


-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/