On 28/02/2008, Mike Tintner <[EMAIL PROTECTED]> wrote:

>  You must first define its existing skills, then define the new challenge
>  with some degree of precision - then explain the principles by which it will
>  extend its skills. It's those principles of extension/generalization that
>  are the be-all and end-all, (and NOT btw, as you suggest, any helpful info
>  that the robot will receive - that,sir, is cheating - it has to work these
>  things out for itself - although perhaps it could *ask* for info).
>

Why is that cheating? Would you never give instructions to a child
about what to do? Taking instructions is something that all
intelligences need to be able to do, though reliance on them should
be minimised. I'm not saying it should take instructions
unquestioningly either; ideally it should figure out for itself
whether the instructions you give are any use to it.

  Will Pearson

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=8660244&id_secret=95818715-a78a9b
Powered by Listbox: http://www.listbox.com