> However, I think that the spontaneous emergence of complex concepts like
> prepositional ones from sensory inputs is not very practical, and will
> take an insane amount of compute time to occur, even though it's
> possible.
> 
> I think that, in practice, we'll need to use a combination of explicit
> teaching and spontaneous emergence...

I am aware of this problem: compression alone is not efficient enough.
My idea is to somehow "steer" the compression so that it makes use of
concepts we already know, i.e. natural-language ones, but I haven't
figured out how to do that yet. My main focus so far has been on
designing the "G" in AGI first.

Maybe you could consider using the generic memory and building a
Bayesian layer on top of it. Do you think that would save some
development time, or be more efficient?
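
Very roughly, I imagine something like the toy sketch below: a flat
"generic memory" that only stores co-occurrence counts, and a Bayesian
layer that reads those counts back as smoothed conditional
probabilities. This is only meant to show the shape of the idea; the
class and method names are made up for illustration, not a real
interface.

# Toy sketch: a "generic memory" that just stores co-occurrence counts,
# plus a Bayesian layer that interprets those counts as smoothed
# conditional probabilities. All names here are illustrative only.

from collections import defaultdict


class GenericMemory:
    """Flat associative store: counts how often an outcome follows a context."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def store(self, context, outcome):
        self.counts[context][outcome] += 1


class BayesianLayer:
    """Turns raw memory counts into P(outcome | context) with Laplace smoothing."""

    def __init__(self, memory, prior=1.0):
        self.memory = memory
        self.prior = prior  # pseudo-count acting as a uniform Dirichlet prior

    def posterior(self, context, outcome, vocab_size):
        outcomes = self.memory.counts[context]
        total = sum(outcomes.values())
        return (outcomes[outcome] + self.prior) / (total + self.prior * vocab_size)


if __name__ == "__main__":
    mem = GenericMemory()
    for _ in range(8):
        mem.store("rain", "wet_ground")
    mem.store("rain", "dry_ground")

    layer = BayesianLayer(mem, prior=1.0)
    # With 2 possible outcomes: (8 + 1) / (9 + 1 * 2) = 9/11, about 0.82
    print(layer.posterior("rain", "wet_ground", vocab_size=2))

The point is that the memory itself stays dumb and general-purpose,
and all the probabilistic machinery lives in the layer above it.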

YKY