On Mon, Sep 29, 2008 at 9:38 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:

> It seems to me the main limitation is that the language model has to be
> described formally in Cycl, as a lexicon plus rules for parsing and
> disambiguation. There seems to be no mechanism for learning natural
> language by example. If Cyc receives a sentence it cannot parse, one that
> is ambiguous, or one containing a word not in its vocabulary or used in
> an unfamiliar way, there is no mechanism to update the model, which is
> something humans do easily. Given the complexity of English, I think this
> is a serious limitation with no easy solution.
**************************************

I think building the language model in Cycl is actually the right
move.  An AGI should express its language model in the same logical
formalism it uses for reasoning.  That's the only way the AGI can
learn language robustly and perform sophisticated language-related
reasoning, including meta-reasoning.
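
To make the point concrete, here is a minimal Python sketch (a
hypothetical stand-in, not Cycl and nothing like Cyc's real machinery;
kb, parse, and categories are invented names) of what you gain when the
grammar is just assertions in the same KB the reasoner works over:
parsing, meta-level queries about the grammar, and learning by example
all reduce to ordinary operations on the KB.

    # Hypothetical stand-in: grammar and lexicon are ordinary assertions
    # in the same KB the reasoner manipulates, so "learning a word" is
    # just asserting a new fact, and the system can query its own grammar.

    kb = {
        ("lex", "cats", "N"), ("lex", "sleep", "V"),
        ("rule", "NP", ("N",)), ("rule", "VP", ("V",)),
        ("rule", "S", ("NP", "VP")),
    }

    def categories(word):
        # Meta-query over the KB: which syntactic categories cover `word`?
        return {c for (k, w, c) in kb if k == "lex" and w == word}

    def parse(cat, words):
        # True iff `words` derives from `cat` under the KB's current rules.
        if len(words) == 1 and cat in categories(words[0]):
            return True
        for fact in kb:
            if fact[0] == "rule" and fact[1] == cat:
                rhs = fact[2]
                if len(rhs) == 1 and rhs[0] != cat:
                    if parse(rhs[0], words):
                        return True
                elif len(rhs) == 2:
                    for i in range(1, len(words)):
                        if parse(rhs[0], words[:i]) and parse(rhs[1], words[i:]):
                            return True
        return False

    sentence = "dogs sleep".split()
    print(parse("S", sentence))                        # False: "dogs" unknown
    print([w for w in sentence if not categories(w)])  # meta-query: ['dogs']
    kb.add(("lex", "dogs", "N"))                       # learning = one assertion
    print(parse("S", sentence))                        # True: same reasoner

The point of the sketch: because the unknown-word failure is visible to
the same reasoner that owns the grammar, fixing it is a single assertion
rather than a change to an external parser.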

I can do this in G_0, but in the bootstrap stage I can also use a
simple (brittle) NL interface to save some work.  This simple NL
interface would be jettisoned later, once the AGI can learn NL the
principled way.

YKY

