--- On Wed, 9/17/08, Abram Demski <[EMAIL PROTECTED]> wrote:

> Most people on this list should know about at least 3
> uncertain logics claiming to be AGI-grade (or close):
>
> --Pei Wang's NARS
> --Ben Goertzel's PLN
> --YKY's recent hybrid logic proposal
>
> It seems worthwhile to stop and take a look at what
> criteria such logics should be judged by. So, I'm
> wondering: what features would people on this list
> like to see?
How about testing in the applications where they would actually be used, perhaps on a small scale? For example, how would these logics be used in a language translation program, where the problem is to convert a natural language sentence into a structured representation and convert it back into another language? How easy is it to populate the database with the gigabyte or so of common sense knowledge needed to provide the context in which natural language statements are interpreted? (Cyc proved it is very hard.)

For a lot of the problems where we actually use structured data, a relational database works pretty well. However, it is nice to see proposals that deal with inconsistencies in the database better than just reporting an error.

-- Matt Mahoney, [EMAIL PROTECTED]
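To make the last point concrete, here is a minimal sketch (not NARS, PLN, or YKY's logic; all names are hypothetical illustrations) of a fact store that resolves contradictory assertions by weighing accumulated evidence, instead of rejecting the insert the way a relational database with a consistency constraint would:

```python
def revise(w_pos: float, w_neg: float) -> float:
    """Combine positive and negative evidence weights into a frequency in [0, 1]."""
    total = w_pos + w_neg
    return w_pos / total if total > 0 else 0.5  # no evidence: maximally uncertain

class FactStore:
    def __init__(self):
        # statement -> [positive evidence weight, negative evidence weight]
        self.evidence = {}

    def assert_fact(self, statement: str, truth: bool, weight: float = 1.0):
        # Contradictory assertions are accepted and accumulated, not rejected.
        pos, neg = self.evidence.get(statement, [0.0, 0.0])
        if truth:
            pos += weight
        else:
            neg += weight
        self.evidence[statement] = [pos, neg]

    def query(self, statement: str) -> float:
        pos, neg = self.evidence.get(statement, [0.0, 0.0])
        return revise(pos, neg)

store = FactStore()
store.assert_fact("penguins can fly", True, weight=1.0)   # weak default: birds fly
store.assert_fact("penguins can fly", False, weight=4.0)  # stronger counter-evidence
print(store.query("penguins can fly"))  # 0.2: the conflict is resolved, not an error
```

A plain SQL table with a uniqueness constraint on the statement column would reject the second assertion; here both survive, and the query returns a graded belief that shifts as evidence arrives.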