On Mon, Feb 4, 2019 at 4:38 PM Stefan Reich via AGI <[email protected]> wrote:
> Just read it again (tried). I'd like to understand the paper, the
> introduction is quite intriguing, but then it just goes over my head...
>
> Link Grammars sound interesting. However, I increasingly find myself
> dismissing any knowledge structure that is not directly based on natural
> language.

Umm, perhaps you are not aware that Link Grammar is both a theory of
natural language and a natural-language parser. I believe it's the most
accurate one in the world for English; it also does Russian, and has demos
for another half-dozen languages (Persian, Arabic, German, Kazakh,
Vietnamese, Lithuanian, Turkish, Hebrew).

> I want to get rid of all the baggage of maintaining specialized language
> and actually use English for everything. So my proposal for grammar rules
> is something like this:
>
> "Subject predicate object" is a good sentence structure if subject and
> predicate match in case.
> A noun alone is not a sentence.
> The word "a" + an adjective + a noun makes a new noun.
> A question should end with a question mark.

Well, of course, that is not how English actually works.

> Why not make a logic engine that takes rules like these and reasons about
> them? (The problem of expressing English grammar in English is obviously
> recursive, but it's not so bad; the rules can be parsed with fairly basic
> functions.)

That is what we are trying to do with the OpenCog AtomSpace. We now have a
generic logic engine that reasons about generic rules, and it mostly
kind-of works. Since it has been completed, I have realized that there are
better ways of doing this, but... there are some arcane, fiddly details.

FWIW, you can think of the link-grammar "disjuncts" as a kind of "rule".
Parsing is the assembly of rules; the general mental picture is that of
assembling puzzle pieces.
Here is a less technical explanation of the idea:
https://github.com/opencog/atomspace/raw/master/opencog/sheaf/docs/sheaves.pdf

I thought I did a decent job of explaining things, but I get the impression
I over-simplified, because the general feedback was that it's "trivial".
Which it isn't. But whatever.

--linas

> On Mon, 4 Feb 2019 at 23:21, Linas Vepstas <[email protected]> wrote:
>>
>> On Mon, Feb 4, 2019 at 6:02 AM Stefan Reich via AGI <[email protected]>
>> wrote:
>>
>>> > Many commentators here agreed (over time) how agi development
>>> requires a radically-different approach to all other computational
>>> endeavors to date.
>>>
>>> Not sure what that means. A really good NLU will go a very long way, and
>>> then we'll have to find a new "magic learner" module that replaces neural
>>> networks, both for image/audio recognition and learning logic. I suggest
>>> evolutionary algorithms.
>>
>> Here's my "magic learner" proposal. Actually, it is much less than that;
>> it just shows how symbolic computing and neural-net computing are two sides
>> of the same coin. The idea is that once you see the correspondence, then
>> you have a clear path to the kind of symbolic computing that lots of people
>> want to do, and a way of uncloaking the "black box" aspects of neural nets.
>>
>> https://github.com/opencog/opencog/raw/master/opencog/nlp/learn/learn-lang-diary/skippy.pdf
>>
>> FYI, so far, everyone I have shown this to has replied by saying "I read
>> it but I skipped the math", which is an odd thing to do, since it's
>> essentially a math paper. The whole point is that, if you want to
>> understand how neural nets and symbolic learning can be placed on the same
>> footing, then you have to understand how both systems work, and "skipping
>> the math" is equivalent to "skipping the actual explanation".
>>
>> (I used to have a non-technical way of explaining this, but everyone who
>> read that was underwhelmed.)
>>
>> --linas
>>
>> --
>> cassette tapes - analog TV - film cameras - you
>
> --
> Stefan Reich
> BotCompany.de // Java-based operating systems
> *Artificial General Intelligence List <https://agi.topicbox.com/latest>*
> Permalink:
> https://agi.topicbox.com/groups/agi/Ta6fce6a7b640886a-M1dbb7f081eb5880bd007b6a5

--
cassette tapes - analog TV - film cameras - you

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/Ta6fce6a7b640886a-M165582f3f72fc3638d701b67
Delivery options: https://agi.topicbox.com/groups/agi/subscription
