Yeah, good that you bring that up! Linas and I have read Coecke's papers on this stuff and discussed them a few times.... Indeed we are well positioned to explore these sorts of ideas computationally...
I suppose the value of the morphism you cite, in a practical context, would be this: if patterns are mined among the syntactic structures of sentences, they can be mapped into the semantic domain... and vice versa. So e.g. if we find X+Y is roughly equal to Z in the domain of semantic vectors, we can map this back into relations between the syntactic structures corresponding to X, Y and Z... and this mapping may be used to adjust the probabilities of "grammar rules" or to suggest new grammar rules. In this way semantic patterns could be morphically mapped back to suggest syntactic patterns...

-- Ben

On Sun, Apr 2, 2017 at 8:01 PM, Jesús López <[email protected]> wrote:
> Dear Dr. Goertzel and contributors,
>
> You could also enrich the distributional ideas giving support to
> compositionality in another way. In your arxiv:1703.04368 you link a pregroup
> grammar parse tree of a sentence to a morphism in a symmetric monoidal
> category. In work by Coecke, Clark and others, a categorial grammar parse
> tree is associated to a morphism in the category of linear maps, which is
> monoidal with the good old linear-algebra tensor product. This morphism is a
> tensor network that corresponds naturally to the categorial grammar parse
> tree, where ground types such as nouns correspond to vectors obtained by a
> distributional method such as word2vec, and compound types of words such as
> verbs correspond to higher-rank tensors. That’s why they call it the DisCoCat
> (distributional, compositional, categorical) model. While theoretically nice,
> I think that computationally it is still a work in progress, though, from the
> point of view of getting hands-on and starting to code.
>
> You can browse slides from some of Stephen Clark's talks on this here:
> https://sites.google.com/site/stephenclark609/talks
>
> Warm regards, Jesus Lopez.
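To make the quoted construction concrete, here is a toy numpy sketch of the DisCoCat contraction: nouns as vectors, a transitive verb as a rank-3 tensor, and the sentence meaning obtained by contracting the verb with its subject and object. The dimension, the random stand-in vectors (real use would plug in word2vec embeddings), the (subject, sentence, object) index convention, and the cosine helper are all assumptions for illustration, not from the thread.

```python
# Toy DisCoCat-style sketch: random stand-ins for distributional embeddings.
import numpy as np

rng = np.random.default_rng(0)
d = 4                                    # toy embedding dimension (assumed)

subject = rng.normal(size=d)             # noun vector, e.g. "dogs"
obj     = rng.normal(size=d)             # noun vector, e.g. "cats"
verb    = rng.normal(size=(d, d, d))     # transitive verb as a rank-3 tensor

# Sentence meaning: contract the verb tensor with the subject and object
# vectors, leaving the middle (sentence-type) index free.
sentence = np.einsum("i,ijk,k->j", subject, verb, obj)

def cos(u, v):
    """Cosine similarity, for comparing meaning vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# The X+Y ~ Z idea above would then amount to something like: if
# cos(vec_X + vec_Y, vec_Z) is high, flag the syntactic structures behind
# X, Y and Z as candidates for a new or re-weighted grammar rule.
```

The same contraction pattern extends to longer sentences: each word's tensor rank is fixed by its grammatical type, and the parse tree dictates which indices get contracted.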
>
>
> On Sunday, 26 March 2017 18:44:10 UTC+2, Ben Goertzel wrote:
>>
>> Linas,
>>
>> I thought a bit about how to use a modified version of the word2vec
>> idea in our language learning pipeline...
>>
>> I'm thinking about the Skip-gram model of word2vec, as summarized
>> informally e.g. here:
>>
>> http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
>>
>> Following up the suggestion you made in Addis in our chat with
>> Masresha, I'm thinking of replacing the "adjacent word-pairs" used in
>> word2vec with "word-pairs that are adjacent in the parse tree" (where
>> e.g. the parse tree may be the max-weight spanning tree in our
>> language learning algorithm)...
>>
>> This would still produce a vector just as word2vec does, via the
>> hidden layer of the NN... but the vector would likely be more
>> meaningful than a typical word2vec vector...
>>
>> What would the purpose of this be, in the context of our language
>> learning algorithm? The purpose would be that clustering should work
>> better on the word2vec vectors than on the rawer data regarding "word
>> co-occurrence in parse trees." At least, that seems plausible, since
>> clustering on word2vec vectors generally works better than on
>> co-occurrence vectors.
>>
>> This would be something that Masresha or someone else in Addis could
>> work on, I think...
>>
>> We can discuss at the office this week...
>>
>> ben
>>
>>
>> --
>> Ben Goertzel, PhD
>> http://goertzel.org
>>
>> “Our first mothers and fathers … were endowed with intelligence; they
>> saw and instantly they could see far … they succeeded in knowing all
>> that there is in the world. When they looked, instantly they saw all
>> around them, and they contemplated in turn the arch of heaven and the
>> round face of the earth. … Great was their wisdom …. They were able to
>> know all....
>>
>> But the Creator and the Maker did not hear this with pleasure. … ‘Are
>> they not by nature simple creatures of our making?
Must they also be
>> gods? … What if they do not reproduce and multiply?’
>>
>> Then the Heart of Heaven blew mist into their eyes, which clouded
>> their sight as when a mirror is breathed upon. Their eyes were covered
>> and they could see only what was close, only that was clear to them.”
>>
>> — Popol Vuh (holy book of the ancient Mayas)
>
> --
> You received this message because you are subscribed to the Google Groups
> "opencog" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To post to this group, send email to [email protected].
> Visit this group at https://groups.google.com/group/opencog.
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/opencog/fececc5f-f40e-4cc0-8d0f-9361c5750265%40googlegroups.com.
>
> For more options, visit https://groups.google.com/d/optout.

--
Ben Goertzel, PhD
http://goertzel.org

"I am God! I am nothing, I'm play, I am freedom, I am life. I am the
boundary, I am the peak." -- Alexander Scriabin
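The tree-adjacent skip-gram idea quoted above can be sketched as a pair-extraction step: instead of emitting window-adjacent word pairs, emit a (center, context) pair for each edge of the parse tree. The function name and the tree representation (a list of edges over token indices) are illustrative assumptions; the resulting pairs could then be fed to any word2vec-style trainer in place of the usual window-based pairs, leaving the rest of the pipeline unchanged.

```python
# Sketch: training pairs from parse-tree adjacency rather than linear
# adjacency. The edge-list tree format is an assumption for illustration.
def tree_skipgram_pairs(tokens, edges):
    """Yield (center, context) word pairs for every tree edge, both
    directions, so each endpoint serves once as center and once as context."""
    for i, j in edges:
        yield (tokens[i], tokens[j])
        yield (tokens[j], tokens[i])

# Toy sentence with a hand-made spanning tree over token indices,
# rooted (arbitrarily) at the verb.
tokens = ["the", "dog", "chased", "a", "cat"]
edges = [(2, 1), (1, 0), (2, 4), (4, 3)]

pairs = list(tree_skipgram_pairs(tokens, edges))
# 4 edges -> 8 directed (center, context) pairs, e.g. ("chased", "dog").
```

Note that a linear window of size 1 would pair "chased" with "dog" and "a", whereas the tree pairs it with "dog" and "cat", which is exactly the hoped-for gain in meaningfulness.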
