Link grammar is equivalent to pregroup grammar; it's not a very
restrictive formalism if one throws out the hand-coded dictionaries, as we
are now doing...

Sure, I will look at what Pissanetzky has done...

On Tue, Feb 19, 2019 at 6:40 AM Rob Freeman <chaotic.langu...@gmail.com> wrote:
>
> Ben,
>
> That's what I thought. You're still working with Link Grammar.
>
> But since last year you have been working on informing your links with
> statistics from deep-NN-style, learned, embedding-vector-based predictive
> models? You're trying to span the weaknesses of each formalism with the
> strengths of the other??
>
> There's a lot to say about all of that.
>
> Your grammar will be learned, with only the resolution you bake in from the
> beginning. Your embedding vectors will be learned, and the dependency
> decisions they can inform will be learned, and thus finite, too. Plus you
> need to keep two formalisms and marry them together... Large teams are
> needed for all of that...
>
> On the plus side, large teams have already been working on those formalisms
> for decades. An asymptote plots the final failure of learning-based methods,
> but after decades their development is already far down whatever asymptote,
> so you start right at the top of the tallest tree. No one will complain about
> your performance, because it is all anyone achieves.
>
> So, state-of-the-art. But complex, and doomed to asymptotic failure, as ever
> more comprehensive learning fails, ever more definitively, to capture every
> Zipf long tail.
>
> Me, it's simple.
>
> You make the embedding vectors generative by substituting them into each
> other. Infinite patterns. But the patterns are all meaningful, because the
> substitution is meaningful (it's the very basis of all embedding vectors).
> You get hierarchy, with all that implies about dependency and grammar, for
> free, as a natural consequence of the substitution process (it's
> non-associative).
>
> And I now think the "substituting them into each other" step may be as simple 
> as setting a network of observed sequences oscillating.
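
A minimal sketch of this substitution idea, using toy NumPy vectors as
stand-in embeddings and an arbitrary non-associative combining operator (the
operator and word list are illustrative assumptions, not the actual mechanism
described above):

    # Toy illustration: compose stand-in "embedding" vectors with a
    # deliberately non-associative operator, so different bracketings of the
    # same word sequence yield different vectors -- hierarchy falls out of
    # the composition order itself.
    import numpy as np

    rng = np.random.default_rng(0)
    DIM = 8

    # Hypothetical embeddings for three words (random stand-ins).
    emb = {w: rng.normal(size=DIM) for w in ("the", "dog", "barked")}

    def compose(a, b):
        # Substitute b "into" a.  Rolling one argument before adding breaks
        # associativity: compose(compose(a, b), c) != compose(a, compose(b, c)).
        return np.tanh(a + np.roll(b, 1))

    left = compose(compose(emb["the"], emb["dog"]), emb["barked"])   # ((the dog) barked)
    right = compose(emb["the"], compose(emb["dog"], emb["barked"]))  # (the (dog barked))

    print(np.allclose(left, right))  # False: the bracketing is encoded in the result

The only point of the sketch is that a non-associative combiner makes the
bracketing of a sequence recoverable from the composed result; which
particular operator does the composing is left open.
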
>
> As you say, 2013 is a long time ago. When I was pitching embedding vector
> models in 2013 (let alone 2000), they were not mainstream. Now they are.
>
> If you ask me whether I feel vindicated, the answer is yes.
>
> But vindication is hollow. We still don't have what I was also pitching
> back then: vector recombination to generate new, meaningful patterns,
> rather than merely learning patterns.
>
> No large teams working on this yet, so it is still crude. In particular it 
> probably requires parallel hardware.
>
> Anyway, if you don't want to try this pattern creation idea for language, I 
> suggest you look at what Pissanetzky has done. That is more readily 
> interpretable in terms of vision. For vision the generative aspect is not so 
> obvious. I'm not sure Pissanetzky realizes his permutation "invariants" will 
> need to be constantly generated too. But by using permutation as his base, 
> the machinery is all there. Permutation is a generative process.
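
A toy sketch of permutation as a generative process; the adjacency-counting
"invariant" below is an illustrative stand-in, not Pissanetzky's actual
construction:

    # Toy illustration: permutations of a short sequence are the generated
    # patterns; group them by a simple invariant (how many adjacencies of
    # the original ordering each permutation preserves).
    from itertools import permutations
    from collections import defaultdict

    sequence = ("A", "B", "C", "D")
    original_pairs = set(zip(sequence, sequence[1:]))

    groups = defaultdict(list)
    for p in permutations(sequence):
        preserved = len(set(zip(p, p[1:])) & original_pairs)
        groups[preserved].append(p)

    for k in sorted(groups, reverse=True):
        print(k, len(groups[k]))  # invariant value -> how many generated patterns share it
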
>
> -Rob
>
> On Mon, Feb 18, 2019 at 9:26 PM Ben Goertzel <b...@goertzel.org> wrote:
>>
>> 2013 seems an insanely long time ago ;) ...  we started with these ideas
>>
>> https://arxiv.org/abs/1401.3372
>>
>> https://arxiv.org/abs/1703.04368
>>
>> but have gone some way since... last summer's partial update was
>>
>> https://www.youtube.com/watch?v=ABvopAfc3jY
>>
>> http://agi-conf.org/2018/wp-content/uploads/2018/08/UnsupervisedLanguageLearningAGI2018.pdf
>>
>> But since last summer we have onboarded a new team that does deep-NN
>> language modeling and we are experimenting with using the output of
>> deep-NN predictive models to guide syntactic parsing and semantic
>> interpretation in OpenCog...
>>
>> -- Ben
>



-- 
Ben Goertzel, PhD
http://goertzel.org

"Listen: This world is the lunatic's sphere,  /  Don't always agree
it's real.  /  Even with my feet upon it / And the postman knowing my
door / My address is somewhere else." -- Hafiz
