Andres,

I just happened to see your post. The title of Yuret's paper is "Lexical 
Attraction Models of Language".

Also, I've had luck with removing cookies on some websites to reset download 
limits, so you can try that as well; it only works for certain websites, though.

—matt

> On May 6, 2019, at 1:36 PM, Andres Suarez <[email protected]> wrote:
> 
> 
> 
> On Mon, May 6, 2019, 13:30 Ben Goertzel <[email protected]> wrote:
> 
> 
> On Sun, May 5, 2019 at 10:15 PM Anton Kolonin @ Gmail <[email protected]> wrote:
> Hi Linas, I am re-reading your emails and updating our TODO issues from some 
> of them.
> 
> Not sure about this one:
> 
> >Did Deniz Yuret falsify his thesis data? He got better than 80% accuracy; we 
> >should too.
> 
> I don't recall Deniz Yuret comparing MST-parses to LG-English-grammar-parses.
> 
> 
> Linas: Where does the > 80% figure come from?
> 
> This paper of Yuret's:
> 
> http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.129.5016&rep=rep1&type=pdf
> 
> cites 53% accuracy compared against "dependency parses derived from 
> dependency-grammar-izing Penn Treebank parses on WSJ text"... It was 
> written after his PhD thesis. Is there more recent work by Yuret that gives 
> massively better results? If so, I haven't seen it.
> 
> Ben, what's the title of this paper? The link gives me a message about 
> exceeding my daily paper download limit (??). I'm not sure how they evaluate 
> in that paper, but I just wanted to mention that Yuret's thesis evaluates only 
> links between content words, so that should indeed be taken into account when 
> comparing against the ULL results.
> 
> 
> Spitkovsky's more recent work on unsupervised grammar induction seems to have 
> gotten better statistics than this, but it used radically different methods.
> 
> 
> 
> a) Seemingly "worse than LG-English" sequential parses yield a seemingly 
> better "LG grammar"; that may be some mistake, so we will have to 
> double-check this.
> Anton, Sergey, the attached image comes from another Yuret paper 
> (http://www.denizyuret.com/2006/06/dependency-parsing-as-classification.html?m=1). 
> Although English is missing, it could help explain why sequential parses 
> score well against MST-parses, related to our discussion during today's call.
> 
> 
> 
> Anton -- Have you looked at the inferred grammar for this case, to see how 
> much sense it makes conceptually?
> 
> Using sequential parses is basically just using co-occurrence rather than 
> syntactic information.
> 
> I wonder what would happen if you used *both* the sequential parse *and* some 
> fancier hierarchical parse as inputs to clustering and grammar learning? 
> I.e., don't throw out the simple before-and-after co-occurrence information, 
> but augment it with information from the statistically inferred dependency 
> parse tree...
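Ben's augment-don't-replace idea could be sketched roughly like this. A toy 
sketch under my own assumed data formats (sentences as word lists, MST parses 
as word-index edge lists); this is not the actual pipeline code:

```python
from collections import Counter, defaultdict

def sequential_edges(words):
    # Co-occurrence links: each word linked to its immediate right neighbor.
    return [(i, i + 1) for i in range(len(words) - 1)]

def context_counts(sentences, mst_parses):
    # Merge both link sources -- sequential adjacency and MST edges -- into
    # one bag of (word, neighbor) counts, usable as feature vectors for
    # word clustering.
    counts = defaultdict(Counter)
    for words, mst_edges in zip(sentences, mst_parses):
        for i, j in sequential_edges(words) + list(mst_edges):
            counts[words[i]][words[j]] += 1
            counts[words[j]][words[i]] += 1
    return counts

sents = [["the", "cat", "sat"]]
msts = [[(0, 1), (1, 2)]]  # toy MST edges as (head, dependent) index pairs
print(context_counts(sents, msts)["cat"]["the"])  # -> 2 (one from each source)
```

Where the two sources agree, the link count simply doubles; where the MST parse 
adds a long-distance link, it shows up as extra context the sequential parse 
alone would miss.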
> 
> 
> 
> 
> -- Ben
> 
> -- 
> You received this message because you are subscribed to the Google Groups 
> "lang-learn" group.
> To unsubscribe from this group and stop receiving emails from it, send an 
> email to [email protected].
> To post to this group, send email to [email protected].
> To view this discussion on the web visit 
> https://groups.google.com/d/msgid/lang-learn/CACYTDBeAMobEMwiWUL8xbTRZFsLiJ0gtLQJi%3D4xo60rJyX2y9A%40mail.gmail.com.
> For more options, visit https://groups.google.com/d/optout.
> 
> a.
> 
> <Screenshot_20190507-031931.jpg>

-- 
You received this message because you are subscribed to the Google Groups 
"opencog" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/opencog.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/opencog/5BCD41CA-7926-4778-94E9-CBEF70739721%40gmail.com.
For more options, visit https://groups.google.com/d/optout.
