Firstly, you should write to the CG mailing list instead of CC'ing us all -
https://groups.google.com/forum/#!forum/constraint-grammar /
[email protected] - I have done so with this reply.

Anyway, I have been saying this for a long time. The past decade of machine
learning has simply approached the hand-written method. E.g., when Google
published
https://research.googleblog.com/2016/05/announcing-syntaxnet-worlds-most.html
my immediate comment was: Their induced rules smell an awful lot like
constraint grammar, just expressed in vector fields.

Other papers in the field even explicitly say that their models look a lot
more like classic systems, with separate source language analysis,
transfer, and target language generation. And this holds for any
text-to-text transformation, not just translation. So I am not the least
bit surprised that more advanced models look more and more like rule-based
systems.

-- Tino Didriksen


On Wed, 26 Feb 2020 at 14:26, Tiedemann, Jörg <[email protected]>
wrote:

> Dear CG community,
>
>
> I am reaching out to you because we have the idea of following up on Anssi
> Yli-Jyrä’s ideas on comparing CG to transformer models, to see whether
> there are commonalities between expert-made linguistic grammars and
> learned neural language models. This is a fascinating question, and we
> would like to carry out some empirical studies to find possible
> correlations and patterns.
>
> It would be great to get an update on available CG resources to get
> started, and it would also be interesting to hear whether any of you would
> be interested in collaborating on that study. What I had in mind was to
> look at the disambiguation process performed on real-world data by CG-based
> parsers and compare it with the activations triggered in trained neural
> language models.
>
> It would be excellent to know whether there are some (hopefully freely
> available) wide-coverage grammars and parsers that we can study. Most
> likely, we will need to look at high-resource languages (including
> Finnish) to make proper comparisons to neural models, but other scenarios
> are possible as well. Please let me and Anssi know whether you have any
> suggestions. Thanks a lot!
>
>
> All the best,
> Jörg
>
