On Mon, 19 Oct 2020, 19:58, Xavi Ivars <xavi.iv...@gmail.com> wrote:

> Well, that's only "part" of the corpus... and for the Europarl, that part
> of the corpus was not left "as is" after Apertium, but was also postedited.
>

Wow! Did you postedit the whole Europarl corpus?! Whether or not you used
Apertium, it's clear that you did a huge amount of work. If it is explained
somewhere how Softcatalà did the work, and with what resources (time,
volunteers, money), please let us know. It would make an excellent test case
to show whether a (truly) under-resourced language can or cannot build what
is needed for neural translation.
And again, congrats!
Hèctor


> The talk was specifically about eng-cat, and in that case, for the NMT
> model, Apertium was not involved.
> --
> < Xavi Ivars >
> < http://xavi.ivars.me >