Oleg,

If you’re looking for rule-based approaches, I would also encourage you to
look at explicitly rule-based systems like Apertium.
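
Also, for the non-syntax hierarchical model, Moses's train-model.perl
normally expects a plain tokenized parallel corpus (one sentence per line,
no <tree> markup), with file names derived from -corpus/-f/-e, i.e.
corpus.ne and corpus.en. A minimal sketch of what that might look like for
your toy reversed-word language (the English side here is my assumption):

```shell
# Hypothetical toy parallel corpus for "hierarchical phrase-based:
# no linguistic syntax". Plain tokenized text, one sentence per line,
# with corpus.ne (source) and corpus.en (target) line-aligned.
cat > corpus.ne <<'EOF'
olleh nhoj
eyb yram
EOF
cat > corpus.en <<'EOF'
hello john
bye mary
EOF
```

The <tree label="...">...</tree> annotation would only be needed for the
syntax variants.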

On Fri, Mar 18, 2022 at 2:42 AM Oleg Parashchenko <o...@uucode.com> wrote:

> Hello Hieu,
>
> On Thu, 17 Mar 2022 23:09:25 -0700
> Hieu Hoang <hieuho...@gmail.com> wrote:
>
> > the training does add glue rules for non-syntax models but you're using
> > a variant with syntax where you have to do it yourself.
>
> I don't want to use a syntax variant; I want to try "hierarchical
> phrase-based: no linguistic syntax". Please suggest what I'm doing wrong
> with my corpus and training parameters.
>
> Here is the corpus:
>
> ```
> <tree label="N">nhoj</tree>
> <tree label="N">yram</tree>
> <tree label="V">olleh</tree>
> <tree label="V">eyb</tree>
> <tree label="VP"><tree label="V">olleh</tree> <tree label="N">hnoj</tree></tree>
> ```
>
> Here is the training step:
>
> ```
> opt/moses/scripts/training/train-model.perl \
>   -corpus corpus \
>   -f ne -e en \
>   -lm 0:2:$(pwd)/lm.en \
>   -alignment grow-diag-final-and \
>   -external-bin-dir /opt/tools \
>   -mgiza \
>   -hierarchical
> ```
>
> >
> > Download the small sample model and look in the tree-to-tree example to
> > see how it's done
> >
> >     http://www.statmt.org/moses/download/sample-models.tgz
>
> Unfortunately, the samples contain only the resulting models, without
> the data for the training step.
>
> > Also, I hope you realise that these are research models. They don't
> > produce good results in practice and have largely been overtaken by
> > neural models.
>
> That's OK; for now I'm experimenting with rule-based approaches for
> low-resource languages.
>
> Regards,
> Oleg
> _______________________________________________
> Moses-support mailing list
> Moses-support@mit.edu
> https://mailman.mit.edu/mailman/listinfo/moses-support
>