Do you mean "reordering model" in general? If so, it must be used unless
the preprocessor "completely got rid of local/global distortion". 

On the other hand, some specific reordering models, such as the
Lexicalized Reordering Model in Moses, may not be useful. Distance-based
reordering doesn't handle global reordering well; rather, it relocates
phrases within a certain boundary (the distortion limit).
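For illustration, here is a minimal sketch (plain Python, not Moses code)
of a distance-based distortion cost: the penalty depends only on the jump
between adjacent phrases, so it says nothing about the global structure of
the sentence. The spans below are hypothetical.

    # Sketch of a Moses-style linear distortion cost (illustration only).
    def distortion_cost(prev_end, cur_start):
        # |start of current phrase - end of previous phrase - 1|
        return abs(cur_start - prev_end - 1)

    # Hypothetical source coverage: (start, end) spans, 0-based inclusive.
    spans = [(0, 1), (4, 5), (2, 3)]
    prev_end = -1
    total = 0
    for start, end in spans:
        total += distortion_cost(prev_end, start)
        prev_end = end
    print(total)  # 0 + 2 + 4 = 6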

Although much research has been proposed for global reordering, it is
still an unsolved problem.
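Coming back to the lexicalized reordering model mentioned above: it only
learns, for each phrase pair, an orientation relative to the previously
translated phrase (monotone, swap, or discontinuous), which is again a
fairly local signal. A minimal sketch of that orientation classification,
again with hypothetical 0-based source spans:

    # Sketch of MSD orientation classification (illustration, not Moses code).
    def orientation(prev_span, cur_span):
        prev_start, prev_end = prev_span
        cur_start, cur_end = cur_span
        if cur_start == prev_end + 1:
            return "monotone"       # current phrase directly follows the previous one
        if cur_end == prev_start - 1:
            return "swap"           # current phrase directly precedes it
        return "discontinuous"      # everything else

    print(orientation((0, 1), (2, 3)))   # monotone
    print(orientation((2, 3), (0, 1)))   # swap
    print(orientation((0, 1), (4, 5)))   # discontinuous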
-- 
Hwidong Na <[email protected]>
KLE lab, POSTECH, KOREA


2010-02-22 (Mon), 12:49 +0000, Carlos Henriquez:
> The reordering model tries to deal with global reordering rather than
> local reordering because the latter is generally solved with phrase
> generation.
> 
> So it depends on the language pair used. Cases like Catalan-Spanish
> may omit the use of reordering models because they are very much
> alike. English-Spanish or Chinese-English, on the other hand, consider
> the reordering model as a "must-be-used" component. Phrase generation
> will not be able to deal with reordering alone there.
> 
> You may read more on the Lexicalized Reordering Model here
> 
> http://www.statmt.org/moses/?n=Moses.AdvancedFeatures#ntoc1
> 
> or you may go to the Tillmann paper here 
> 
> http://portal.acm.org/citation.cfm?id=1613984.1614010
> 
> Which languages are you working on? 
>  
> --
> Carlos A. Henríquez Q.
> [email protected]
> 
> 
> 
> 
> ______________________________________________________________________
> From: Calia <[email protected]>
> To: [email protected]
> Sent: Mon, 22 February 2010 05:36
> Subject: [Moses-support] Is reordering model a "must-be-used" component
> to use?
> 
> 
> I wonder if I may exclude the reordering model during search.
> 
> Since I came up with a morpho-syntactic preprocessor that transforms the
> source language at both training time and run time, and the reordering
> model deals with the local reordering of words during translation, there
> doesn't seem to be any need for it if the preprocessor completely got
> rid of local/global distortion, even for the language model.
> 
> Is my hypothesis justified?
> 
> Actually, from my subjective evaluation, using the phrase table and LM
> alone shows better results than using them together with the reordering
> model. But I am not sure about my theory.
> 
> P.S. I've purchased Koehn's new book on SMT, which I find gigantically
> lovely! Thanks 
> 

_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support
