Did you smooth the probabilities in the regular phrase table? This usually
adds around 0.3 BLEU.
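For reference, smoothing of the phrase-table scores can be enabled when (re)building the model with the standard Moses training script via `-score-options`; the sketch below uses Good-Turing smoothing, and all paths and language codes are placeholders:

```shell
# Sketch: rebuild the phrase table with Good-Turing smoothed probabilities.
# Corpus paths, language pair and GIZA++ binary directory are placeholders.
train-model.perl \
    -root-dir train \
    -corpus corpus/train -f fr -e en \
    -alignment grow-diag-final-and \
    -score-options "--GoodTuring" \
    -external-bin-dir /path/to/giza-bin
```

Kneser-Ney smoothing (`--KneserNey`) is an alternative to `--GoodTuring` in the same option.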


On 15 February 2013 10:32, Mirkin, Shachar <[email protected]> wrote:

> Hi,
>
> I’ve been trying for a while to use incremental training, but I’m running
> into quite a few issues.
>
> The updates seem to be working fine, as is evident from updates containing
> OOVs, but performance when using the dynamic suffix array is much worse (by
> several BLEU points) than with the regular Moses server on the same dataset
> and with the same model.
>
> In both cases I used a model trained with incremental GIZA, so this seems
> like an effect of the suffix array rather than of a different alignment
> model.
>
> I was not making any updates to the server in these experiments.
>
> Could this be the result of a limit on the memory used when the suffix
> array is loaded (my corpus contains 1M sentence pairs)? Any other ideas
> about the cause of the drop in performance?
>
> Concerning updates, is there a way to change the incremental GIZA
> parameters, such as the interpolation parameter gamma?
>
> Lastly, when my ini file for loading the server in suffix array mode
> contains the complete reordering table of the trained model (rather than
> the one filtered for the test set), it takes forever to load. Any
> suggestions?
>
> Thanks a lot,
>
> Shachar
>
> _______________________________________________
> Moses-support mailing list
> [email protected]
> http://mailman.mit.edu/mailman/listinfo/moses-support
>
>


-- 
Hieu Hoang
Research Associate
University of Edinburgh
http://www.hoang.co.uk