Hello Raj,
can you please clarify whether you tried to train a monolingual LM
(NeuralLM), a bilingual LM (BilingualNPLM), or both? Our previous
experience with BilingualNPLM is mixed: we observed improvements for
some tasks and language pairs, but not for others. See for instance:
Alexandra Birch, Matthias Huck, Nadir Durrani, Nikolay Bogoychev and
Philipp Koehn. 2014. Edinburgh SLT and MT System Description for the
IWSLT 2014 Evaluation. Proceedings of IWSLT 2014.
To help with debugging, you can check the scores in the n-best lists
produced during tuning. If the NPLM feature assigns much higher costs
than a KenLM model trained on the same data, this can indicate that
something went wrong during training.
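To make this check concrete, here is a small sketch that parses Moses n-best lines ("sent_id ||| hypothesis ||| features ||| total") and averages a named feature across entries. The feature names ("LM0", "NPLM0") are placeholders and must match whatever names appear in your moses.ini:

```python
from collections import defaultdict

def parse_nbest_line(line):
    """Parse one Moses n-best line into (sent_id, hypothesis, features, total).

    Expected format: sent_id ||| hypothesis ||| name0= v v name1= v ||| total
    """
    fields = [f.strip() for f in line.split("|||")]
    sent_id, hyp, feat_str, total = fields[0], fields[1], fields[2], float(fields[3])
    features = defaultdict(list)
    current = None
    for tok in feat_str.split():
        if tok.endswith("="):
            current = tok[:-1]          # start of a new named feature
        else:
            features[current].append(float(tok))
    return int(sent_id), hyp, dict(features), total

def mean_feature(lines, name):
    """Average the first value of feature `name` over all n-best entries."""
    vals = [parse_nbest_line(l)[2][name][0] for l in lines]
    return sum(vals) / len(vals)
```

Comparing, say, mean_feature(lines, "NPLM0") against mean_feature(lines, "LM0") gives a quick sanity check: both are log-probabilities over the same hypotheses, so a dramatically worse NPLM average points at a training problem.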
best wishes,
Rico
On 06.07.2015 14:29, Raj Dabre wrote:
Dear all,
I have checked out the latest version of moses and nplm and compiled
moses successfully with the --with-nplm option.
I got a ton of warnings during compilation, but in the end it all
worked out and all the desired binaries were created. Simply executing
the moses binary told me that the BilingualNPLM and NeuralLM features
were available.
I trained an NPLM model based on the instructions here:
http://www.statmt.org/moses/?n=FactoredTraining.BuildingLanguageModel#ntoc33
The corpus I used was about 600k lines (Chinese-Japanese; the target
is Japanese).
I then integrated the resulting language model (after 10 iterations)
into the decoding process via moses.ini.
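For reference, a minimal sketch of how the two LM features are typically declared in moses.ini (feature names, orders, and paths below are placeholders, not my actual configuration):

```ini
[feature]
# KenLM baseline trained on the same target data, for comparison
KENLM name=LM0 factor=0 order=5 path=/path/to/kenlm.arpa
# NPLM model; the name here must match the weight entry below
NeuralLM name=NPLM0 factor=0 order=5 path=/path/to/nplm.model

[weight]
LM0= 0.5
NPLM0= 0.5
```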
I initiated tuning (standard parameters) and I got no errors, which
means that the neural language model (NPLM) was recognized and queried
appropriately.
I also ran tuning without a language model.
The strange thing is that the tuning and test BLEU scores for both
these cases are almost the same. I checked the weights and saw that
the LM was assigned a very low weight.
On the other hand, when I used KenLM on the same data, I got
comparatively higher BLEU scores.
Am I missing something? Am I using the NeuralLM in an incorrect way?
Thanks in advance.
--
Raj Dabre.
Doctoral Student,
Graduate School of Informatics,
Kyoto University.
CSE MTech, IITB., 2011-2014
_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support