Hi,

I have been using interpolated language models built using IRSTLM (5.70.04)
and the latest version of Moses downloaded from GitHub (I downloaded it
yesterday).

However, as decoding progresses, I see a rapid increase in memory
consumption. I was translating 1000 sentences, and memory consumption reached
about 13 GB after translating only around 200 of them.

The decoding process is also very slow: it takes about 18-20 seconds to
translate a sentence longer than 10 words.

I observed this with both the multi-threaded and single-threaded versions of
the Moses decoder.

Is there a memory leak somewhere, or am I doing something wrong?

The following is my moses.ini entry for the interpolated LM:

# language models: type(srilm/irstlm), factors, order, file
[lmodel-file]
1 0 5 /home/pbanerjee/smt-work/incremental_LM_Exp/mt/l_models/interp.wt.final

My interpolated language model configuration file (interp.wt.final) is:

LMINTERPOLATION 2
0.439053 /home/pbanerjee/smt-work/incremental_LM_Exp/mt/l_models/forum.lm
0.560947 /home/pbanerjee/smt-work/incremental_LM_Exp/mt/l_models/tm.lm

Also, with the latest version of IRSTLM, the documentation
(http://sourceforge.net/apps/mediawiki/irstlm/index.php?title=LM_interpolation)
says the initial configuration file for interpolate-lm should be in the
following format:

LMINTERPOLATION 3
0.3 lm-file1
0.3 lm-file2
0.4 lm-file3

But when I pass such a config to interpolate-lm, it complains:

Reading interp.wt.init...
Wrong input format.

It only works when I drop the 'LMINTERPOLATION' header from the config
file.
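In other words, the only format interpolate-lm accepted for me was the
header-less variant of the same file (using the placeholder file names from
the documentation example above):

0.3 lm-file1
0.3 lm-file2
0.4 lm-file3

Is this a documentation bug, or has the expected format changed between
versions?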

Thanks and Regards,

Pratyush Banerjee
_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support
