Re: [Moses-support] Bilingual neural lm, log-likelihood: -nan

2015-09-21 Thread Rico Sennrich
08:58:16 +0100 To: Nikolay Bogoychev, jian zhang Cc: moses-support@mit.edu

Re: [Moses-support] Bilingual neural lm, log-likelihood: -nan

2015-09-21 Thread Barry Haddow
Hi Jian, you could also try using dropout. Adding something like --dropout 0.8 --input_dropout 0.9 --null_index 1 to the nplm training command can help; look at your vocabulary file to see what the null index should be set to. This works with the Moses version of nplm. Cheers, Barry. On 21/09/15 08:45,
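As a rough illustration of what input dropout does (a sketch of the general technique, not nplm's actual implementation; the function and parameter names here are my own, and whether nplm's flag values are keep or drop probabilities is not asserted), each input token is randomly replaced by the null token during training:

```python
import random

def input_dropout(tokens, keep_prob, null_index, seed=None):
    """Replace each input token with the null token with probability
    (1 - keep_prob). A simplified sketch of input dropout as a
    regularizer; nplm's internals may differ."""
    rng = random.Random(seed)
    return [t if rng.random() < keep_prob else null_index for t in tokens]

# Sanity checks: keep_prob=1.0 keeps everything, keep_prob=0.0 nulls everything.
tokens = [5, 7, 9, 11]
assert input_dropout(tokens, 1.0, null_index=1) == tokens
assert input_dropout(tokens, 0.0, null_index=1) == [1, 1, 1, 1]
```

Randomly nulling inputs prevents the model from relying too heavily on any single always-present token, which is why it can help with the diverging-weight problem described elsewhere in this thread.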

Re: [Moses-support] Bilingual neural lm, log-likelihood: -nan

2015-09-21 Thread Nikolay Bogoychev
Hey Jian, I have encountered this problem with nplm myself and couldn't really find a solution that works every time. Basically, what happens is that there is a token that occurs very frequently in the same position, and its weights become huge and eventually not a number, which propagates to the r
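The divergence Nikolay describes can be sketched numerically: if the effective update keeps growing with the weight itself, the weight overflows to infinity (and, in a real softmax, inf-minus-inf style operations then yield NaN). Clamping the update magnitude, one common mitigation (not necessarily what nplm does), keeps it finite:

```python
import math

def repeated_updates(w, grad, steps, clip=None):
    """Apply a weight-proportional update repeatedly. Without clipping
    the weight overflows to inf; with clipping it grows only linearly."""
    for _ in range(steps):
        g = grad * w  # update grows with the weight itself
        if clip is not None:
            g = max(-clip, min(clip, g))  # clamp the update magnitude
        w += g
    return w

unclipped = repeated_updates(1.0, 2.0, 2000)
clipped = repeated_updates(1.0, 2.0, 2000, clip=1.0)
assert not math.isfinite(unclipped)  # diverged past float range
assert math.isfinite(clipped)
```

This is only a toy illustration of the failure mode; in the real model the runaway weights belong to the over-frequent token's embedding and output rows.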

[Moses-support] Bilingual neural lm, log-likelihood: -nan

2015-09-19 Thread jian zhang
Hi all, I got

Epoch Current learning rate: 1 Training minibatches: Validation log-likelihood: -nan perplexity: nan

during bilingual neural lm training. I use the command: /home/user/tools/nplm-master-rsennrich/src/trainNeuralNetwork --train_file work_dir/blm/train.numberized --n
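The paired "-nan / nan" output follows from how these two numbers are related: perplexity is derived from the log-likelihood, so once a single per-token log-probability becomes NaN, both the sum and the perplexity are poisoned. A small sketch (helper names are my own, not nplm code):

```python
import math

def validation_stats(log_probs):
    """Sum per-token log-probabilities and derive perplexity.
    One NaN term (e.g. from diverged weights) poisons both numbers."""
    ll = sum(log_probs)
    ppl = math.exp(-ll / len(log_probs))
    return ll, ppl

# A single NaN log-probability makes both statistics NaN.
ll, ppl = validation_stats([-1.2, float('nan'), -0.7])
assert math.isnan(ll) and math.isnan(ppl)

# With finite inputs the usual relation holds.
ll2, ppl2 = validation_stats([-1.0, -1.0])
assert ppl2 == math.exp(1.0)
```

So the NaN validation score is a symptom, not the root problem: some parameter has already diverged during training, as the replies above discuss.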