Hi Sriram,

The low weight is a bit suspicious. You should check that your first language model was trained correctly.
cheers - Barry

On Wednesday 31 August 2011 05:58, Philipp Koehn wrote:
> Hi,
>
> the error stems from the low weight given to one of the
> language models (4.78862e-05), which is not properly
> parsed by the script scripts/ems/support/interpolate-lm.perl,
> which pattern-matches weights with /best lambda \(([\d\. ]+)\)/.
>
> The simple solution here is to remove the first language model,
> since it does not add any value.
>
> -phi
>
> On Thu, Jul 28, 2011 at 10:12 AM, Sriram venkatapathy
> <[email protected]> wrote:
> > Hello,
> >
> > I am using multiple language models for my experiments, and am using
> > Interpolated-LM to optimize the perplexity on the tuning set.
> >
> > It had worked well when I used a particular tuning set (set A). But
> > when I used a different one (set B), it crashed. Here is the log of
> > the error:
> >
> > -----
> > Executing: /home/svenkata/tools/srilm/bin/i686//compute-best-mix
> > /tmp/dhDarpsIUB/iplm.32687.0 /tmp/dhDarpsIUB/iplm.32687.1
> > /tmp/dhDarpsIUB/iplm.32687.2 /tmp/dhDarpsIUB/iplm.32687.3
> > ERROR: computing lambdas failed:
> > iteration 1, lambda = (0.5 0.166667 0.166667 0.166667), ppl = 28.1782
> > iteration 2, lambda = (0.149855 0.0787999 0.349265 0.42208), ppl = 15.8558
> > iteration 3, lambda = (0.0425529 0.0338758 0.394179 0.529393), ppl = 14.0577
> > iteration 4, lambda = (0.0133561 0.016913 0.398085 0.571646), ppl = 13.6439
> > iteration 5, lambda = (0.00464723 0.0099183 0.394268 0.591166), ppl = 13.5249
> > iteration 6, lambda = (0.00174248 0.00659117 0.390115 0.601551), ppl = 13.4844
> > iteration 7, lambda = (0.000684944 0.00477971 0.386928 0.607607), ppl = 13.4687
> > iteration 8, lambda = (0.000277308 0.00367993 0.384704 0.611339), ppl = 13.4619
> > iteration 9, lambda = (0.000114464 0.00295462 0.383217 0.613713), ppl = 13.4586
> > iteration 10, lambda = (4.78862e-05 0.00244556 0.382255 0.615251), ppl = 13.4569
> > iteration 11, lambda = (2.02318e-05 0.00207096 0.381653 0.616256), ppl = 13.456
> > 1422 non-oov words, best lambda (8.61264e-06 0.00178499 0.381292 0.616914)
> > -----
> >
> > Any suggestions about what the issue might be? The problem may not be
> > with the tuning set B itself, as I had used it under different
> > training conditions and there was no problem.
> >
> > Thanks!
> > Sriram

--
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.

_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support
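To illustrate the parsing failure Philipp describes: the character class in /best lambda \(([\d\. ]+)\)/ covers only digits, dots, and spaces, so a weight printed in scientific notation (such as 8.61264e-06) makes the whole match fail. Here is a quick sketch, in Python rather than the script's Perl purely for illustration; the widened pattern at the end is a hypothetical fix, not the actual patch in interpolate-lm.perl.

```python
import re

# Final line printed by SRILM's compute-best-mix, taken from the log above.
line = "1422 non-oov words, best lambda (8.61264e-06 0.00178499 0.381292 0.616914)"

# The pattern used by scripts/ems/support/interpolate-lm.perl:
# [\d\. ] matches digits, dots, and spaces only.
old_pattern = re.compile(r"best lambda \(([\d\. ]+)\)")

# "8.61264e-06" contains 'e' and '-', so the match fails entirely.
assert old_pattern.search(line) is None

# A hypothetical widened class that also accepts scientific notation.
new_pattern = re.compile(r"best lambda \(([\d\.eE+\- ]+)\)")
m = new_pattern.search(line)
weights = [float(w) for w in m.group(1).split()]
print(weights)  # four lambdas, summing to roughly 1.0
```

This is also why Philipp's suggested workaround (dropping the first language model) resolves the crash: without a near-zero lambda, compute-best-mix never prints a weight in scientific notation, and the original pattern matches again.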
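For context on why the first model "does not add any value": the lambdas in the log are linear-interpolation weights, i.e. the combined model assigns P(w | h) = sum_i lambda_i * P_i(w | h). A minimal sketch of how little a weight near 5e-05 contributes to the mixture; the lambdas are from iteration 10 of the log, but the per-model probabilities are made-up illustrative values, not SRILM output.

```python
# Linear interpolation of language model probabilities:
#   P(w | h) = sum_i lambda_i * P_i(w | h)
# Lambdas from iteration 10 of the log above; the probabilities are
# hypothetical P_i(w | h) values chosen only for illustration.
lambdas = [4.78862e-05, 0.00244556, 0.382255, 0.615251]
probs = [0.012, 0.008, 0.020, 0.015]

contributions = [l * p for l, p in zip(lambdas, probs)]
interpolated = sum(contributions)

# The first model's term is orders of magnitude below the others,
# so dropping it barely changes the interpolated probability.
share = contributions[0] / interpolated
print(f"first model's share of the mixture: {share:.6f}")
```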
