Hi --

I am following the instructions for Experiment.perl  

In order to run properly, EMS will require:

     *   A version of Moses along with SRILM,

But it sounds like that guide is outdated and SRILM is no longer mandatory?
Perhaps I could help update the docs if there is a need.

I managed to recompile Moses with bjam, but now the Moses decoder crashes
even though the compile succeeded. Do I need to recompile everything now that
I have added SRILM support? I think bjam only recompiled a subset of the files.
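
For reference, the build command I ran was roughly the following (the paths are the ones from my setup, mentioned below; I am not sure whether the `-a` flag, which as far as I understand forces bjam to rebuild every target rather than only the files it thinks changed, is also needed after adding SRILM):

```shell
# Paths from my setup; adjust as needed.
SRILM=~/workspace/srilm
BOOST=~/workspace/boost_1_60_0
JOBS=4

# -a would (if I understand it correctly) force a full rebuild
# instead of recompiling only changed files.
CMD="./bjam --with-srilm=$SRILM --with-boost=$BOOST -a -j$JOBS"
echo "$CMD"
```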

Here is the error message:

 ~/workspace/mosesdecoder/bin/moses -f 
Defined parameters (per moses.ini or switch):
config: /home/gary/workspace/working/binarised-model/moses.ini
distortion-limit: 6
feature: UnknownWordPenalty WordPenalty PhrasePenalty PhraseDictionaryCompact 
name=TranslationModel0 num-features=4 
input-factor=0 output-factor=0 LexicalReordering name=LexicalReordering0 
num-features=6 type=wbe-msd-bidirectional-fe-allff input-factor=0 
path=/home/gary/workspace/working/binarised-model/reordering-table Distortion 
KENLM name=LM0 factor=0 
path=/home/gary/workspace/lm/news-commentary-v8.fr-en.blm.en order=3
input-factors: 0
mapping: 0 T 0
threads: 4
weight: LexicalReordering0= 0.0930209 0.0295306 0.0229615 0.100688 0.0420028 
0.0654889 Distortion0= 0.0894002 LM0= 0.0857754 WordPenalty0= -0.131137 
PhrasePenalty0= 0.0983649 TranslationModel0= 0.0146617 0.0831309 0.0716015 
0.0722352 UnknownWordPenalty0= 1
FeatureFunction: UnknownWordPenalty0 start: 0 end: 0
FeatureFunction: WordPenalty0 start: 1 end: 1
FeatureFunction: PhrasePenalty0 start: 2 end: 2
line=PhraseDictionaryCompact name=TranslationModel0 num-features=4 
input-factor=0 output-factor=0
Exception: moses/FF/Factory.cpp:388 in void 
Moses::FeatureRegistry::Construct(const string&, const string&) threw 
UnknownFeatureException because `i == registry_.end()'.
Feature name PhraseDictionaryCompact is not registered.

Any thoughts on what went wrong?

Thanks, Gary


From: Hieu Hoang <hieuho...@gmail.com>
Sent: Friday, August 4, 2017 11:33 PM
To: Gary Hess
Cc: moses-support@mit.edu
Subject: Re: [Moses-support] bjam compilation

Hieu Hoang


On 4 August 2017 at 18:55, Gary Hess <garyhess...@hotmail.com> wrote:

I need to recompile Moses with SRILM because I want to use the EMS. I have 
already managed to compile SRILM separately.

You don't need SRILM unless you are using it for LM interpolation. For normal
LM training and inference, KenLM is better in every way.

Now I have a question about bjam: I see the "-j" option referenced 
(http://www.statmt.org/moses/?n=Development.GetStarted), e.g. "-j4", "-j12", 
"-j8". What does this option do and which value should I use?

This specifies how many compilation jobs to run in parallel. Usually, set it
to the number of cores on the machine.
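
A minimal way to pick that value automatically on Linux (assuming `nproc` from GNU coreutils is available):

```shell
# Number of parallel compile jobs = number of CPU cores reported by nproc.
JOBS=$(nproc)
echo "building with -j$JOBS"
# The actual invocation would then be: ./bjam -j"$JOBS"
```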

I have SRILM in ~/workspace/srilm and Boost is installed in 
~/workspace/boost_1_60_0. Moses is in ~/workspace/mosesdecoder.




From: moses-support-boun...@mit.edu on behalf of moses-support-requ...@mit.edu
Sent: Thursday, August 3, 2017 6:00 PM
To: moses-support@mit.edu
Subject: Moses-support Digest, Vol 130, Issue 2

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Moses-support digest..."

Today's Topics:

   1. Re: EMS Incremental training (Tom Hoar)


Message: 1
Date: Wed, 2 Aug 2017 23:37:42 +0700
From: Tom Hoar <tah...@pttools.net>
Subject: Re: [Moses-support] EMS Incremental training
To: moses-support@mit.edu
Content-Type: text/plain; charset="utf-8"

Pavan, I saved this message from a while back. We follow this advice.

    From: Philipp Koehn
    Subject: Re: [Moses-support] Training vs Incremental Training
    To: Adel Khalifa
    Cc: moses-support@mit.edu


    There are various versions of incremental training, but full re-training
    from scratch will give better results, since incremental word alignment
    will not be as good as full word alignment.
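
In practice that means concatenating the old and new corpora and re-running the standard training pipeline from scratch. A minimal sketch of the data step, with hypothetical file names standing in for the 30k and 20k sentence sets discussed below:

```shell
# Hypothetical corpus file names, for illustration only.
printf 'sentence 1\nsentence 2\n' > corpus.old.en   # stands in for the 30k set
printf 'sentence 3\n'             > corpus.new.en   # stands in for the 20k set

# Full re-training simply treats old + new as one corpus; repeat for the
# foreign side, then re-run train-model.perl / EMS over corpus.all as usual.
cat corpus.old.en corpus.new.en > corpus.all.en
wc -l < corpus.all.en   # -> 3
```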


On 8/2/2017 11:00 PM, moses-support-requ...@mit.edu wrote:
> Date: Wed, 2 Aug 2017 12:13:43 +0530
> From: K Pavan <kosuru.pa...@gmail.com>
> Subject: [Moses-support] EMS Incremental training
> To: moses-support@mit.edu
> Message-ID:
> <CAAQYKL+MJiUoW1Dx_V3_aUuzWh=VV=Q598u6UfavL=je6qk...@mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> Hi,
> I was trying to experiment with EMS incremental training. I followed
> http://www.statmt.org/moses/?n=Advanced.Incremental#ntoc1 this
> documentation to start a model with a 30k-sentence open-source parallel
> corpus. But I was unable to add more training instances to this model using
> incremental training. Can you please help me with the steps for the
> incremental training process? My idea is to add another 20k parallel
> sentences to update the model. Is there any way to do it without
> re-creating the model from scratch?
> Thank you,
> Pavan



Moses-support mailing list

End of Moses-support Digest, Vol 130, Issue 2
