On 06/08/2017 11:52, Gary Hess wrote:

Hi --


I am following the instructions for Experiment.perl (http://www.statmt.org/moses/?n=FactoredTraining.EMS).


    *In order to run properly, EMS will require: *

      o *A version of Moses along with SRILM, *

But it sounds like that guide is outdated and SRILM is no longer mandatory? Perhaps I could help with updating some docs if there is a need.
Cheers, I've updated it.

I managed to recompile Moses with bjam, but now the Moses decoder is crashing (despite a successful compile). Do I need to recompile everything now that I have added SRILM support? I think bjam only recompiled a subset of the files.

Here is the error message:

gary@moses-with-baseline-and-binarized-phrase-table-8gb-fra1-01-14965:~/workspace/mosesdecoder$ ~/workspace/mosesdecoder/bin/moses -f ~/workspace/working/binarised-model/moses.ini
Defined parameters (per moses.ini or switch):
config: /home/gary/workspace/working/binarised-model/moses.ini
distortion-limit: 6
feature: UnknownWordPenalty WordPenalty PhrasePenalty PhraseDictionaryCompact name=TranslationModel0 num-features=4 path=/home/gary/workspace/working/binarised-model/phrase-table.minphr input-factor=0 output-factor=0 LexicalReordering name=LexicalReordering0 num-features=6 type=wbe-msd-bidirectional-fe-allff input-factor=0 output-factor=0 path=/home/gary/workspace/working/binarised-model/reordering-table Distortion KENLM name=LM0 factor=0 path=/home/gary/workspace/lm/news-commentary-v8.fr-en.blm.en order=3
input-factors: 0
mapping: 0 T 0
threads: 4
weight: LexicalReordering0= 0.0930209 0.0295306 0.0229615 0.100688 0.0420028 0.0654889 Distortion0= 0.0894002 LM0= 0.0857754 WordPenalty0= -0.131137 PhrasePenalty0= 0.0983649 TranslationModel0= 0.0146617 0.0831309 0.0716015 0.0722352 UnknownWordPenalty0= 1
line=UnknownWordPenalty
FeatureFunction: UnknownWordPenalty0 start: 0 end: 0
line=WordPenalty
FeatureFunction: WordPenalty0 start: 1 end: 1
line=PhrasePenalty
FeatureFunction: PhrasePenalty0 start: 2 end: 2
line=PhraseDictionaryCompact name=TranslationModel0 num-features=4 path=/home/gary/workspace/working/binarised-model/phrase-table.minphr input-factor=0 output-factor=0
Exception: moses/FF/Factory.cpp:388 in void Moses::FeatureRegistry::Construct(const string&, const string&) threw UnknownFeatureException because `i == registry_.end()'.
Feature name PhraseDictionaryCompact is not registered.

Any thoughts on what went wrong?
You have to compile Moses with the cmph library to get the compact phrase table:
   http://www.statmt.org/moses/?n=Advanced.RuleTables#ntoc3
Or use the probing phrase-table instead.
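If you do add cmph, the decoder needs a full recompile so the PhraseDictionaryCompact feature gets registered. A minimal sketch of the rebuild, assuming cmph has been compiled into ~/workspace/cmph (that path is hypothetical; substitute your own):

```shell
# From the mosesdecoder root. bjam only rebuilds what changed, so pass -a
# to force a full recompile after adding a new library dependency.
cd ~/workspace/mosesdecoder
./bjam --with-boost=$HOME/workspace/boost_1_60_0 \
       --with-cmph=$HOME/workspace/cmph \
       --with-srilm=$HOME/workspace/srilm \
       -a -j4
```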

Thanks, Gary






------------------------------------------------------------------------
*From:* Hieu Hoang <hieuho...@gmail.com>
*Sent:* Friday, August 4, 2017 11:33 PM
*To:* Gary Hess
*Cc:* moses-support@mit.edu
*Subject:* Re: [Moses-support] bjam compilation


Hieu Hoang
http://moses-smt.org/





On 4 August 2017 at 18:55, Gary Hess <garyhess...@hotmail.com> wrote:

    I need to recompile Moses with SRILM because I want to use the
    EMS. I have already managed to compile SRILM separately.

You don't need SRILM unless you are using it for LM interpolation. For normal LM training and inference, KenLM is better in every way.
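For example, a 3-gram KenLM model can be trained and binarised entirely with the tools shipped in the Moses bin/ directory (the corpus filenames here are illustrative):

```shell
# lmplz trains an ARPA-format language model of order 3;
# build_binary converts it to KenLM's binary format, which the
# decoder mmaps and loads much faster than the ARPA file.
~/workspace/mosesdecoder/bin/lmplz -o 3 < corpus.en > corpus.arpa.en
~/workspace/mosesdecoder/bin/build_binary corpus.arpa.en corpus.blm.en
```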


    Now I have a question about bjam: I see the "-j" option referenced
    (http://www.statmt.org/moses/?n=Development.GetStarted), e.g.
    "-j4", "-j12", "-j8". What does this option do and which value
    should I use?


This specifies how many parallel compilation jobs to run. Usually, set it to the number of cores on your machine.
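On Linux you can derive the value from the core count rather than hard-coding it; a small sketch (the bjam invocation shown is just printed, not run):

```shell
# nproc reports the number of available cores; use one compile job per core.
CORES=$(nproc)
echo "would run: ./bjam -j$CORES"
```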

    I have SRILM in ~/workspace/srilm and Boost is installed in
    ~/workspace/boost_1_60_0. Moses is in ~/workspace/mosesdecoder.


    Thanks!

    Gary





    ------------------------------------------------------------------------
    *From:* moses-support-boun...@mit.edu on behalf of moses-support-requ...@mit.edu
    *Sent:* Thursday, August 3, 2017 6:00 PM
    *To:* moses-support@mit.edu
    *Subject:* Moses-support Digest, Vol 130, Issue 2


    Today's Topics:

       1. Re: EMS Incremental training (Tom Hoar)


    ----------------------------------------------------------------------

    Message: 1
    Date: Wed, 2 Aug 2017 23:37:42 +0700
    From: Tom Hoar <tah...@pttools.net>
    Subject: Re: [Moses-support] EMS Incremental training
    To: moses-support@mit.edu
    Message-ID: <48fcbac6-27fd-2b7f-6c60-c18b811df...@pttools.net>
    Content-Type: text/plain; charset="utf-8"

    Pavan, I saved this message from a while back. We follow this advice.

        From: Philipp Koehn
        Subject: Re: [Moses-support] Training vs Incremental Training
        To: Adel Khalifa
        Cc: moses-support@mit.edu

        Hi,

        there are various versions of incremental training, but full
        re-training from scratch will give better results, since
        incremental word alignment will not be as good as full word
        alignment.

        -phi


    On 8/2/2017 11:00 PM, moses-support-requ...@mit.edu wrote:
    > Date: Wed, 2 Aug 2017 12:13:43 +0530
    > From: K Pavan <kosuru.pa...@gmail.com>
    > Subject: [Moses-support] EMS Incremental training
    > To: moses-support@mit.edu
    > Message-ID:
    > <CAAQYKL+MJiUoW1Dx_V3_aUuzWh=VV=Q598u6UfavL=je6qk...@mail.gmail.com>
    > Content-Type: text/plain; charset="utf-8"
    >
    > Hi ,
    >
    > I was trying to experiment with EMS incremental training. I followed
    > http://www.statmt.org/moses/?n=Advanced.Incremental#ntoc1 this



    > documentation to start a model with a 30k-sentence open-source
    > parallel corpus. But I was unable to add more training instances
    > to this model using incremental training. Can you please help me
    > with the steps for the incremental training process? My idea is to
    > add another 20k parallel sentences to update the model. Is there
    > any way I can do it without re-creating the model from scratch?
    >
    >
    > Thank you,
    > Pavan


    ------------------------------

    _______________________________________________
    Moses-support mailing list
    Moses-support@mit.edu <mailto:Moses-support@mit.edu>
    http://mailman.mit.edu/mailman/listinfo/moses-support
    <http://mailman.mit.edu/mailman/listinfo/moses-support>


    End of Moses-support Digest, Vol 130, Issue 2
    *********************************************




--
Hieu Hoang
http://moses-smt.org/
