hi samira

how recent is the Moses source code you're using? Can you please 'git pull'
and see if it's still a problem.


On 14 August 2013 13:15, samira tofighi <[email protected]> wrote:

> The problem still exists; the decoder output file is attached.
>
> thanks for your answers.
>
>
>
> On Wed, Aug 14, 2013 at 3:43 PM, Hieu Hoang <[email protected]> wrote:
>
>> hi samira
>>
>> can you try using this ini file instead of your version? The ini file
>> format has recently changed. Moses should still be able to read your old
>> ini file, but there might be a bug in the conversion from the old to the
>> new format.
>>
>> Please tell me if it works
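>>
>> For reference, the new format moves the language models out of
>> [lmodel-file]/[weight-l] into a [feature] section plus a [weight] section,
>> roughly like the sketch below. I'm writing this from memory (the feature
>> and weight names may differ slightly), so please double-check it against
>> the ini file I attached; the paths and weights are just your values
>> carried over:
>>
>> [feature]
>> KENLM name=LM0 factor=0 order=3 path=/home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm.en
>> KENLM name=LM1 factor=0 order=3 path=/home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm2.en
>>
>> [weight]
>> LM0= 0.1022989
>> LM1= 0.1222989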
>>
>>
>> On 14 August 2013 11:51, samira tofighi <[email protected]> wrote:
>>
>>> Hi,
>>> with this configuration file:
>>>
>>>
>>> # MERT optimized configuration
>>> # decoder /home/s.tofighi/mosesdecoder/bin/moses
>>> # BLEU 0.261756 on dev
>>> /Share/local/s.tofighi/program_moses/bridge_defa/fa_en/corpora/dev/corpus0
>>> # We were before running iteration 6
>>> # finished Tue Apr  9 17:28:22 IRDT 2013
>>> ### MOSES CONFIG FILE ###
>>> #########################
>>>
>>> # input factors
>>> [input-factors]
>>> 0
>>>
>>> # mapping steps
>>> [mapping]
>>> 0 T 0
>>> 1 T 1
>>>
>>> # translation tables: table type (hierarchical(0), textual (0), binary
>>> (1)), source-factors, target-factors, number of scores, file
>>> # OLD FORMAT is still handled for back-compatibility
>>> # OLD FORMAT translation tables: source-factors, target-factors, number
>>> of scores, file
>>> # OLD FORMAT a binary table type (1) is assumed
>>> [ttable-file]
>>> 0 0 0 5
>>> /home/samira/myprogs/moses_progs/cnn_28000/working2/train/model/phrase-table.gz
>>> 0 0 0 5
>>> /home/samira/myprogs/snover_sst/sys4/working/train/model/phrase-table.gz
>>>
>>> # no generation models, no generation-file section
>>>
>>> # language models: type(srilm/irstlm), factors, order, file
>>> [lmodel-file]
>>> 8 0 3 /home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm.en
>>> 8 0 3 /home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm2.en
>>>
>>>
>>> # limit on how many phrase translations e for each phrase f are loaded
>>> # 0 = all elements loaded
>>> [ttable-limit]
>>> 40
>>>
>>> # distortion (reordering) files
>>> [distortion-file]
>>> 0-0 wbe-msd-bidirectional-fe-allff 6
>>> /home/samira/myprogs/moses_progs/cnn_28000/working2/train/model/reordering-table.wbe-msd-bidirectional-fe.gz
>>>
>>> # distortion (reordering) weight
>>> [weight-d]
>>> 0.0153922
>>> 0.072167
>>> 0.0335072
>>> 0.0432684
>>> 0.0614601
>>> 0.0181157
>>> 0.0878144
>>>
>>> # language model weights
>>> [weight-l]
>>> 0.1022989
>>> 0.1222989
>>>
>>> # translation model weights
>>> [weight-t]
>>> 0.000153608
>>> 0.122284
>>> 0.057549
>>> 0.0621673
>>> 0.0209181
>>> 0.00153608
>>> 0.122284
>>> 0.057549
>>> 0.0621673
>>> 0.0209181
>>>
>>> # no generation models, no weight-generation section
>>>
>>> # word penalty
>>> [weight-w]
>>> -0.332904
>>>
>>> [distortion-limit]
>>> 6
>>>
>>> [decoding-graph-backoff]
>>>  0
>>>  1
>>>
>>> I got this STDERR output:
>>>
>>> nohup: ignoring input
>>> Defined parameters (per moses.ini or switch):
>>>     config: working2/train/model/moses.ini
>>>     decoding-graph-backoff: 0 1
>>>     distortion-file: 0-0 wbe-msd-bidirectional-fe-allff 6
>>> /home/samira/myprogs/moses_progs/cnn_28000/working2/train/model/reordering-table.wbe-msd-bidirectional-fe.gz
>>>
>>>      distortion-limit: 5
>>>     drop-unknown:
>>>     input-factors: 0
>>>     input-file: corpora/test/test.fa
>>>     lmodel-file: 8 0 3
>>> /home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm.en 8 0 3
>>> /home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm2.en
>>>      mapping: 0 T 0 1 T 1
>>>     translation-details: working2/o
>>>     ttable-file: 0 0 0 5
>>> /home/samira/myprogs/moses_progs/cnn_28000/working2/train/model/phrase-table.gz
>>> 0 0 0 5
>>> /home/samira/myprogs/snover_sst/sys4/working/train/model/phrase-table.gz
>>>      ttable-limit: 6s
>>>     weight-d: 0.0153922 0.072167 0.0335072 0.0432684 0.0614601 0.0181157
>>> 0.0878144
>>>     weight-l: 0.1022989 0.1222989
>>>     weight-t: 0.000153608 0.122284 0.057549 0.0621673 0.0209181
>>> 0.00153608 0.122284 0.057549 0.0621673 0.0209181
>>>      weight-w: -0.332904
>>> /home/samira/mosesdecoder/bin
>>> line=KENLM factor=0 order=3 lazyken=0
>>> path=/home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm.en
>>> FeatureFunction: KENLM0 start: 0 end: 1
>>> WEIGHT KENLM0=0.122,
>>> line=KENLM factor=0 order=3 lazyken=0
>>> path=/home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm2.en
>>> FeatureFunction: KENLM1 start: 1 end: 2
>>> WEIGHT KENLM1=
>>> Check scores.size() == indexes.second - indexes.first failed in
>>> moses/ScoreComponentCollection.h:235
>>>
>>>
>>> On Wed, Aug 14, 2013 at 1:27 AM, Philipp Koehn <[email protected]>wrote:
>>>
>>>> Hi,
>>>>
>>>> this should work, and I do not see anything obviously wrong with
>>>> your configuration file.
>>>>
>>>> Is it possible that you ran out of memory?
>>>>
>>>> Can you post your STDERR output file
>>>> testOut/translate_snoverSst1.decode.out ?
>>>>
>>>> -phi
>>>>
>>>> On Tue, Aug 13, 2013 at 9:43 PM, samira tofighi <
>>>> [email protected]> wrote:
>>>> > I want to use two language models in my system, but I get this error:
>>>> >
>>>> > ./test: line 10:  4951 Aborted                 (core dumped) nohup
>>>> nice
>>>> > ~/mosesdecoder/bin/moses -dl 5 -ttl 6s -config
>>>> > working2/train/model/moses.ini -drop-unknown -translation-details
>>>> working2/o
>>>> > -input-file corpora/test/test.fa >
>>>> testOut/translates_snoverSSt1.output 2>
>>>> > testOut/translate_snoverSst1.decode.out
>>>> >
>>>> > Any idea how to solve this?
>>>> > My moses.ini file is shown below:
>>>> >
>>>> >
>>>> > # MERT optimized configuration
>>>> > # decoder /home/s.tofighi/mosesdecoder/bin/moses
>>>> > # BLEU 0.261756 on dev
>>>> >
>>>> /Share/local/s.tofighi/program_moses/bridge_defa/fa_en/corpora/dev/corpus0
>>>> > # We were before running iteration 6
>>>> > # finished Tue Apr  9 17:28:22 IRDT 2013
>>>> > ### MOSES CONFIG FILE ###
>>>> > #########################
>>>> >
>>>> > # input factors
>>>> > [input-factors]
>>>> > 0
>>>> >
>>>> > # mapping steps
>>>> > [mapping]
>>>> > 0 T 0
>>>> > 1 T 1
>>>> >
>>>> > # translation tables: table type (hierarchical(0), textual (0),
>>>> binary (1)),
>>>> > source-factors, target-factors, number of scores, file
>>>> > # OLD FORMAT is still handled for back-compatibility
>>>> > # OLD FORMAT translation tables: source-factors, target-factors,
>>>> number of
>>>> > scores, file
>>>> > # OLD FORMAT a binary table type (1) is assumed
>>>> > [ttable-file]
>>>> > 0 0 0 5
>>>> >
>>>> /home/samira/myprogs/moses_progs/cnn_28000/working2/train/model/phrase-table.gz
>>>> > 0 0 0 5
>>>> >
>>>> /home/samira/myprogs/snover_sst/sys4/working/train/model/phrase-table.gz
>>>> >
>>>> > # no generation models, no generation-file section
>>>> >
>>>> > # language models: type(srilm/irstlm), factors, order, file
>>>> > [lmodel-file]
>>>> > 0 0 3 /home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm.en
>>>> > 0 0 3
>>>> /home/samira/myprogs/moses_progs/cnn_28000/lm2/cnn5_28000.blm2.en
>>>> >
>>>> >
>>>> > # limit on how many phrase translations e for each phrase f are loaded
>>>> > # 0 = all elements loaded
>>>> > [ttable-limit]
>>>> > 40
>>>> >
>>>> > # distortion (reordering) files
>>>> > [distortion-file]
>>>> > 0-0 wbe-msd-bidirectional-fe-allff 6
>>>> >
>>>> /home/samira/myprogs/moses_progs/cnn_28000/working2/train/model/reordering-table.wbe-msd-bidirectional-fe.gz
>>>> >
>>>> > # distortion (reordering) weight
>>>> > [weight-d]
>>>> > 0.0153922
>>>> > 0.072167
>>>> > 0.0335072
>>>> > 0.0432684
>>>> > 0.0614601
>>>> > 0.0181157
>>>> > 0.0878144
>>>> >
>>>> > # language model weights
>>>> > [weight-l]
>>>> > 0.1022989
>>>> > 0.1222989
>>>> >
>>>> > # translation model weights
>>>> > [weight-t]
>>>> > 0.000153608
>>>> > 0.122284
>>>> > 0.057549
>>>> > 0.0621673
>>>> > 0.0209181
>>>> > 0.00153608
>>>> > 0.122284
>>>> > 0.057549
>>>> > 0.0621673
>>>> > 0.0209181
>>>> >
>>>> > # no generation models, no weight-generation section
>>>> >
>>>> > # word penalty
>>>> > [weight-w]
>>>> > -0.332904
>>>> >
>>>> > [distortion-limit]
>>>> > 6
>>>> >
>>>> > [decoding-graph-backoff]
>>>> >  0
>>>> >  1
>>>> >
>>>> > _______________________________________________
>>>> > Moses-support mailing list
>>>> > [email protected]
>>>> > http://mailman.mit.edu/mailman/listinfo/moses-support
>>>> >
>>>>
>>>
>>>
>>>
>>
>>
>> --
>> Hieu Hoang
>> Research Associate
>> University of Edinburgh
>> http://www.hoang.co.uk/hieu
>>
>>
>


-- 
Hieu Hoang
Research Associate
University of Edinburgh
http://www.hoang.co.uk/hieu
