Hello,
I'm trying to decode with a system built using the EMS. The command line I'm using and the resulting trace are included below. Is there a problem with the word penalty?
--
Regards,
John J Morgan
echo "محاکمات در مرکز عدلی پروان" |
~/mosesdecoder/moses-chart-cmd/src/moses_chart -v 2 -f hierarchical/evaluation/trials.filtered.ini.8
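For context, the phrase-table setup echoed in the trace below would correspond to a moses.ini section roughly like the fragment that follows. This is my hedged reading, worth checking against the actual ini file: I am assuming the field layout is implementation type, input factors, output factors, score count, then path, and that implementation 2 selects the binary on-disk table handled by OnDiskPt — which is the component that later asserts.

```ini
; Assumed field layout: implementation in-factors out-factors num-scores path
; Implementation 2 appears to select the on-disk binary table (OnDiskPt),
; where the assertion in the trace below fires.
[ttable-file]
2 0 0 5 /home/john/TF435/d2e/baseline/hierarchical/evaluation/filtered.trials.8/phrase-table.0-0.1.1.bin
6 0 0 1 /home/john/TF435/d2e/baseline/hierarchical/model/glue-grammar.8
```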
Defined parameters (per moses.ini or switch):
config: hierarchical/evaluation/trials.filtered.ini.8
cube-pruning-pop-limit: 1000
input-factors: 0
inputtype: 3
lmodel-file: 0 0 5 /home/john/TF435/d2e/baseline/hierarchical/lm/1.5way.lm.8 0 0 5 /home/john/TF435/d2e/baseline/hierarchical/lm/legal.lm.8 0 0 5 /home/john/TF435/d2e/baseline/hierarchical/lm/medical.lm.8 0 0 5 /home/john/TF435/d2e/baseline/hierarchical/lm/military.lm.8 0 0 5 /home/john/TF435/d2e/baseline/hierarchical/lm/news.lm.8
mapping: 0 T 0 1 T 1
max-chart-span: 20 1000
non-terminals: X
search-algorithm: 3
ttable-file: 2 0 0 5 /home/john/TF435/d2e/baseline/hierarchical/evaluation/filtered.trials.8/phrase-table.0-0.1.1.bin 6 0 0 1 /home/john/TF435/d2e/baseline/hierarchical/model/glue-grammar.8
ttable-limit: 20
verbose: 2
weight-l: 0.1000 0.1000 0.1000 0.1000 0.1000
weight-t: 0.20 0.20 0.20 0.20 0.20 1.0
weight-w: -1
input type is: text input
Loading lexical distortion models...have 0 models
Start loading LanguageModel /home/john/TF435/d2e/baseline/hierarchical/lm/1.5way.lm.8 : [0.000] seconds
/home/john/TF435/d2e/baseline/hierarchical/lm/1.5way.lm.8: line 17: warning: non-zero probability for <unk> in closed-vocabulary LM
Start loading LanguageModel /home/john/TF435/d2e/baseline/hierarchical/lm/legal.lm.8 : [0.000] seconds
/home/john/TF435/d2e/baseline/hierarchical/lm/legal.lm.8: line 393: warning: non-zero probability for <unk> in closed-vocabulary LM
Start loading LanguageModel /home/john/TF435/d2e/baseline/hierarchical/lm/medical.lm.8 : [0.000] seconds
/home/john/TF435/d2e/baseline/hierarchical/lm/medical.lm.8: line 331: warning: non-zero probability for <unk> in closed-vocabulary LM
Start loading LanguageModel /home/john/TF435/d2e/baseline/hierarchical/lm/military.lm.8 : [1.000] seconds
/home/john/TF435/d2e/baseline/hierarchical/lm/military.lm.8: line 310: warning: non-zero probability for <unk> in closed-vocabulary LM
Start loading LanguageModel /home/john/TF435/d2e/baseline/hierarchical/lm/news.lm.8 : [1.000] seconds
/home/john/TF435/d2e/baseline/hierarchical/lm/news.lm.8: line 799: warning: non-zero probability for <unk> in closed-vocabulary LM
Finished loading LanguageModels : [2.000] seconds
Creating phrase table features
Using uniform ttable-limit of 20 for all translation tables.
Start loading PhraseTable /home/john/TF435/d2e/baseline/hierarchical/evaluation/filtered.trials.8/phrase-table.0-0.1.1.bin : [2.000] seconds
filePath: /home/john/TF435/d2e/baseline/hierarchical/evaluation/filtered.trials.8/phrase-table.0-0.1.1.bin
DecodeFeature: input=FactorMask<0> output=FactorMask<0>
Start loading PhraseTable /home/john/TF435/d2e/baseline/hierarchical/model/glue-grammar.8 : [2.000] seconds
filePath: /home/john/TF435/d2e/baseline/hierarchical/model/glue-grammar.8
DecodeFeature: input=FactorMask<0> output=FactorMask<0>
Finished loading phrase tables : [2.000] seconds
DecodeStep():
outputFactors=FactorMask<0>
conflictFactors=FactorMask<>
newOutputFactors=FactorMask<0>
DecodeStep():
outputFactors=FactorMask<0>
conflictFactors=FactorMask<>
newOutputFactors=FactorMask<0>
Adding decoder graph 0 to translation system default
Adding decoder graph 1 to translation system default
Adding language model 0 to translation system default
Adding language model 1 to translation system default
Adding language model 2 to translation system default
Adding language model 3 to translation system default
Adding language model 4 to translation system default
Start loading phrase table from /home/john/TF435/d2e/baseline/hierarchical/model/glue-grammar.8 : [2.000] seconds
using New Format phrase tables
Start loading new format pt model : [2.000] seconds
Finished loading phrase tables : [2.000] seconds
IO from STDOUT/STDIN
Created input-output object : [2.000] seconds
The score component vector looks like this:
WordPenalty
!UnknownWordPenalty
LM_5gram
LM_5gram
LM_5gram
LM_5gram
LM_5gram
PhraseModel_1
PhraseModel_2
PhraseModel_3
PhraseModel_4
PhraseModel_5
PhraseModel
The global weight vector looks like this: -1.000 1.000 0.100 0.100 0.100 0.100 0.100 0.200 0.200 0.200 0.200 0.200 1.000
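The two vectors above line up positionally: WordPenalty gets weight -1.000 (the weight-w switch), !UnknownWordPenalty gets 1.000, the five LMs get 0.100 each (weight-l), and the six phrase-model scores get 0.200 each plus 1.000 for the glue grammar (weight-t). So the word-penalty weight is being picked up. As a generic linear-model sketch — not Moses's actual implementation — the total score of a derivation is just the dot product of its feature scores with this weight vector:

```cpp
#include <cassert>
#include <cmath>
#include <numeric>
#include <vector>

// Illustrative only: in a log-linear translation model, a derivation's
// total score is the dot product of its per-feature score vector with
// the global weight vector printed in the trace.
double TotalScore(const std::vector<double>& scores,
                  const std::vector<double>& weights) {
  assert(scores.size() == weights.size());
  return std::inner_product(scores.begin(), scores.end(),
                            weights.begin(), 0.0);
}
```

With the 13 weights from the trace and (hypothetical) per-feature scores of 1.0 each, the total is simply the sum of the weights; a more negative WordPenalty weight penalizes longer outputs more heavily.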
TRANSLATING(0): <s> محاکمات در مرکز عدلی پروان </s> ||| [0,0]=X (1) [0,1]=X (1) [0,2]=X (1) [0,3]=X (1) [0,4]=X (1) [0,5]=X (1) [0,6]=X (1) [1,1]=X (1) [1,2]=X (1) [1,3]=X (1) [1,4]=X (1) [1,5]=X (1) [1,6]=X (1) [2,2]=X (1) [2,3]=X (1) [2,4]=X (1) [2,5]=X (1) [2,6]=X (1) [3,3]=X (1) [3,4]=X (1) [3,5]=X (1) [3,6]=X (1) [4,4]=X (1) [4,5]=X (1) [4,6]=X (1) [5,5]=X (1) [5,6]=X (1) [6,6]=X (1)
moses_chart: OnDiskWrapper.cpp:187: UINT64 OnDiskPt::OnDiskWrapper::GetMisc(const std::string&) const: Assertion `iter != m_miscInfo.end()' failed.
Aborted
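On the failure itself: the assertion text says GetMisc looked up a key in an internal map (m_miscInfo) of miscellaneous header values read from the binary phrase table, and the key was not there — which, reading the trace rather than the source, suggests the on-disk table the decoder opened is missing an expected header entry, e.g. because it was built with a different or incompatible version of the binarization tool. A minimal sketch of the failing pattern (my reconstruction from the assertion message, not the actual OnDiskPt code):

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>

// Sketch of the pattern behind the OnDiskWrapper::GetMisc assertion:
// a lookup into a map of header values that asserts the requested key
// exists. If the binary table lacks the expected entry, iter == end()
// and the assertion aborts the process, as in the trace above.
uint64_t GetMisc(const std::map<std::string, uint64_t>& miscInfo,
                 const std::string& key) {
  auto iter = miscInfo.find(key);
  assert(iter != miscInfo.end() && "key missing from binary-table misc info");
  return iter->second;
}
```

If that reading is right, the first thing to check would be rebuilding phrase-table.0-0.1.1.bin with the binarization tool that matches this moses_chart build, rather than the word penalty.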
_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support