Hi Kenneth,

Really sorry, please ignore my last question. It was just my own silly
mistake: the binary version of the rule table was missing and I hadn't
realised. Apologies again.
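For anyone else hitting the same `Check m_fileSource.is_open() failed` message: in my case the rule table simply hadn't been binarised before being listed as a type-2 (on-disk) table in moses.ini. A sketch of the step I had missed, using Moses's `CreateOnDiskPt` tool; the paths and the ttable-limit here are illustrative, and the factor/score counts should match your own model (mine had 5 scores, per the `ttable-file: 2 0 0 5 ...` line below):

```shell
# Binarise the text rule table into the on-disk format (OnDiskPt).
# Positional arguments, per the Moses documentation:
#   num-source-factors  num-target-factors  num-scores  ttable-limit
#   index-of-p(e|f)     input-text-table    output-directory
~/tools/mosesdecoder/bin/CreateOnDiskPt 1 1 5 100 2 \
    model/rules.gz model/rules.bin

# Then moses.ini can reference the binary table with type 2 (on-disk):
#   ttable-file: 2 0 0 5 /path/to/model/rules.bin
```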

Yvette




> Hi,
>
> The phrase-based decoder now works for me, but the hierarchical decoder
> now produces the following error with a set-up that worked with the old
> Moses download:
>
> Check m_fileSource.is_open() failed in OnDiskPt/OnDiskWrapper.cpp:59
>
> Thanks again!
>
> Here's more of the output:
>
> Defined parameters (per moses.ini or switch):
>       config: mert-work/moses-tuned.ini
>       cube-pruning-pop-limit: 1000
>       input-factors: 0
>       inputtype: 3
>       lmodel-file: 0 0 5 /home/ygraham/ILT1/wmt2011/en-de/v1/lm/small.de.srilm
>       mapping: 0 T 0 1 T 1
>       max-chart-span: 20 1000
>       non-terminals: X
>       search-algorithm: 3
>       translation-details: evaluation/newstest2010/test.en.details
>       ttable-file: 2 0 0 5
> /home/ygraham/ILT1/wmt2011/en-de/v3-non-proj-small/model/rules.bin 6 0 0
> 1 /home/ygraham/ILT1/wmt2011/en-de/v3-non-proj-small/model/glue-grammar
>       ttable-limit: 20
>       weight-l: 0.129163
>       weight-t: 0.0414987 0.0204027 0.0541922 0.036586 -0.205204 0.173271
>       weight-w: -0.339682
> /home/ygraham/tools/mosesdecoder/bin
> Loading lexical distortion models...have 0 models
> Start loading LanguageModel
> /home/ygraham/ILT1/wmt2011/en-de/v1/lm/small.de.srilm : [0.000] seconds
> /home/ygraham/ILT1/wmt2011/en-de/v1/lm/small.de.srilm: line 13540:
> warning: non-zero probability for <unk> in closed-vocabulary LM
> Finished loading LanguageModels : [17.000] seconds
> Using uniform ttable-limit of 20 for all translation tables.
> Start loading PhraseTable
> /home/ygraham/ILT1/wmt2011/en-de/v3-non-proj-small/model/rules.bin :
> [17.000] seconds
> filePath:
> /home/ygraham/ILT1/wmt2011/en-de/v3-non-proj-small/model/rules.bin
> Start loading PhraseTable
> /home/ygraham/ILT1/wmt2011/en-de/v3-non-proj-small/model/glue-grammar :
> [17.000] seconds
> filePath:
> /home/ygraham/ILT1/wmt2011/en-de/v3-non-proj-small/model/glue-grammar
> Finished loading phrase tables : [17.000] seconds
> Start loading phrase table from
> /home/ygraham/ILT1/wmt2011/en-de/v3-non-proj-small/model/glue-grammar :
> [17.000] seconds
> Start loading new format pt model : [17.000] seconds
> Finished loading phrase tables : [17.000] seconds
> IO from STDOUT/STDIN
> Created input-output object : [17.000] seconds
> Check m_fileSource.is_open() failed in OnDiskPt/OnDiskWrapper.cpp:59
>
>
>> Hi,
>>
>>      Hieu's mail prompted me to check for other errors.  I've fixed the
>> memset issue and theoretically fixed the one about
>>
>> /usr/include/boost/test/test_tools.hpp:496: error: no match for call to
>> ‘(boost::test_tools::check_is_close_t) (const int&, const float&, const
>> boost::test_tools::percent_tolerance_t<double>&)’
>>
>> by casting every argument to float first.
>>
>> Kenneth
>>
>> On 08/06/2012 01:30 AM, [email protected] wrote:
>>> Hi Kenneth,
>>>
>>> No, there are still some errors. I've attached the output again.
>>>
>>> Thanks again,
>>> Yvette
>>>
>>>
>>>> Hi Yvette,
>>>>
>>>>    Think I found the bug, which was in the test.  If you git pull,
>>>> does it compile fine for you now?
>>>>
>>>> Sorry,
>>>>
>>>> Kenneth
>>>>
>>>> On 08/03/12 01:51, [email protected] wrote:
>>>>> Hi Kenneth,
>>>>>
>>>>> Both the following commands are in the list of processes for top:
>>>>>
>>>>> util/bin/gcc-4.3/release/debug-symbols-on/link-static/probing_hash_table_test
>>>>>
>>>>> /bin/sh -c ?
>>>>> LD_LIBRARY_PATH="/usr/bin:/usr/lib:/usr/lib32:usr/lib64:$LD_LIBRARY_PATH"?export
>>>>> LD_LIBRARY_PATH??
>>>>>
>>>>> Not sure what to try next.
>>>>>
>>>>> Thanks,
>>>>> Yvette
>>>>> _______________________________________________
>>>>> Moses-support mailing list
>>>>> [email protected]
>>>>> http://mailman.mit.edu/mailman/listinfo/moses-support
>>>>
>>
>
