Thanks for your reply. Below you can find the required information:
Moses was compiled with the following command:
./bjam --with-xmlrpc-c=$HOME/local/ --with-cmph=$HOME/local/ \
  --with-tcmalloc --install-scripts=$HOME/local/moses --enable-boost-pool \
  -j24
moses.ini:
#########################
### MOSES CONFIG FILE ###
#########################
# input factors
[input-factors]
0
# mapping steps
[mapping]
0 T 0
[distortion-limit]
6
# feature functions
[feature]
UnknownWordPenalty
WordPenalty
PhrasePenalty
PhraseDictionaryBinary name=TranslationModel0 num-features=4 path=/SMT/Service/Models/en2fa/phrase-table input-factor=0 output-factor=0 table-limit=14
LexicalReordering name=LexicalReordering0 num-features=6 type=wbe-msd-bidirectional-fe-allff input-factor=0 output-factor=0 path=/SMT/Service/Models/en2fa/reordering-table
Distortion
KENLM name=LM0 factor=0 path=/SMT/Service/Models/en2fa/Parallel.fa.arpa order=4
# core weights
[weight]
LexicalReordering0= 0.073298 0.0596145 0.115321 0.066936 0.0283095 0.0565263
Distortion0= 0.0325491
LM0= 0.0620824
WordPenalty0= -0.248997
PhrasePenalty0= 0.0700501
TranslationModel0= 0.00878866 0.0570903 0.0778723 0.042565
UnknownWordPenalty0= 1
====================================================================
I ran mosesserver with the following command:
mosesserver -config /SMT/Service/Models/en2fa/moses-tuned-bin.ini \
--server-port 9393 \
--server-log ./server.log \
-s 100 -b .5
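For reference, a minimal client sketch for querying the server started above. This is a hypothetical test script, not part of Moses: it assumes mosesserver's standard XML-RPC `translate` method (a struct with a `text` field in, a struct with a `text` field out) and the default `/RPC2` endpoint; the port matches the command above.

```python
import time
import xmlrpc.client

def make_proxy(host="localhost", port=9393):
    # The endpoint path "/RPC2" is the assumed default for mosesserver;
    # host and port match the mosesserver command above.
    return xmlrpc.client.ServerProxy(f"http://{host}:{port}/RPC2")

def timed_translate(proxy, sentence):
    """Send one sentence and return (translation, seconds elapsed)."""
    start = time.perf_counter()
    result = proxy.translate({"text": sentence})
    return result["text"], time.perf_counter() - start

# Usage (with the server above running):
#   proxy = make_proxy()
#   for attempt in (1, 2):
#       print(attempt, timed_translate(proxy, "this is a test ."))
# Translating the same sentence twice shows whether a repeated request
# returns faster, i.e. whether any translation caching is in effect.
```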
Thanks in advance
On Thu, Sep 25, 2014 at 5:15 PM, Hieu Hoang <[email protected]> wrote:
> What is the exact command you execute? How many threads are you running
> with? What are the exact times for x, y, z? How many sentences were in the
> input?
>
> Can I please have a look at your moses.ini files.
>
>
> On 25/09/14 11:48, Mohammad Mahdi Mahsuli wrote:
>
> Hi,
>
> I had been using Moses v0.9 for a long time, and I recently decided to
> upgrade to the latest version (v2.1.1). At test time, loading the plain
> models into Moses works without difficulty. But when I use the
> binarized models, translation takes much longer than it did with the
> binary models on the old version of Moses. I have performed the following
> experiments.
>
> model | decoder | time spent
> ------------+----------------+----------------
> old | old | x
> new | new | y>x
> old | new | z>x
> new | old | N/A
>
>
> "Model" indicates whether I trained the binary model with the old version
> of Moses or the new one. "Decoder" is the Moses version I used to
> translate the test sentences.
>
> I should also note that I pre-loaded the binary models into the OS page
> cache using the cat command, and no other memory-intensive process was
> running on the machine. However, I noticed that the translation time does
> not decrease when I translate the same sentence a second time, so I
> suspect the translations are not being cached. Is there an option I
> should set for this purpose?
>
> Best Regards,
> M.M. Mahsuli
>
>
> _______________________________________________
> Moses-support mailing list
> [email protected]
> http://mailman.mit.edu/mailman/listinfo/moses-support
>
>
>
_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support