Hi Tom

>  The quote comes from http://www.statmt.org/moses/manual/manual.pdf,
>  page 185, section 5.4.4, "Caching across Sentences" paragraph.

This only concerns the implementation of new feature functions, so it doesn't 
apply if you just want to run the decoder unmodified.

>
>  I understood Ivan's problem. My script is only remotely similar. I also
>  frequently use the multi-threading/multiprocessor configuration.
>  However, this new multi-threading module provides non-blocking service
>  of stderr and stdout output on separate threads. It's possible for my
>  queue to piggyback input from multiple text file sources into the
>  stream. So, moses never experiences a break with an EOF. This could
>  conceivably be seen by moses as one huge file with multi-ten's of
>  thousands of lines.

In moses, the master thread keeps loading sentences from the input until 
it reaches EOF. As it loads the sentences, it places them into a queue, 
and worker threads pull sentences off this queue to translate them. 
Translation finishes once the last worker thread exits. This means that if 
you give moses a source file, it loads the entire file into memory 
before translation starts. This is a somewhat naive strategy, but it is 
assumed that source files will always be much smaller than models. 
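To make the pattern concrete, here is a minimal sketch (in Python, not actual Moses code) of the load-then-translate scheme described above; the queue contents, worker count, and the `upper()` "translation" stand-in are all illustrative assumptions:

```python
# Hypothetical sketch of the master/worker pattern described above:
# the master loads every input sentence into a queue up front, then
# worker threads pull sentences off the queue until it is empty.
import queue
import threading

def run(sentences, n_workers=4):
    tasks = queue.Queue()
    results = []
    lock = threading.Lock()

    # Master: the entire input ends up in memory (the queue) before
    # any worker has to block waiting for it.
    for s in sentences:
        tasks.put(s)

    def worker():
        while True:
            try:
                job = tasks.get_nowait()
            except queue.Empty:
                return  # no more work: this worker thread exits
            translated = job.upper()  # placeholder for real decoding
            with lock:
                results.append(translated)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    # Translation finishes once the last worker thread exits.
    for t in threads:
        t.join()
    return results

print(sorted(run(["ein satz", "noch ein satz"])))
```

The output order depends on thread scheduling, which is why the sketch sorts the results before printing.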


>
>  It'll take me about a week to get to this. I'll let you know. By the
>  way, I use the Dash redirection for evaluation. During Evaluation and
>  mert the "top" utility show peak CPU efficiency at "only" 385% on a
>  quad-core. Is this also your experience with multi-threading on
>  multiprocessors?
>

Yes, if I run with n threads, top usually shows CPU close to, but short of, 
n x 100%. I don't know why there's a gap.

cheers - Barry

-- 
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.

_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support