> Ok,
> I found out that we can probably get a machine with 64 GB of RAM and
> 8-12 cores.
> Are the requirements for translation the same as for training?
>
> I prepared two language models in binary format, and I noticed that
> while the server is loading/translating it uses 89-90% of RAM (the
> test environment has 4 GB of RAM) and 10% of CPU. But when there are
> no pending translations the memory used is 0%.
> So for the translation machine do I still need 8-12 cores, or can I
> get a "smaller" machine? What matters more for translation: memory
> or CPU?
>
> And with 64 GB of RAM, for example, approximately how many models
> can I load on the same machine (assuming models trained on
> ~400,000-800,000 sentences)?
>


Hi Ivan

As far as RAM is concerned, you need enough to load your model: any more won't 
make much difference, and with any less it will run impossibly slowly due to 
swapping.

If your data is processed in batches, then you can benefit from having more 
CPUs and running multi-threaded decoding.
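As a sketch of what multi-threaded batch decoding looks like on the command line (assuming a standard Moses install; the paths here are placeholders, not your actual setup):

```shell
# Decode a batch of input sentences using 8 decoder threads,
# roughly one per core. -threads is the Moses decoder's
# multi-threading option; moses.ini points at your model.
moses -f /path/to/moses.ini -threads 8 < input.txt > output.txt
```

With batched input like this, throughput scales with the number of threads until you run out of cores, so extra CPUs pay off in a way extra RAM (beyond what the model needs) does not.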

I'm afraid I have no figures mapping the number of training sentences to model 
size. I'd suggest running some experiments on your own setup.

cheers - Barry

-- 
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.

_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support
