On Nov 5, 2012, at 8:44 AM, Dave Angel <[email protected]> wrote:

> On 11/05/2012 06:53 AM, Bala subramanian wrote:
>>> 

[snip]


>>> Thanks in advance,
>>> Bala
>> 
>> 
> 
> Before you spend too much energy on this, I'd suggest that you'll
> probably see a substantial slowdown if you try to write the two files in
> parallel, unless the calculations that actually format the data for
> writing are extensive.
> 
> On the other hand, if the calculations dominate the problem, then you
> probably want to do multiprocessing to get them to happen in parallel. 
> See the recent thread "using multiprocessing efficiently to process
> large data file"
> 
> Just be sure and do some measuring before spending substantial energy
> optimizing.
> 
> -- 
> 
> DaveA
> 

Assuming that, after you take Dave's advice, you still want to try parallel 
processing, take a quick look at:

http://docs.python.org/2/library/multiprocessing.html?highlight=multiprocessing#multiprocessing

and in particular at section 16.6.1.5 on using a pool of workers.  This might 
provide a simple, clean way for you to hand off the work.
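
For example, here is a minimal sketch of Pool.map.  The function and data 
names are placeholders, not taken from your script; substitute whatever 
per-chunk calculation you actually need:

    from multiprocessing import Pool

    def format_record(record):
        # placeholder for the real formatting/calculation work per item
        return record.upper()

    if __name__ == '__main__':
        data = ['alpha', 'beta', 'gamma']          # stand-in for your real input
        pool = Pool(processes=4)                   # four worker processes
        results = pool.map(format_record, data)    # farm the items out to the pool
        pool.close()
        pool.join()
        print(results)

pool.map blocks until all workers finish and returns the results in the 
original order, so you can write the output files afterwards in the main 
process.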

-Bill
