Thanks Nick.

It processes 10-15 projects at once (i.e. 10-15 processes are started). Each
zip file is 2-3 MB.

Using a dual-core system reduced the execution time from 61 seconds
to 55 seconds.

My dual-core system configuration is:
Pentium(R) D CPU 3.00GHz, 2.99GHz
1 GB RAM

Regards,
Gopal



-----Original Message-----
From: Nick Craig-Wood [mailto:n...@craig-wood.com] 
Sent: Thursday, January 08, 2009 3:01 PM
To: python-list@python.org
Subject: Re: Multiprocessing takes higher execution time

Sibtey Mehdi <sibt...@infotechsw.com> wrote:
> I use multiprocessing to compare more than one set of files.
> 
> For each set of files to be compared (i.e. old file1 vs. new file1),
> I create a process:
> 
> Process(target=compare, args=(oldFile, newFile)).start()
> 
> It takes 61 seconds execution time.
> 
> When I do the same comparison without implementing
> multiprocessing, it takes 52 seconds execution time.

> The oldProjects and newProjects contain zip files,
> i.e. (oldxyz1.zip, oldxyz2.zip, newxyz1.zip, newxyz2.zip).
> Each process unzips both zip files, compares all the files between old
> and new (mdb files or txt files), and gives the result.
> I do this comparison for n sets of zip files and assign each
> set of zip files to a separate process.
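
For reference, here is a minimal sketch of the pattern described above. The
body of compare() is an assumption (the full compare code was not posted in
the thread); only the process-per-project launch matches the description:

    import os
    import zipfile
    from multiprocessing import Process

    def compare(oldFile, newFile):
        # Assumed worker body: extract both archives, then compare the
        # extracted members byte-for-byte (mdb or txt files).
        old_dir = oldFile + '.extracted'
        new_dir = newFile + '.extracted'
        zipfile.ZipFile(oldFile).extractall(old_dir)
        zipfile.ZipFile(newFile).extractall(new_dir)
        for name in os.listdir(old_dir):
            old_path = os.path.join(old_dir, name)
            new_path = os.path.join(new_dir, name)
            if os.path.exists(new_path):
                same = open(old_path, 'rb').read() == open(new_path, 'rb').read()
                print('%s: %s' % (name, 'same' if same else 'different'))

    if __name__ == '__main__':
        # One process per project (set of zip files), as in the original post.
        pairs = [('oldxyz1.zip', 'newxyz1.zip'), ('oldxyz2.zip', 'newxyz2.zip')]
        procs = [Process(target=compare, args=(old, new)) for old, new in pairs]
        for p in procs:
            p.start()
        for p in procs:
            p.join()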

I had a brief look at the code and your use of multiprocessing looks
fine.

How many projects are you processing at once?  And how many MB of zip
files is it?  As reading zip files does lots of disk I/O, I would guess
it is disk-limited rather than anything else, which explains why doing
many at once is actually slower (the disk has to do more seeks).
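
If the disk is the bottleneck, it may help to cap how many projects are
read at once instead of starting 10-15 processes together. A minimal sketch
using multiprocessing.Pool (compare() here is just a placeholder for the
worker described above):

    from multiprocessing import Pool

    def compare(oldFile, newFile):
        # Placeholder: the unzip-and-compare worker described earlier.
        pass

    def compare_pair(pair):
        # Thin wrapper so Pool.map can pass one (old, new) tuple per task.
        return compare(*pair)

    if __name__ == '__main__':
        pairs = [('oldxyz1.zip', 'newxyz1.zip'), ('oldxyz2.zip', 'newxyz2.zip')]
        # Limit concurrency to e.g. 2 workers instead of 10-15 processes,
        # so the disk has fewer competing readers and fewer seeks.
        pool = Pool(processes=2)
        pool.map(compare_pair, pairs)
        pool.close()
        pool.join()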

-- 
Nick Craig-Wood <n...@craig-wood.com> -- http://www.craig-wood.com/nick

