On Jan 18, 10:00 pm, James Mills <prolo...@shortcircuit.net.au> wrote:
On Mon, Jan 19, 2009 at 3:50 PM, gopal mishra <gop...@infotechsw.com> wrote:
I know this is not an IO-bound problem: I am creating heavy objects in the process, adding these objects to a queue, and getting each object back in my main program using the queue.
On Jan 19, 3:09 am, Carl Banks <pavlovevide...@gmail.com> wrote:
[snip]
Since multiprocessing serializes and deserializes the data while passing it from process to process, passing very large objects would have a very high latency and overhead. IOW, gopal's diagnosis is correct. It's just not
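Carl's point can be checked directly: `Queue.put()` pickles each object and `Queue.get()` unpickles it on the other side, so the serialization round-trip alone can dominate for large objects. A rough sketch (the object and its size are arbitrary stand-ins, not from the thread):

```python
import pickle
import time

# A "heavy" object: 2 million ints, standing in for gopal's objects.
obj = list(range(2_000_000))

t0 = time.time()
data = pickle.dumps(obj)   # roughly what Queue.put() does under the hood
copy = pickle.loads(data)  # roughly what Queue.get() does on the other side
elapsed = time.time() - t0

print(f"pickle round-trip: {elapsed:.2f}s, {len(data)} bytes")
```

On top of this cost, the pickled bytes must also be written to and read from a pipe between the processes.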
Sent: Saturday, January 17, 2009 10:37 AM
To: gopal mishra
Cc: python-list@python.org
Subject: Re: problem in implementing multiprocessing
On Fri, Jan 16, 2009 at 7:16 PM, gopal mishra <gop...@infotechsw.com> wrote:
I create two heavy objects sequentially without using multiprocessing
you can test this sample code:
import time
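The sample code is cut off after `import time`. A hedged sketch of what such a comparison might look like, with a hypothetical `make_heavy` standing in for the heavy-object constructor (none of the names below beyond `getData` appear in the thread):

```python
import time
from multiprocessing import Process, Queue

def make_heavy():
    # Stand-in for the heavy object; the original sample code is lost.
    return [(i, str(i)) for i in range(500_000)]

def getData(que):
    que.put(make_heavy())  # the object is pickled as it goes onto the queue

if __name__ == "__main__":
    t0 = time.time()
    a = make_heavy()
    b = make_heavy()
    print(f"sequential: {time.time() - t0:.2f}s")

    t0 = time.time()
    que = Queue()
    procs = [Process(target=getData, args=(que,)) for _ in range(2)]
    for p in procs:
        p.start()
    results = [que.get() for _ in procs]  # drain before join() to avoid deadlock
    for p in procs:
        p.join()
    print(f"two processes: {time.time() - t0:.2f}s")
```

The multiprocessing timing includes process startup plus a full pickle/unpickle and pipe transfer per object, which is consistent with the 2.5 s vs. 6.4 s numbers gopal reports.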
I create two heavy objects sequentially without using multiprocessing; then creation of the objects takes 2.5 sec. If I create these two objects in separate processes, the total time is 6.4 sec.
I am thinking it is happening due to the pickling and unpickling of the objects. If I am right, then what
gopal mishra wrote:
Hello,
I am trying to implement multiprocessing in my application to take advantage of multiple cores. I have created two separate processes, something like this:
que = Queue()
Process(target=getData, args=(que, section, MdbFile)).start()
Process(target=getData,
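The quoted snippet ends mid-call. A minimal runnable sketch of the two-process pattern it describes, with a hypothetical `getData` body and file path (only the names `que`, `getData`, `section`, and `MdbFile` come from the original):

```python
from multiprocessing import Process, Queue

def getData(que, section, mdb_file):
    # Hypothetical worker body: build the heavy object for one section
    # and hand it back through the queue (pickled in transit).
    data = {"section": section, "file": mdb_file, "rows": list(range(100_000))}
    que.put(data)

if __name__ == "__main__":
    MdbFile = "data.mdb"  # hypothetical path; the original value is not shown
    que = Queue()         # note: Queue() must be called, not just referenced
    p1 = Process(target=getData, args=(que, "section1", MdbFile))
    p2 = Process(target=getData, args=(que, "section2", MdbFile))
    p1.start()
    p2.start()
    first = que.get()     # drain the queue before join() to avoid deadlock
    second = que.get()
    p1.join()
    p2.join()
    print(first["section"], second["section"])
```

Both `getData` results still cross a process boundary via pickle, so this sketch reproduces the overhead the thread is discussing rather than avoiding it.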