Hello,

I am trying to use multiprocessing in my application to take advantage of
multiple cores. I have created two separate processes, something like this:

que = Queue()
Process(target=getData, args=(que, section, MdbFile,)).start()
Process(target=getData, args=(que, section, MdbFile,)).start()

In the getData function I create an object (up to 7 MB in size) and add it
to the queue (que.put(obj)).

After that I fetch the object with que.get() and use it in my application,
but getting the data back takes more time than I expected.

Can anyone help me with this problem?

Thanks,
Sibtey

My code:
import time
from multiprocessing import Process, Queue

def getData(queue, section, mdbFile):
    """
    Load the gapappdata for the given mdb file and put it on the queue.
    """
    app = MdbFile(mdbFile)
    mdbData = app.data  # heavy object (up to ~7 MB)
    queue.put((section, mdbData))

def getData2(mdbFile):
    """
    Load and return the gapappdata for the given mdb file (no queue).
    """
    app = MdbFile(mdbFile)
    mdbData = app.data  # heavy object (up to ~7 MB)
    return mdbData

def test_multipleProcess(fromMdbFile, toMdbFile):
    # with multiprocessing
    t1 = time.time()
    queue = Queue()
    Process(target=getData, args=(queue, 'From', fromMdbFile,)).start()
    Process(target=getData, args=(queue, 'To', toMdbFile,)).start()
    # drain both results; they arrive in whichever order the workers finish
    section, gapAppData = queue.get()
    section, gapAppData = queue.get()
    t2 = time.time()
    print "total time using multiprocessing:", t2 - t1

    # without multiprocessing, sequentially in this process
    d1 = getData2(fromMdbFile)
    d2 = getData2(toMdbFile)
    print "total time without multiprocessing:", time.time() - t2


if __name__ == '__main__':
    f1 = r"a.mdb"
    f2 = r"b.mdb"
    test_multipleProcess(f1, f2)
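
For reference, here is a minimal, self-contained sketch of the same put/get
pattern, using a large string as a stand-in for the MdbFile data (an
assumption on my part, since MdbFile itself is not shown above), in case it
helps reproduce the timing I am seeing:

import time
from multiprocessing import Process, Queue

def produce(queue, section, size):
    # Build a large placeholder payload (stand-in for the ~7 MB mdbData)
    # and push it through the queue; put() pickles the whole object.
    payload = 'x' * size
    queue.put((section, payload))

def main():
    queue = Queue()
    size = 7 * 1024 * 1024  # roughly 7 MB per payload
    t1 = time.time()
    Process(target=produce, args=(queue, 'From', size)).start()
    Process(target=produce, args=(queue, 'To', size)).start()
    # get() blocks until a child has finished and unpickles the payload
    queue.get()
    queue.get()
    print "queue transfer of two ~7 MB payloads:", time.time() - t1

if __name__ == '__main__':
    main()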
