Yes massimo,
Here is my current implementation; please note I use it outside of
web2py and call it via subprocess. It is quite crude and dirty and
needs modification before it can be used outside of my script.
What it does: it accepts a list of files (files_list) and a function
object (toprocess), and according to the sweet spot I found on the
machine (the best number of processes to run) it adjusts the number of
splits to distribute the workload across.
I have not tried it inside web2py, as one weirdness of multiprocessing
requires the if __name__ == "__main__": guard.
def task_splitter(files_list, toprocess):
    from multiprocessing import Process, Queue

    # pick the number of splits from the workload size
    # (sweet spot found empirically on this machine)
    if len(files_list) >= 16 * 16:
        splits = 16
    elif len(files_list) >= 64:
        splits = 8
    elif len(files_list) >= 16:
        splits = 4
    else:
        splits = 1

    n = len(files_list) // splits  # chunk size per process
    processes = []
    ql = []
    for i in range(0, splits):
        que = Queue()
        if i == splits - 1:
            # last process takes whatever is left of files_list
            processes.append([Process(target=toprocess,
                                      args=(files_list, que)), que])
        else:
            processes.append([Process(target=toprocess,
                                      args=(files_list[0:n], que)), que])
            files_list = files_list[n:]
        processes[i][0].start()

    # drain each queue before joining its process, so the queue's
    # buffer does not block join()
    for p in processes:
        ql.append(p[1].get())
        p[0].join()

    # flatten the per-process result lists into one list
    flat_q = []
    for q in ql:
        flat_q += q
    return flat_q
On 11/19/10, mdipierro <[email protected]> wrote:
> Do you have an example...?
>
>
>
> On Nov 18, 9:16 pm, Phyo Arkar <[email protected]> wrote:
>> I use Multiprocessing's Queue across processes, which works to parse
>> a huge list of files and communicate back with the database. They
>> work great.
>>
>> On 11/19/10, Pystar <[email protected]> wrote:
>>
>> > I would like to know if any one here has used any message queue? and
>> > which one comes recommended?
>>
>>