[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread David Decotigny
David Decotigny <[EMAIL PROTECTED]> added the comment: Thank you Jesse. When I read this passage, I naively thought that a timeout raised in a get() would not be harmful: that somehow the whole get() request would be aborted. But now I realize that it would make things rather complicated.

[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread David Decotigny
David Decotigny <[EMAIL PROTECTED]> added the comment: A quick fix in the user code, when we are sure we don't need the child process after a timeout happens, is to call worker.terminate() in an except Empty clause. ___ Python tracker <[EMAIL PROTECTED]>
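The workaround described above can be sketched as follows. This is a hedged reconstruction, not the script attached to the issue; the names _produce and get_with_timeout are illustrative:

```python
import multiprocessing
from queue import Empty

def _produce(q, nbytes):
    # Child: enqueue one large payload. Queue.put() hands the data to a
    # background "feeder" thread, which may still be writing into the
    # pipe when the parent's get() times out.
    q.put(b"x" * nbytes)

def get_with_timeout(nbytes, timeout):
    """Return True if the payload arrived in time, False if we gave up
    and terminated the child (the workaround from the comment above)."""
    q = multiprocessing.Queue()
    worker = multiprocessing.Process(target=_produce, args=(q, nbytes))
    worker.start()
    try:
        q.get(timeout=timeout)
        ok = True
    except Empty:
        # Safe only when the child's work is disposable: terminate()
        # kills the child (and its blocked feeder thread) outright.
        worker.terminate()
        ok = False
    worker.join()
    return ok
```

With a small payload and a generous timeout this takes the normal path; with a huge payload and a tiny timeout it takes the terminate() path instead of deadlocking in join().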

[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread David Decotigny
New submission from David Decotigny <[EMAIL PROTECTED]>: With the attached script, demo() called with, for example, datasize=40*1024*1024 and timeout=1 will deadlock: the program never terminates. The bug appears on Linux (RHEL4) / Intel x86 with "multiprocessing" coming w
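The attached script is not shown in this archive, but the reported scenario can be approximated by the sketch below (names are illustrative). The key mechanism: Queue.put() of a large payload is serviced by a background feeder thread in the child, so after the parent's get() times out, a plain join() can wait forever on a child whose feeder thread is still blocked writing into the pipe; draining the queue unblocks it:

```python
import multiprocessing
from queue import Empty

def _send_big(q, nbytes):
    # Child process: put() itself returns quickly, but a background
    # "feeder" thread keeps writing the payload into the pipe, and the
    # child cannot exit until that thread finishes.
    q.put(b"x" * nbytes)

def demo_then_drain(nbytes=40 * 1024 * 1024, timeout=0.001):
    q = multiprocessing.Queue()
    worker = multiprocessing.Process(target=_send_big, args=(q, nbytes))
    worker.start()
    timed_out = False
    try:
        q.get(timeout=timeout)   # a 40 MiB payload will not arrive this fast
    except Empty:
        timed_out = True
        # Calling worker.join() right here reproduces the deadlock: the
        # child's feeder thread is still blocked on the full pipe.
        q.get()                  # draining the queue releases the feeder
    worker.join()
    return timed_out, worker.exitcode
```

The terminate() workaround mentioned elsewhere in the thread trades this drain for simply killing the child.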

[issue3735] allow multiple threads to efficiently send the same requests to a processing.Pool without incurring duplicate processing

2008-08-29 Thread David Decotigny
New submission from David Decotigny <[EMAIL PROTECTED]>: I posted a recipe on ASPN: http://code.activestate.com/recipes/576462/ and Jesse, cheerleader for the inclusion of (multi)processing into python-core, suggested that it could be interesting to add this feature to a future Python
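The idea behind the recipe can be sketched as follows. This is a hedged illustration of the concept, not the ASPN code itself; DedupPool is a hypothetical name, and a ThreadPool stands in for a process Pool to keep the example self-contained. Concurrent identical requests are keyed so they share one in-flight AsyncResult instead of each running the job again:

```python
import threading
from multiprocessing.pool import ThreadPool  # stand-in for a process Pool

class DedupPool:
    """Identical concurrent requests share a single in-flight result."""

    def __init__(self, pool):
        self._pool = pool
        self._lock = threading.Lock()
        self._inflight = {}  # (func, args) -> AsyncResult

    def apply(self, func, args=()):
        key = (func, args)
        with self._lock:
            res = self._inflight.get(key)
            if res is None:
                # First caller for this key actually submits the job.
                res = self._pool.apply_async(func, args)
                self._inflight[key] = res
        value = res.get()  # later callers block on the same AsyncResult
        with self._lock:
            self._inflight.pop(key, None)
        return value
```

Note that with a real multiprocessing.Pool the function and arguments must be picklable, and the key must be hashable; a production version would also need a policy for caching completed results versus only deduplicating in-flight ones.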