Use case: I have a Python manager script that monitors several conditions
(not just time-based schedules) and needs to launch Python worker scripts
in response to the conditions it detects. Several of these worker scripts
may end up running in parallel, and there are no dependencies between
individual workers. I'm looking for the pros and cons of using
multiprocessing versus subprocess to launch these worker scripts. The
solution needs to work on both Windows and Linux. I'm open to using a
3rd-party library, but I'm hoping to avoid introducing yet another system
component like Celery if possible and reasonable.
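For concreteness, here is the rough shape of what I am picturing on the
subprocess side (worker_a.py, worker_b.py, and the --verbose flag are just
placeholder names for my actual worker scripts and their options):

import sys
import subprocess

def launch_worker(script, *args):
    # sys.executable keeps this portable between Windows and Linux
    # (no reliance on shebang lines or a "python" on PATH).
    return subprocess.Popen([sys.executable, script, *args])

procs = [launch_worker("worker_a.py"),
         launch_worker("worker_b.py", "--verbose")]
# Popen returns immediately, so the manager keeps monitoring its
# conditions; it can check procs[i].poll() later if it cares about
# exit codes.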

My understanding (so far) is that the tradeoff of using multiprocessing is
that my manager script cannot exit until all the worker processes it starts
have finished. If one of the worker scripts locks up, this could be
problematic. Is there a way to use multiprocessing where processes are
launched independently of the parent (manager) process?
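
To make sure I am describing that tradeoff correctly, this toy example
(the sleep calls just stand in for real work) shows the behaviour I am
worried about:

import multiprocessing
import time

def worker(seconds):
    time.sleep(seconds)  # stand-in for real work; could hang indefinitely

if __name__ == "__main__":
    for seconds in (5, 600):
        multiprocessing.Process(target=worker, args=(seconds,)).start()
    # Even without an explicit join(), multiprocessing joins non-daemon
    # children at interpreter exit, so the manager lingers until the
    # 600-second worker finishes. daemon=True avoids the wait, but then
    # the workers are killed when the manager exits, which is not what
    # I want either.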

Thank you,
Malcolm