My test reduction:

import multiprocessing
import queue

def _process_worker(q):
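    # Keep pulling items; exit once the queue stays empty for 0.1 seconds.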
    while True:
        try:
            something = q.get(block=True, timeout=0.1)
        except queue.Empty:
            return
        else:
            print('Grabbed item from queue:', something)


def _make_some_processes(q):
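    # Start ten workers that all consume from the same queue.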
    processes = []
    for _ in range(10):
        p = multiprocessing.Process(target=_process_worker, args=(q,))
        p.start()
        processes.append(p)
    return processes

def _do(i):
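    # One run: queue up 30 items and let the workers drain them.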
    print('Run:', i)
    q = multiprocessing.Queue()
    for j in range(30):
        q.put(i*30+j)
    processes = _make_some_processes(q)

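    # Busy-wait until the workers have drained the queue.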
    while not q.empty():
        pass

    # The deadlock only occurs on Mac OS X and only when these lines
    # are commented out:
    # for p in processes:
    #     p.join()

for i in range(100):
    _do(i)

--------------

Output (on Mac OS X using the svn version of py3k):
% ~/bin/python3.2 moprocessmoproblems.py
Run: 0
Grabbed item from queue: 0
Grabbed item from queue: 1
Grabbed item from queue: 2
...
Grabbed item from queue: 29
Run: 1

At this point the script produces no additional output. If I uncomment the lines above, the script produces the expected output. I can't find anything in the docs that explains this behaviour, and I don't know what the rule would be, e.g. that you must join every process that uses a queue before the queue is garbage collected.
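
For reference, here is what _do looks like with those join lines restored; this is the variant that runs all 100 iterations to completion for me (the name _do_joined is just for illustration):

def _do_joined(i):
    print('Run:', i)
    q = multiprocessing.Queue()
    for j in range(30):
        q.put(i*30+j)
    processes = _make_some_processes(q)

    while not q.empty():
        pass

    # Joining every worker before q goes out of scope avoids the hang.
    for p in processes:
        p.join()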

Any ideas why this is happening?

Cheers,
Brian