[issue32937] Multiprocessing worker functions not terminating with a large number of processes and a manager

2018-02-26 Thread EricG

EricG <ericg...@gmail.com> added the comment:

Making some further observations: when I set processes = 12, for example, I can 
see 12 separate Python processes plus 4 additional processes, which I assume are 
set up for the manager and, perhaps, other purposes.

Now, what makes these 4 additional processes interesting is that, for me, 
multiprocessing.cpu_count() returns 24, and it is when I set processes = 20 that 
the Python code only sometimes terminates successfully. 20 + 4 = 24, so I am 
using every single CPU in that situation.

However, as noted, when I set processes = 19, it always terminates successfully. 
19 + 4 < 24, so there is at least one CPU not assigned any work.

Perhaps there is some kind of race condition, or swapping around of data 
structures, that only happens on macOS when every CPU is in use by Python for 
this purpose.
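
For reference, here is a minimal sketch of the kind of setup being described, 
assuming a Pool combined with a Manager-backed Queue; the actual reproducer 
script is not quoted in these comments, so the worker body, task count, and 
names are illustrative. The multiprocessing.active_children() call at the end 
lists whatever helper processes exist alongside the pool workers.

import multiprocessing as mp

def worker(q, i):
    # Stand-in for the real worker function, which is not quoted here.
    q.put(f"worker {i} finished")

if __name__ == "__main__":
    print("cpu_count:", mp.cpu_count())  # reports 24 on the machine described
    manager = mp.Manager()               # the manager runs in its own helper process
    q = manager.Queue()

    pool = mp.Pool(processes=19)         # 19 always terminates here; 20 only sometimes
    for i in range(100):
        pool.apply_async(worker, (q, i))
    pool.close()
    pool.join()                          # the call reported to hang at higher process counts

    # List helper processes still alive after the workers exit (e.g. the
    # manager's server process); these may account for some of the extra
    # processes observed beyond the pool size.
    for child in mp.active_children():
        print(child.name, child.pid)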




[issue32937] Multiprocessing worker functions not terminating with a large number of processes and a manager

2018-02-26 Thread EricG

EricG <ericg...@gmail.com> added the comment:

If I do:

from queue import Queue

messages = Queue()

No messages are printed. I believe this is expected, as a regular Queue cannot 
be shared between processes; that is the problem the manager was designed to 
solve.
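
For context, here is a minimal sketch of what that substitution looks like, 
assuming the queue is a module-level global that forked workers inherit (the 
reproducer itself is not quoted in this thread, so the worker is illustrative). 
Each child process ends up with its own copy of the queue.Queue, so the parent 
sees nothing, which matches the observation above.

import multiprocessing as mp
from queue import Queue

# Module-level queue.Queue, as in the substitution described above.
# Each forked worker inherits its own private copy of this object.
messages = Queue()

def worker(i):
    messages.put(f"worker {i} done")  # lands in the child's copy, not the parent's

if __name__ == "__main__":
    ctx = mp.get_context("fork")      # fork is the default start method on macOS in 3.6
    with ctx.Pool(processes=4) as pool:
        pool.map(worker, range(4))
    print("messages visible to the parent:", messages.qsize())  # prints 0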

I am using a Mac Pro running 10.3.2, with Python 3.6.4.

It would not surprise me if this were an OS-specific issue. Reproducing it may 
require a Mac with a high number of cores.

It is trivially reproducible for me.




[issue32937] Multiprocessing worker functions not terminating with a large number of processes and a manager

2018-02-24 Thread EricG

EricG <ericg...@gmail.com> added the comment:

I do plan to consume the messages on the queue, but only after all the worker 
functions are complete, i.e. after pool.join() returns. Is this not OK?

I can certainly spawn a thread in the main process that consumes the queue 
entries and inserts them into a list or queue, which can then be accessed after 
join() returns. Is that the correct way this code should be written?
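
If it helps, here is a minimal sketch of that workaround: a thread in the main 
process drains the manager queue into a list while the pool runs, and the list 
is read after everything has been joined. The worker body and names are 
illustrative, and whether this also avoids the hang is a separate question.

import multiprocessing as mp
import threading

def worker(q, i):
    # Illustrative stand-in for the real worker function.
    q.put(f"result {i}")

if __name__ == "__main__":
    manager = mp.Manager()
    q = manager.Queue()
    results = []
    SENTINEL = None

    def drain():
        # Runs as a thread in the main process, pulling entries off the
        # queue as the workers produce them and collecting them in a list.
        while True:
            item = q.get()
            if item is SENTINEL:
                break
            results.append(item)

    drainer = threading.Thread(target=drain)
    drainer.start()

    pool = mp.Pool(processes=8)
    pool.starmap(worker, [(q, i) for i in range(100)])
    pool.close()
    pool.join()

    q.put(SENTINEL)   # tell the drainer that nothing more is coming
    drainer.join()
    print(len(results))  # 100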
