[issue30549] ProcessPoolExecutor hangs forever if the object raises on __getstate__
New submission from Roberto Martínez:

Hi, I found that a ProcessPoolExecutor hangs if a submitted object fails to pickle. I have attached the simplest code that reproduces the behavior. Note that the interpreter should exit after the exception is raised, but instead it hangs forever. I tested with Python 3.4, 3.5, and 3.6 with the same results.

--
files: test_noshutdown.py
messages: 294999
nosy: Roberto Martínez
priority: normal
severity: normal
status: open
title: ProcessPoolExecutor hangs forever if the object raises on __getstate__
type: behavior
versions: Python 3.4, Python 3.5, Python 3.6
Added file: http://bugs.python.org/file46919/test_noshutdown.py

___
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue30549>
___
___
Python-bugs-list mailing list
Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue23582] multiprocessing.Queue.get is not getting all the items in the queue
Roberto Martínez added the comment:

I think you misunderstood my explanation; my English is not very good, sorry. I think my previously attached file (mpqueuegetwrong.py) shows it better than my words can. In that file you can see that I am not calling qsize() or get_nowait(), only put() and get(), and the behavior I am reporting involves only those methods (and I believe it is not documented).

--

___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue23582
___
[issue23582] multiprocessing.Queue.get is not getting all the items in the queue
Roberto Martínez added the comment:

That's not my point. To my understanding, when you put an item in the queue, the item *must* be available to get on the next call. So I think put() should block until the item is really in the queue, so that a subsequent get() *always* returns it, and not only some of the time (depending on the item size). If this is not the expected behavior, I think it should be warned about in the put()/get() documentation.

--

___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue23582
___
[issue23582] multiprocessing.Queue.get is not getting all the items in the queue
New submission from Roberto Martínez:

Yesterday we hit a bug in a project, and after a few hours of investigation we found bad/undocumented behavior in multiprocessing.Queue. If you put one or more large items in a queue, there is a delay between when put() returns and when the item is finally available in the queue. This is reasonable given the underlying thread and pipe, but that is not the problem. The problem is that Queue.qsize() reports the number of items put, while Queue.empty() returns True and Queue.get() raises Empty. So the only safe way to get all the items is as follows:

    while q.qsize():
        try:
            item = q.get_nowait()
        except Empty:
            pass

which is not very nice. I attach a sample file reproducing the behavior: a single process puts 100 elements in a Queue and afterwards tries to get all of them. I tested in Python 2.7.9 and 3.4 with the same result (Python 3 seems a little faster, so you need to enlarge the items). Also, if you wait between get()s, the process is able to retrieve all the items.

--
files: mpqueuegetwrong.py
messages: 237178
nosy: Roberto Martínez
priority: normal
severity: normal
status: open
title: multiprocessing.Queue.get is not getting all the items in the queue
type: behavior
versions: Python 2.7, Python 3.4
Added file: http://bugs.python.org/file38327/mpqueuegetwrong.py

___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue23582
___