New submission from Roberto Martínez:
We faced a bug in a project yesterday, and after a few hours of investigation we
found bad/undocumented behavior in multiprocessing.Queue.
If you put one or more large items in a queue, there is a delay between the
moment put() is executed and the moment the item is finally available in the
queue. This is reasonable because of the underlying feeder thread and pipe, but
that is not the problem.
The problem is that while Queue.qsize() reports the number of items put,
Queue.empty() returns True and Queue.get() raises Empty.
So, the only safe method to get all the items is as follows:
while q.qsize():
    try:
        item = q.get_nowait()
    except Empty:
        pass
Which is not very nice.
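For reference, a self-contained sketch of this draining workaround (the drain()
helper and the payload size are my own illustration, not part of the attached
file; note also that qsize() is not implemented on all platforms, e.g. macOS):

```python
import multiprocessing
from queue import Empty  # the exception raised by get_nowait()

def drain(q):
    """Drain a multiprocessing.Queue while the feeder thread may still
    be flushing items through the underlying pipe."""
    items = []
    # qsize() counts puts immediately, so keep polling until every
    # counted item has actually come through the pipe.
    while q.qsize():
        try:
            items.append(q.get_nowait())
        except Empty:
            pass  # counted by qsize() but not yet readable; retry
    return items

if __name__ == "__main__":
    q = multiprocessing.Queue()
    payload = "x" * 100000  # large items make the feeder delay visible
    for i in range(100):
        q.put((i, payload))
    got = drain(q)
    print(len(got))
```

On an affected version, a plain loop of 100 blocking-free get() calls can raise
Empty partway through, while this polling loop eventually retrieves all 100.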
I attach a sample file reproducing the behavior: a single process puts 100
elements in a Queue and after that tries to get all of them. I tested in
Python 2.7.9 and 3.4 with the same result (Python 3 seems a little faster,
so you need to enlarge the items). Also, if you wait between get() calls,
the process is able to retrieve all the items.
----------
files: mpqueuegetwrong.py
messages: 237178
nosy: Roberto Martínez
priority: normal
severity: normal
status: open
title: multiprocessing.Queue.get is not getting all the items in the queue
type: behavior
versions: Python 2.7, Python 3.4
Added file: http://bugs.python.org/file38327/mpqueuegetwrong.py
_______________________________________
Python tracker <[email protected]>
<http://bugs.python.org/issue23582>
_______________________________________