Tim Peters added the comment:

Nice to see you, Jurjen!  Been a long time :-)

I'd like to see changes here too.  It's unclear what "a lazy version"  is 
intended to mean, exactly, but I agree the actual behavior is surprising, and 
that mpool.py is a lot less surprising in several ways.

I got bitten by this just last week, when running a parallelized search over a 
massive space _expected_ to succeed after exploring a tiny fraction of the 
search space.  Ran out of system resources because imap_unordered() tried to 
queue up countless millions of work descriptions.  I had hoped/expected that it 
would interleave generating and queueing "a few" inputs with retrieving 
outputs, much as mpool.py does.

In that case I switched to using apply_async() instead, interposing my own 
bounded queue (a collections.deque used only in the main program) to throttle 
the main program.  I'm still surprised it was necessary ;-)
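The throttling pattern described above can be sketched roughly like so.  This is not the actual code from that search program, just a minimal illustration of the idea: feed apply_async() through a small bounded collections.deque so only a handful of tasks are ever queued, instead of letting imap_unordered() exhaust the input generator up front.  It uses multiprocessing.dummy (the thread-backed Pool with the same API) only so the sketch is self-contained; real CPU-bound work would use multiprocessing.Pool.

```python
from collections import deque
from multiprocessing.dummy import Pool  # same API as multiprocessing.Pool

def work(x):
    return x * x

def throttled_map(pool, func, inputs, limit=8):
    """Yield results in submission order, keeping at most `limit`
    tasks in flight, so `inputs` is consumed lazily."""
    pending = deque()
    for x in inputs:
        pending.append(pool.apply_async(func, (x,)))
        if len(pending) >= limit:
            # Block on the oldest task before pulling more input.
            yield pending.popleft().get()
    while pending:
        yield pending.popleft().get()

with Pool(4) as pool:
    results = list(throttled_map(pool, work, range(20)))
```

Because the deque lives entirely in the main program, the input generator is never advanced more than `limit` items past the results already retrieved - which is the behavior one might have expected from imap_unordered() in the first place.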

----------
nosy: +tim.peters

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue19993>
_______________________________________