Tim Peters <[email protected]> added the comment:
If your `bucket` has 30 million items, then
    for element in bucket:
        executor.submit(kwargs['function']['name'], element, **kwargs)
is going to create 30 million Future objects (and all the under-the-covers
objects needed to manage their concurrency) just as fast as the main thread can
create them. Nothing in your code waits for anything to finish until after
they've _all_ been created and queued up under the covers.
So your producer is running vastly faster than your consumers can keep up with.
It's the huge backlog of pending work items that consumes the RAM. To slash
RAM, you need to craft a way to interleave creating new work items with giving
consumers time to deal with them.
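
One common way to do that (not from the original report; the names MAX_PENDING,
process, and throttled_map below are illustrative) is to throttle the producer
with a bounded semaphore, so only a limited number of futures can be
outstanding at any moment. A minimal sketch, assuming a ThreadPoolExecutor and
a one-argument worker function:

    import threading
    from concurrent.futures import ThreadPoolExecutor

    MAX_PENDING = 1000  # illustrative cap on outstanding work items

    def process(element):
        ...  # stand-in for the real per-element work

    def throttled_map(bucket):
        sem = threading.BoundedSemaphore(MAX_PENDING)
        with ThreadPoolExecutor() as executor:
            for element in bucket:
                sem.acquire()  # blocks the producer once MAX_PENDING items are pending
                future = executor.submit(process, element)
                # free one slot as soon as this work item finishes
                future.add_done_callback(lambda f: sem.release())

With that, at most MAX_PENDING Future objects exist at any time, so memory use
stays roughly constant no matter how many items bucket holds.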
----------
nosy: +tim.peters
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue34168>
_______________________________________