This is closely related to https://bugs.python.org/issue40230 -
"Itertools.product() Out of Memory Errors."
`itertools.product()` completely consumes all input iterables before yielding
any values, which can cause memory issues in certain (extreme) cases.
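To make that concrete, here is a small illustration (the function name is just for the example): the inputs are pulled into memory as soon as `product()` is called, before a single output tuple has been requested.
```
import itertools

def big_input():
    # Imagine this yields an enormous (but finite) number of values.
    for i in range(3):
        print(f"producing {i}")
        yield i

# Both generators are fully consumed the moment product() is called,
# before any output tuple has been asked for.
pairs = itertools.product(big_input(), big_input())
print(next(pairs))  # every "producing ..." line was already printed above
```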
I recently needed to partially iterate through the product of two very large
iterators that were themselves power sets of some input values. For unlucky
inputs these iterators were enormous (though not infinite), and
`itertools.product()` used up all the memory on my machine, even though I
sometimes only needed the first few values.
My solution was to write my own version of `itertools.product()` that takes
generator functions (callables returning fresh iterators) instead of
iterables. This allowed me to recreate and loop through the required iterables
as many times as needed without ever storing all of their values. Product
tuples are produced immediately, and the iteration can complete for large
inputs without runaway memory consumption.
I'm proposing that equivalent functions accepting such callables, perhaps
`fproduct()`, `fcombinations()`, etc., be added to `itertools`.
e.g. a simplified `fproduct` for two inputs:
```
def fproduct(f1, f2):
    # f1 and f2 are callables that return a fresh iterator each time they
    # are called, so the inner input is re-created for every outer value
    # rather than stored.
    for a in f1():
        for b in f2():
            yield (a, b)
```
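For example (the input functions here are hypothetical, just to show the calling convention), the callables rebuild their iterators on demand, so the first pair is available immediately and nothing is materialised:
```
def evens():
    for i in range(0, 10**12, 2):
        yield i

def odds():
    for i in range(1, 10**12, 2):
        yield i

pairs = fproduct(evens, odds)
print(next(pairs))  # (0, 1) -- produced straight away, no huge tuples stored
```
The inner callable is simply called again for each outer value, trading repeated computation for memory.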
Would others find this useful? Are there any drawbacks I'm missing?
Best,
Alastair