On Wed, Oct 7, 2020 at 6:24 PM Caleb Donovick <donov...@cs.stanford.edu>
wrote:

> Itertools.count was an example (hence the use of "e.g.") of an iterator
> which can be efficiently
> advanced without producing intermediate state. Clearly anyone can advance
> it manually.
> My point is that an iterator may have an efficient way to calculate its
> state some point in the future
> without needing to calculate the intermediate state.
>

Yes, this is technically an example.  But this doesn't get us any closer to
a real-world use case.  If you want an iterator that counts from N, the
spelling `count(N)` exists now.  If you want to start counting N
elements later than wherever you are now, I guess do:

new_count = count(next(old_count) + N)
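Spelled out with today's itertools (a sketch; `old_count` and `N` are just
illustrative names):

```python
from itertools import count

old_count = count(10)  # an existing counter, currently at 10
N = 5

# "advance" by N: read the current value, start a fresh counter past it
new_count = count(next(old_count) + N)
print(next(new_count))  # 15
```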

For example the fibonacci sequence has a closed
> form formula for the nth element and hence could be advanced efficiently.
>

Sure.  And even more relevantly, if you want the Nth Fibonacci you can
write a closed-form function `nth_fib()` to get it.  These are toy examples
where *theoretically* a new magic method could be used, but they're not close
to a use case that would motivate changing the language.
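For the record, that closed form is Binet's formula; a minimal sketch of the
`nth_fib()` function mentioned above (exact only up to roughly n=70, where
float precision runs out):

```python
from math import sqrt

def nth_fib(n: int) -> int:
    # Binet's formula: F(n) = phi**n / sqrt(5), rounded to the nearest integer
    phi = (1 + sqrt(5)) / 2
    return round(phi**n / sqrt(5))

print(nth_fib(10))  # 55
```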

> ```
> def get_tcp_headers(stream: Iterator[Byte]):
>     while stream:
>         # Move to the total length field of the IP header
>         stream.advance(2)
>         # record the total length (two bytes)
>         total_length = ...
>         # skip the rest of IP header
>         stream.advance(28)
>         # record the TCP header
>         header = ...
>         yield header
>         stream.advance(total_length - 32 - len(header))
> ```
>

This is BARELY more plausible as a real-world case.  Throwing away 28 bytes
with a bunch of next() calls takes trivial time.  A case where
some implementation could conceivably save measurable time would require
skipping hundreds of thousands or millions of next() calls... and probably
calls that actually take some work to compute, for it to matter even there.

What you'd need to motivate the new API is a case where you might skip a
million items in an iterator, and yet the million-and-first item is
computable without computing all the others.  Ideally something where each
of those million calls does something more than just copy a byte from a
kernel buffer.

I don't know that such a use case does not exist, but nothing comes to my
mind, and no one has suggested one in this thread.  Otherwise,
itertools.islice() completely covers the situation already.
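For reference, the standard idiom for skipping ahead (the `consume` recipe
from the itertools docs) already handles the TCP-header example above:

```python
from itertools import count, islice

it = count(0)

# Advance the iterator n positions: islice(it, n, n) consumes n items
# from the underlying iterator and yields nothing.
next(islice(it, 1_000_000, 1_000_000), None)

print(next(it))  # 1000000
```

It still calls next() a million times under the hood, of course, which is
exactly the cost the proposed method would try to avoid.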

-- 
The dead increasingly dominate and strangle both the living and the
not-yet born.  Vampiric capital and undead corporate persons abuse
the lives and control the thoughts of homo faber. Ideas, once born,
become abortifacients against new conceptions.
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/Y72YS2KPEZ5IAO3ES3LWRQN66M6FLTNX/
Code of Conduct: http://python.org/psf/codeofconduct/
