On Tue, Mar 20, 2012 at 8:34 AM, Guido van Rossum <gu...@python.org> wrote:

> Anyway, I also tried to imply that it matters if the number of list
> items would ever be huge. It seems that is indeed possible (even if
> not likely) so I think iterators are useful.

But according to Nick's post, some sort of uniquification is done, and the
algorithm currently used computes the whole list anyway.

I suppose that one could do the uniquification lazily, or find some other
way to avoid that computation.  Is it worth it to optimize an unlikely case?
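
For what it's worth, the lazy version could be as simple as a generator that
tracks what it has already yielded -- a sketch only, assuming the items are
hashable (the name `unique` is mine, not anything from the thread):

```python
def unique(iterable):
    """Yield items from iterable in order, skipping duplicates,
    without materializing the full result list up front."""
    seen = set()
    for item in iterable:
        if item not in seen:
            seen.add(item)
            yield item

# Consumers that stop early never pay for the rest of the input:
# next(unique(huge_iterable)) touches only the first item.
```

Whether the bookkeeping set is cheaper than just building the list once is
exactly the "is it worth it" question, of course.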
_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev