Raymond Hettinger wrote:

> Was Guido's suggestion of s=set(s) unworkable for some reason?  dicts
> and sets emphasize fast lookups over fast iteration -- apps requiring
> many iterations over a collection may be better off converting to a list
> (which has no dummy entries or empty gaps between entries).
> 
> Would the case be improved by incurring the time cost of 999,998 tests
> for possible resizing (one for each pop) and some non-trivial number of
> resize operations along the way (each requiring a full-iteration over
> the then current size)?

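To make the quoted s=set(s) suggestion concrete, here is a quick sketch
(illustrative only: whether a set's internal table ever shrinks on deletion
is a CPython implementation detail):

    import sys

    s = set(range(1000000))
    for i in range(999990):
        s.discard(i)        # discards leave dummy slots; the table keeps its peak size

    print(len(s), sys.getsizeof(s))    # 10 elements, but still a very large table
    s = set(s)                         # rebuild: a fresh table sized for 10 elements
    print(len(s), sys.getsizeof(s))
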
Note that this is not a comment on the current discussion per se, but rather a
small request/idea in the docs department: I think it would be really useful to
have a summary page/table listing the time complexities of the various
operations on all the builtin types, including at least a _mention_ of
subtleties and semi-gotchas.

Python is growing in popularity, and it is being used for ever more demanding
tasks.  Such a 'complexity overview' of the language's performance would, I
think, be very valuable to many.  I know that much of this information is
already available in scattered form, but I'm talking about a single, dedicated
summary, one which also discusses things like Noam's issue.

For example, I had never realized that for some O(N) dict operations
(iteration, for instance), N means "largest size in the dict's history" rather
than "current number of elements".  While I'm not arguing for any changes, I
think it's good to _know_ this, so I can plan for it if I'm ever in a
situation where it may be a problem.
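A quick sketch of that gotcha (again illustrative; the exact behaviour and
the timings depend on the CPython version at hand):

    import timeit

    big = dict.fromkeys(range(1000000))
    for k in range(999990):
        del big[k]          # deletions leave the table at its peak size

    small = dict.fromkeys(range(999990, 1000000))  # the same 10 keys, small table

    # Iteration has to scan every slot of the underlying table, so the
    # formerly-huge dict is far slower to walk even though len(big) == 10.
    print(timeit.timeit(lambda: list(big), number=100))
    print(timeit.timeit(lambda: list(small), number=100))

    big = dict(big)         # rebuilding compacts the storage
    print(timeit.timeit(lambda: list(big), number=100))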

Just my 1e-2.

And Happy New Year to the python-dev team, with many thanks for all your
fantastic work on making the most pleasant, useful programming language out
there.

Cheers,

f
