Hello,

I thought of another reason to resize the hash table when it has too
few elements. It's not only a matter of memory usage; it's also a
matter of time usage: iterating over a set or a dict requires scanning
the whole table. For example, iteration over a set which once had
1,000,000 members and now has 2 can take 1,000,000 operations every
time you traverse all the (2) elements.
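The effect Noam describes is easy to observe in CPython, where (as an
implementation detail worth checking on your own version) discard() never
shrinks a set's internal table; only the s = set(s) rebuild mentioned later
in the thread produces a right-sized copy. A minimal sketch:

```python
import sys

# Build a set of 1,000,000 elements, then discard all but two.
s = set(range(1_000_000))
for i in range(2, 1_000_000):
    s.discard(i)

# CPython detail (an assumption worth verifying on your version):
# discard() never shrinks the hash table, so this 2-element set still
# carries a table sized for 1,000,000 entries, and iterating over it
# must scan every slot of that table.
rebuilt = set(s)  # the s = set(s) trick: allocates a right-sized table

print(len(s), sys.getsizeof(s))              # 2 elements, a huge table
print(len(rebuilt), sys.getsizeof(rebuilt))  # 2 elements, a tiny table
```

The rebuilt copy has the same two members but only a handful of slots,
so both iteration and memory cost drop back to what a 2-element set
should cost.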
In a message of Sat, 31 Dec 2005 15:41:50 +1000, Nick Coghlan writes:

Ian Bicking wrote:
> Anyway, another even more expedient option would be setting up a
> separate bug tracker (something simpler to submit to than SF) and
> putting a link on the bottom of every page, maybe like:
[Noam]
> For example, iteration over a set which once had
> 1,000,000 members and now has 2 can take 1,000,000 operations every
> time you traverse all the (2) elements.

[Raymond Hettinger]
Do you find that to be a common or plausible use case?

Was Guido's suggestion of s=set(s) unworkable for some reason? dicts
and sets emphasize fast lookups over fast iteration -- apps requiring
many iterations over a collection may be better off converting to a
list (which has no dummy entries or empty gaps between entries).
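Raymond's advice can be sketched roughly as follows; the timings are
illustrative only, and the shrunken-set behavior being relied on is a
CPython implementation detail, not a language guarantee:

```python
from timeit import timeit

# A sparse set: it once held 100,000 elements, now holds 2, but (in
# CPython) its internal table is still sized for 100,000 entries.
sparse = set(range(100_000))
for i in range(2, 100_000):
    sparse.discard(i)

# A list has no dummy entries or empty gaps to skip during iteration.
compact = list(sparse)

sparse_time = timeit(lambda: sum(1 for _ in sparse), number=100)
compact_time = timeit(lambda: sum(1 for _ in compact), number=100)

print(f"iterating the sparse set: {sparse_time:.4f}s")
print(f"iterating the 2-item list: {compact_time:.4f}s")
```

Iterating the sparse set scans every slot of its oversized table on
each pass, so an app that iterates far more often than it looks up
membership is better served by the list conversion.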
Nick Coghlan wrote:
> Anyway, another even more expedient option would be setting up a
> separate bug tracker (something simpler to submit to than SF) and
> putting a link on the bottom of every page, maybe like:
> http://trac.python.org/trac/newticket?summary=re:+/path/to/doc&component=docs
> -- heck,
[Noam]
> For example, iteration over a set which once had
> 1,000,000 members and now has 2 can take 1,000,000 operations every
> time you traverse all the (2) elements.

[Raymond Hettinger]
> Do you find that to be a common or plausible use case?

[Noam]
I don't have a concrete example in this minute, but a
[Fernando Perez]
Note that this is not a comment on the current discussion per se, but
rather a small request/idea in the docs department: I think it would be
a really useful thing to have a summary page/table indicating the
complexities of the various operations on all the builtin types,
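A table like the one Fernando asks for would record facts such as:
membership tests cost O(n) on a list but O(1) on average on a set. That
difference can be made observable by counting equality comparisons; the
`Counted` wrapper class below is purely illustrative, not anything from
the stdlib:

```python
class Counted:
    """Wraps an int and counts equality comparisons made against it."""
    eq_calls = 0  # class-wide counter shared by all instances

    def __init__(self, value):
        self.value = value

    def __hash__(self):
        return hash(self.value)

    def __eq__(self, other):
        Counted.eq_calls += 1
        return isinstance(other, Counted) and self.value == other.value

items = [Counted(i) for i in range(1000)]
as_list = list(items)
as_set = set(items)
probe = Counted(-1)  # not present in either container

Counted.eq_calls = 0
found_in_list = probe in as_list
list_comparisons = Counted.eq_calls  # one comparison per element

Counted.eq_calls = 0
found_in_set = probe in as_set
set_comparisons = Counted.eq_calls   # hash rules out every slot first

print(list_comparisons, set_comparisons)
```

The failed lookup compares against all 1000 list elements, while the
set lookup rejects every candidate slot by hash alone and never needs
to call `__eq__` at all here; that asymmetry is exactly what a
complexity summary page would document.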
Patch / Bug Summary
___________________

Patches : 382 open ( +3) / 3003 closed ( +1) / 3385 total ( +4)
Bugs    : 903 open (-11) / 5479 closed (+27) / 6382 total (+16)
RFE     : 203 open ( -1) /  195 closed ( +2) /  398 total ( +1)

New / Reopened Patches
______________________