Amaury Forgeot d'Arc <[EMAIL PROTECTED]> added the comment:

The slowdown is caused by the garbage collector, which has more and more objects to traverse (the tuples). If I add "import gc; gc.disable()" at the beginning of your script, it runs much faster, and the timings look linear.
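A hypothetical micro-benchmark (not the original reporter's script, which is not shown here) that illustrates the point: build a large dict of small tuples with the cyclic collector enabled, then again with it disabled.

```python
# Hypothetical micro-benchmark: building a large dict of small tuples
# with the cyclic garbage collector enabled vs. disabled.
import gc
import time

def build(n):
    # Every tuple is tracked by the gc when created, so the collector
    # has more and more objects to traverse as the dict grows.
    return {i: (i, i + 1) for i in range(n)}

def timed_build(n, use_gc):
    was_enabled = gc.isenabled()
    if not use_gc:
        gc.disable()
    try:
        start = time.perf_counter()
        d = build(n)
        return len(d), time.perf_counter() - start
    finally:
        if was_enabled:
            gc.enable()  # restore the original gc state

size, t_on = timed_build(200_000, use_gc=True)
_, t_off = timed_build(200_000, use_gc=False)
print(size)  # 200000
print(f"gc on: {t_on:.3f}s, gc off: {t_off:.3f}s")
```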
Martin's sample is not affected because there are very few deallocations, so a gc collection is never triggered.

Disabling the gc may not be a good idea in a real application; I suggest playing with the gc.set_threshold function and setting larger values, at least while building the dictionary. (700, 1000, 10) seems to yield good results.

----------
nosy: +amaury.forgeotdarc

__________________________________
Tracker <[EMAIL PROTECTED]>
<http://bugs.python.org/issue2607>
__________________________________
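A minimal sketch of the threshold approach: raise the thresholds while building the dictionary, then restore the previous values. The helper name is made up for illustration; (700, 1000, 10) are the values suggested above, and CPython's defaults are (700, 10, 10).

```python
# Sketch (hypothetical helper): relax the gc thresholds during a bulk
# build, then restore whatever was configured before.
import gc

def build_with_relaxed_gc(n):
    old = gc.get_threshold()
    # Collect the older generations far less often while allocating.
    gc.set_threshold(700, 1000, 10)
    try:
        return {i: (i, i + 1) for i in range(n)}
    finally:
        gc.set_threshold(*old)  # restore the previous thresholds

d = build_with_relaxed_gc(100_000)
print(len(d))  # 100000
```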