On Thu, Oct 30, 2008 at 2:40 PM, bob gailer <[EMAIL PROTECTED]> wrote:
> Dinesh B Vadhia wrote:
>>
>> I need to process a large number (> 20,000) of long and variable length
>> lists (> 5,000 elements) ie.
>>
>>     for element in long_list:
>>         <do something with element>  # the result of this operation is not a list
>>
>> The performance is reasonable but I wonder if there are faster Python
>> methods?
You might try using dictionaries instead. I've had phenomenal speed gains by switching lists to dictionaries before, although that may have had more to do with the fact that I needed to access certain values, rather than iterating through them in sequential order like you're doing.

It seems counter-intuitive because a dictionary has a key and a value, and you really only need the key (you can leave all the values blank, or set them to None or something), but it's a lot faster to find an element in a dictionary than in a list: a dictionary is a hash table, so a lookup hashes the key and jumps straight to it (roughly constant time on average), while searching a list scans elements one by one. The same idea applies to hashes versus arrays in Perl, etc. Note that this helps membership tests and lookups, not plain sequential iteration, which is what your loop does.

Shawn
_______________________________________________
Tutor maillist - [email protected]
http://mail.python.org/mailman/listinfo/tutor
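The lookup speed-up Shawn describes is easy to demonstrate with a membership test. A minimal sketch (the names and sizes below are made up for illustration; a set would do the same job when you don't need values):

```python
# Compare membership tests in a list vs. a dict of the same keys.
import timeit

n = 5000
long_list = list(range(n))
lookup = dict.fromkeys(long_list)  # values are all None; only the keys matter

# "x in long_list" scans the list element by element (linear time);
# "x in lookup" hashes x and checks one bucket (constant time on average).
list_time = timeit.timeit(lambda: n - 1 in long_list, number=1000)
dict_time = timeit.timeit(lambda: n - 1 in lookup, number=1000)

print("list:", list_time)
print("dict:", dict_time)
```

For sequential iteration like the original `for element in long_list:` loop, though, there is no such gain, since both structures are walked element by element.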
