>>>>> "DB" == Duane Bronson <[email protected]> writes:
DB> Of course, the real winner is just to append to the string instead
DB> of the beginning, but sometimes that's just not feasible.  The
DB> strange (i.e., cool but unexpected) thing about this is that after
DB> appending, it optimizes the variables so everything works faster.
DB> I had to rewrite the benchmark tests to use a random string each
DB> iteration otherwise the numbers were thrown off.  I illustrate
DB> this in my two timings at the bottom which should be identical,
DB> but the wtf() function throws them off balance.

that is likely because of how perl allocates ram. when a buffer (string,
array, or hash buckets) gets too full, perl reallocates it at twice the
size and copies over the previous data. this reallocation is costly and
gets triggered at points you can't easily control. but once it happens,
additions after that are faster until you again run out of space. this
is why i said string sizes (both original and prepended) matter a great
deal when benchmarking this kind of thing.
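if you want to watch that happening, here is a rough sketch of the kind
of thing i mean (untested, and it assumes a perl recent enough that the
core B module exposes CUR and LEN on string svs -- CUR is the bytes in
use, LEN is the bytes malloc'ed for the buffer):

#!/usr/bin/perl
use strict;
use warnings;

use B ();    # core module; gives read access to the sv's internal fields

# append to a string in small chunks and report every time the allocated
# buffer (LEN) changes size. between jumps, appends just fill the spare
# room in the buffer and no reallocation (or copy) happens.

my $str      = 'x';
my $prev_len = 0;

for my $i ( 1 .. 5_000 ) {

    $str .= 'x' x 10;

    my $sv = B::svref_2object( \$str );

    if ( $sv->LEN != $prev_len ) {
        printf "append %5d:  CUR = %7d   LEN = %7d   (buffer reallocated)\n",
            $i, $sv->CUR, $sv->LEN;
        $prev_len = $sv->LEN;
    }
}

the LEN column should only move in occasional jumps while CUR climbs
steadily, which is the reallocate-and-copy cost showing up in lumps. the
exact growth policy depends on your perl version and its malloc, so
don't take the doubling too literally -- the point is that appends
between the jumps are cheap and the jumps are not.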
uri

-- 
Uri Guttman  ------  [email protected]  --------  http://www.sysarch.com --
-----  Perl Code Review , Architecture, Development, Training, Support ------
---------  Gourmet Hot Cocoa Mix  ----  http://bestfriendscocoa.com ---------

_______________________________________________
Boston-pm mailing list
[email protected]
http://mail.pm.org/mailman/listinfo/boston-pm
