At 9:25 PM +0200 10/21/02, allan wrote:
> i do really need to store those values somehow, because eventually
> these need to be sorted for their final HTML output
>
> all this is behaving very well and fast on sites with few links (less
> than 2000), but when i run the script on a large site (more than 8000
> unique links) i can see the memory is increasing dramatically, so much
> that i can't even stop the program normally. [a couple of control-c's
> will do]
>
> i am not sure whether the problem is in the hash-storing itself or the
> post-sorting. but if i monitor the memory consumption while pulling
> the links/urls i can clearly see it does increase steadily

One thing you can do is actually check to see how much memory's getting used by the various data structures, which may give you an idea of where the problem lies. You can use the total_size function in the Devel::Size module to see how big a hash is (with all its component parts). It might be that you're storing more data than you think you are.
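For instance, something along these lines (a rough sketch; the %links hash and its layout are just stand-ins for whatever you're actually storing):

    use Devel::Size qw(total_size);

    # stand-in for the real data: URL => [ count, title ]
    my %links = map { "http://example.com/page$_" => [ $_, "title $_" ] }
                1 .. 10_000;

    # total_size() walks the keys, values, and anything they reference,
    # and reports the total bytes Perl has allocated for the structure
    printf "links hash: %d bytes\n", total_size(\%links);

Comparing that number against what you'd expect for the raw data usually tells you pretty quickly whether the hash itself is the culprit.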

It's also possible you're tripping over a bug here or there. Are you using any large temporary data structures inside closures? There are some leaks associated with that sort of thing.
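One shape to look for is a closure that captures a big temporary structure and keeps it alive longer than you'd expect, roughly like this (gather_links() here is just a hypothetical stand-in for your link-collecting code):

    sub make_sorter {
        my @all_links = gather_links();   # big temporary list
        # the anonymous sub closes over @all_links, so the whole
        # list stays alive for as long as the returned sub does
        return sub { sort @all_links };
    }

If the temporary list is only needed to build something smaller, copying out just what you need before the closure is created lets the big structure be freed.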
--
Dan

--------------------------------------"it's like this"-------------------
Dan Sugalski                          even samurai
[EMAIL PROTECTED]                     have teddy bears and even
                                      teddy bears get drunk
