drieux wrote:

> Malloc from the Top
> Free From the Bottom
>
> and whip out some sort of 'initializer' function that
> will if the thingiePoo is non-Null start out by calling
> the tree_traverser() that will decend the complex data
> structure making sure that the leaf nodes can be tossed safely
> till one gets to the top of the tree and it all be gone bye-bye!
> At which point the initializer() will start hanging good stuff.
> { some cultists use the terms 'constructors' and 'destructors'
>   find the language you feel safest with }
I had a few chilling moments just now when I thought that all of this was necessary. I even created a function to do this recursive "say bye-bye to each individual leafy-poo" routine:

    delete_kids($hash_ref);

    sub delete_kids {
        my $hash_ref = shift;
        foreach (keys %$hash_ref) {
            delete_kids($hash_ref->{$_});
            delete $hash_ref->{$_};
        }
    }

... because I saw in my Task Manager that Perl was still holding onto the memory it had acquired for a large multi-dimensional hash [actually, I kept it to three dimensions, since I was using base 26 ('a'..'z')].

Then I got this weird inkling that m-maybe this wouldn't be necessary after all. Maybe the system simply wasn't so churlish, or resource-starved, as to need to reclaim memory from a program that had needed it in the past, and might very well need it again. I think that is what happened here:

Greetings!

    C:\>perl -w
    my $string = '';
    my $hash_ref = {};

    make_strings($hash_ref, $string);

    sub make_strings {
        my ($hash_ref, $start_string) = @_;
        return if length($start_string) > 3;
        for ('a'..'z') {
            my $string = $start_string . $_;
            $hash_ref->{$string} = {};
            make_strings($hash_ref->{$string}, $string);
        }
    }

    $hash_ref = {};

    while (my $input = <STDIN>) {
        last if $input eq "\n";
        $hash_ref->{$input} = 1;
        print $input;
    }

    make_strings($hash_ref, $string);

    while (my $input = <STDIN>) {
        last if $input eq "\n";
        $hash_ref->{$input} = 1;
        print $input;
    }
    ^Z
    Now I am in the pause mode. Memory usage stands at 73 MB, with no
    reduction in memory use. [echoed]
    I will now restart the memory-grabbing process ...
    # [Here I pressed the Enter key alone, to break out of the first loop]
    It took only a few seconds this time for the algorithm to complete. ...
    Memory usage for Perl has climbed only slightly, to 73,992 KB from
    about 73,200 KB. ...
    The tree has been completely reconstructed, with minimal further
    demands on the system. ...
    This is an optimization, I am sure, of the NT memory manager.
    This is an optimization, I am sure, of the NT memory manager. [echo retained]

Does the difference between the two memory readings represent a memory leak? I think that is unlikely. If it were a true memory leak, it would have taken a much greater toll. More likely, the OS simply lets the program keep memory that it seems to need, thus sparing the overhead of further requests.

The second round of loading this hash certainly went much more quickly, since Perl could use its own internal allocation routines.

Joseph
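P.S. For anyone who wants to try this without retyping the interactive session above, here is a minimal, self-contained sketch of the same idea. It builds a (much smaller) three-level tree keyed on 'a'..'z', drops it, and builds it again. The timing is only a rough illustration; Time::HiRes is assumed to be installed, and the exact numbers, like the Task Manager readings, will vary from system to system.

    use strict;
    use warnings;
    use Time::HiRes qw(gettimeofday tv_interval);

    # Build a nested hash with 26 keys at each of three levels --
    # the same general shape as make_strings() above, only smaller.
    sub build {
        my $h = {};
        for my $x ('a' .. 'z') {
            for my $y ('a' .. 'z') {
                for my $z ('a' .. 'z') {
                    $h->{$x}{$y}{$z} = 1;
                }
            }
        }
        return $h;
    }

    my $t0    = [gettimeofday];
    my $tree  = build();              # first build: Perl asks the OS for memory
    my $first = tv_interval($t0);

    $tree = undef;                    # drop the only reference; the nodes go
                                      # back into Perl's own free pool rather
                                      # than to the OS, so the process size
                                      # stays about the same

    $t0   = [gettimeofday];
    $tree = build();                  # second build: reuses Perl's own pool
    my $second = tv_interval($t0);

    printf "first build: %.4f s   second build: %.4f s\n", $first, $second;

Watching the process in Task Manager while this runs should show the same pattern as the session above: the working set grows on the first build, does not shrink after the undef, and barely moves on the second build.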