I am parsing 120K of XML into a document and then running
def get_nodes(node, namespace)
  # XPath search relative to self, binding the "dn" prefix to the given namespace
  self.find("./dn:#{node}", "dn:#{namespace}")
end
several times.
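For reference, my test driver is set up roughly like this (the file name,
element name, and namespace URI below are simplified placeholders, and
get_nodes is opened up on XML::Node):

require 'rubygems'
require 'xml/libxml'

class XML::Node
  def get_nodes(node, namespace)
    self.find("./dn:#{node}", "dn:#{namespace}")
  end
end

# Build the document once from the ~120K string
parser = XML::Parser.new
parser.string = File.read('test.xml')
doc = parser.parse

# Run the same query repeatedly
1000.times do
  doc.root.get_nodes('entry', 'http://example.com/ns')
end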
Memory usage for my test driver sits at 20 megs if I run get_nodes fewer than
10 times. If I run get_nodes 1000 times, memory usage jumps from 20 megs to
around 140 megs and does not come back down until the process exits. If I
force a GC.start at the end of each loop iteration I can keep the memory
usage down, but that is not practical in the real world, where I need this
code to be at least somewhat fast.
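The GC.start variant that keeps memory flat looks like this (same placeholder
names as above):

1000.times do
  doc.root.get_nodes('entry', 'http://example.com/ns')
  GC.start  # full collection each iteration holds memory near 20 megs, but is far too slow
end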
I only build the document once for the entire duration of the test program,
so parsing the large string itself should not be the problem.
Any ideas as to why my memory usage grows and then never comes down?
I am running ruby 1.8.6 and libxml-ruby 0.8.3 with libxml 2.6.32.
Thank you,
Matt Margolis