I am working on a project that uses SAX2XMLReader to parse XML
documents, and I am noticing that the parser consumes a large amount of
memory, steadily increasing at roughly 200 KB per second. This is a
problem because the product is expected to parse very large data
sources.
I thought SAX avoided this kind of memory consumption, since it is
event driven and simply notifies my custom ContentHandler as each XML
element is encountered. Is there something I am missing here?
Should SAX2XMLReader really be using this much memory? For example,
while parsing an ~45 MB XML document, memory usage rose to ~14 MB.
This is not good, because the program may need to handle documents
that are several hundred megabytes in size.
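
For reference, here is a minimal sketch of the kind of setup I am
describing (a Xerces-C++ SAX2XMLReader driving a bare-bones
DefaultHandler; the CountingHandler class and the command-line usage
are just placeholders, not my actual code):

#include <iostream>

#include <xercesc/util/PlatformUtils.hpp>
#include <xercesc/util/XMLString.hpp>
#include <xercesc/sax2/SAX2XMLReader.hpp>
#include <xercesc/sax2/XMLReaderFactory.hpp>
#include <xercesc/sax2/DefaultHandler.hpp>
#include <xercesc/sax2/Attributes.hpp>

using namespace xercesc;

// Bare-bones handler: counts start tags and releases every transcoded
// string, so the handler itself holds no growing state.
class CountingHandler : public DefaultHandler
{
public:
    CountingHandler() : elementCount(0) {}

    virtual void startElement(const XMLCh* const /*uri*/,
                              const XMLCh* const localname,
                              const XMLCh* const /*qname*/,
                              const Attributes& /*attrs*/)
    {
        char* name = XMLString::transcode(localname);
        ++elementCount;
        XMLString::release(&name);   // transcoded buffers must be released
    }

    unsigned long elementCount;
};

int main(int argc, char* argv[])
{
    if (argc < 2) {
        std::cerr << "usage: " << argv[0] << " <file.xml>\n";
        return 1;
    }

    XMLPlatformUtils::Initialize();
    {
        SAX2XMLReader* parser = XMLReaderFactory::createXMLReader();
        CountingHandler handler;
        parser->setContentHandler(&handler);
        parser->setErrorHandler(&handler);

        parser->parse(argv[1]);      // streams the document; no DOM is built

        std::cout << handler.elementCount << " elements\n";
        delete parser;
    }
    XMLPlatformUtils::Terminate();
    return 0;
}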