The trouble I'm running into is that, during parsing, Xerces maintains an in-memory buffer for the characters in this tremendous block of data, growing it when necessary via XMLBuffer::insureCapacity. The buffer gets so large that at some point the allocation in insureCapacity fails and parsing can't continue. What I'd like is a way to tell Xerces to buffer only up to some maximum amount of character data at a time before calling sendCharData (in IGXMLScanner::scanCharData), rather than waiting until it has everything.
As far as I can tell, there isn't currently a way to do this. I'd welcome feedback on how easily people think this might be implemented and whether it's reasonable to do, and (as a newbie to the Xerces codebase) I'd appreciate some assistance in implementing it.
I'm quite interested in this as well. I asked this question a few months ago and didn't get a response. I tend to work with very large XML streams that contain substantial chunks of Base64-encoded character data. All I'd really like is to be able to set the character buffer size to, say, 4k and not allow it to grow beyond that. This would obviously be for the SAX parser.
Sean
