Hi Thomas,
 
I've got the same problem, and it seems to me that it is *not* a problem 
of ulimit. In my case Xerces just bails out on XML files > 50-60 MB, with 
ulimit at ~2 GB.
Another effect is that the memory consumed by the DOM parser was approx. 7x 
the XML file size.
 
Well, and if the XML files just grow and grow ... 
My solution was to use the SAX API for processing these large files
(with some ..hm.. nice?.. helper classes, for more convenient access to parts
of the XML document).
 
ciao
martin
 
 
________________________________

From: thomak [mailto:[EMAIL PROTECTED]
Sent: Thu 2006-08-24 12:39
To: [email protected]
Subject: xercesc dom input structure size limit, RAM limit?




Hi,

I tried to read large XML files using DOM on HP, with Xerces version 2.7.
Up to 20 MB it works fine, but there is a limit that I hit when the
structure gets bigger, which usually also means the file is bigger.
Problem:
During reading, the parser (see below) quits and throws an exception.
It happens when the process reaches 674 MB of used RAM.
I tried it on different machines and it is always the same.
Is there a memory limit for Xerces?

Thanks and greetings

Thomas

  parser->setValidationScheme( XercesDOMParser::Val_Auto ) ;
  parser->setDoNamespaces( false ) ;
  parser->setDoValidation( true ) ;
  parser->setLoadExternalDTD( false ) ;
  ErrorHandler* errHandler = (ErrorHandler*) new HandlerBase() ;
  parser->setErrorHandler( errHandler ) ;

  try {
    INFO("XML file to parse[" << xmlFile << "]\n");
    parser->parse( xmlFile.c_str() ) ;
  }
  catch ( const XMLException& e ) {
    // parsing aborts here once the process hits the memory ceiling
  }
--
View this message in context: 
http://www.nabble.com/xercesc-dom-input-structure-size-limit%2C-RAM-limit--tf2157920.html#a5961473
Sent from the Xerces - C - Users forum at Nabble.com.


