There isn't a chunking mechanism in XMLBeans, but I'm pretty sure you can write a StAX or SAX wrapper (StAX should be easier) to split the big stream into smaller ones, and then load each small subtree into an XMLBean, one at a time.
Would this work for you? What kind of operations do you need to perform?
You could also split the file into two and process each part separately.
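The StAX-wrapper idea above can be sketched with plain JDK StAX: walk the stream, and each time a repeating element starts, copy just that subtree out as a small fragment. The `splitRecords` helper, the `record` element name, and the sample document are hypothetical illustrations, not XMLBeans API; the only XMLBeans call mentioned (`XmlObject.Factory.parse`) is real but commented out so the sketch stays JDK-only.

```java
import javax.xml.stream.*;
import java.io.Reader;
import java.io.StringReader;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;

public class ChunkedReader {

    /** Pulls each <recordTag> subtree out of a large stream as its own small XML fragment. */
    public static List<String> splitRecords(Reader src, String recordTag)
            throws XMLStreamException {
        XMLInputFactory inf = XMLInputFactory.newInstance();
        XMLOutputFactory outf = XMLOutputFactory.newInstance();
        XMLStreamReader r = inf.createXMLStreamReader(src);
        List<String> fragments = new ArrayList<>();
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && r.getLocalName().equals(recordTag)) {
                StringWriter sw = new StringWriter();
                XMLStreamWriter w = outf.createXMLStreamWriter(sw);
                copySubtree(r, w);          // consumes events up to the matching end tag
                w.close();
                fragments.add(sw.toString());
            }
        }
        return fragments;
    }

    /** Copies the current element and everything inside it (namespaces omitted for brevity). */
    private static void copySubtree(XMLStreamReader r, XMLStreamWriter w)
            throws XMLStreamException {
        int depth = 0;
        while (true) {
            switch (r.getEventType()) {
                case XMLStreamConstants.START_ELEMENT:
                    w.writeStartElement(r.getLocalName());
                    for (int i = 0; i < r.getAttributeCount(); i++)
                        w.writeAttribute(r.getAttributeLocalName(i), r.getAttributeValue(i));
                    depth++;
                    break;
                case XMLStreamConstants.CHARACTERS:
                    w.writeCharacters(r.getText());
                    break;
                case XMLStreamConstants.END_ELEMENT:
                    w.writeEndElement();
                    depth--;
                    break;
            }
            if (depth == 0) break;
            r.next();
        }
    }

    public static void main(String[] args) throws Exception {
        String xml = "<records><record id=\"1\"><name>a</name></record>"
                   + "<record id=\"2\"><name>b</name></record></records>";
        for (String frag : splitRecords(new StringReader(xml), "record")) {
            System.out.println(frag);
            // Each fragment is now small enough to hand to XMLBeans, e.g.:
            // XmlObject obj = XmlObject.Factory.parse(frag);
        }
    }
}
```

Only one record's worth of XML is ever materialized at a time, so memory use stays flat regardless of the file size.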
- Original Message -
From: John Harrison
To: user@xmlbeans.apache.org
Sent: Wednesday, August 09, 2006 1:34 AM
Subject: Re: Parsing large file
So, this means it's similar to XMLBeans? The file is stored in memory as is, undecoded.
- Original Message -
From: "Erik van Zijst" <[EMAIL PROTECTED]>
To: user@xmlbeans.apache.org
Sent: Wednesday, August 09, 2006 1:25 AM
Subject: Re: Parsing large file
Jimmy Zhang wrote:
VTD-XML supports a max of 2GB files, assuming there is enough physical memory.
Otherwise I'll look for an alternate parser, unless someone suggests that this can be achieved using XMLBeans!
Regards,
John
Jimmy Zhang wrote:
VTD-XML supports a max of 2GB files, assuming there is enough physical memory.
From vtd-xml.sourceforge.net:
"Its memory usage is typically between 1.3x~1.5x the size of the XML
document, with 1 being the XML itself."
Which implies storing it in memory as well?
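For scale, applying that published 1.3x~1.5x ratio to a document near VTD-XML's 2GB limit (the file size here is a hypothetical example, not a measured figure):

```java
public class MemEstimate {

    /** In-memory footprint range using VTD-XML's published 1.3x~1.5x ratio, in GiB. */
    static double[] estimateGiB(long fileBytes) {
        double gib = 1024.0 * 1024 * 1024;
        return new double[] { fileBytes * 1.3 / gib, fileBytes * 1.5 / gib };
    }

    public static void main(String[] args) {
        long twoGB = 2L * 1024 * 1024 * 1024;   // hypothetical 2GB document
        double[] est = estimateGiB(twoGB);
        System.out.printf("~%.1f GiB to ~%.1f GiB of memory%n", est[0], est[1]);
    }
}
```

So a 2GB document would need roughly 2.6 to 3.0 GiB of RAM on top of whatever the JVM itself uses.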
VTD-XML supports a max of 2GB files, assuming there is enough physical memory.
- Original Message -
From: John Harrison
To: user@xmlbeans.apache.org
Sent: Tuesday, August 08, 2006 11:34 PM
Subject: Parsing large file
Hi, I am investigating XML parsers for