So this means VTD-XML is similar to XMLBeans.
 
I understand why XMLBeans does this (loading everything into memory): it preserves the sequence so that it is easy to traverse back and forth.
 
But for me it is far too expensive to put everything in memory (500,000-750,000 objects).
I guess I have to look for an alternate parser, unless someone suggests that this can be achieved with XMLBeans!
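
For what it's worth, the kind of alternative I have in mind is a streaming pull parser. Below is a minimal sketch, assuming a StAX (JSR-173) implementation is on the classpath; the element name "object" and the command-line file path are just placeholders for my actual schema:

import java.io.FileInputStream;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StreamingParse {
    public static void main(String[] args) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        // Pull events off the stream one at a time; the whole
        // document is never materialised in memory.
        XMLStreamReader reader =
            factory.createXMLStreamReader(new FileInputStream(args[0]));
        long count = 0;
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "object".equals(reader.getLocalName())) {
                // "object" is a hypothetical element name: process one
                // object here, then let it go out of scope.
                count++;
            }
        }
        reader.close();
        System.out.println("objects seen: " + count);
    }
}

Each object would be handled and discarded as soon as it is read, so memory use stays flat no matter how large the file gets.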
 
Regards
John
 
----- Original Message ----
From: Erik van Zijst <[EMAIL PROTECTED]>
To: user@xmlbeans.apache.org
Sent: Wednesday, 9 August, 2006 9:25:36 AM
Subject: Re: Parsing large file

Jimmy Zhang wrote:
> VTD-XML supports a max of 2GB files, assuming there is enough physical
> memory....

From vtd-xml.sourceforge.net:

"Its memory usage is typically between 1.3x~1.5x the size of the XML
document, with 1 being the XML itself."

Which implies storing it in memory as well?

>     ----- Original Message -----
>     *From:* John Harrison <[EMAIL PROTECTED]>
>     *To:* user@xmlbeans.apache.org
>     *Sent:* Tuesday, August 08, 2006 11:34 PM
>     *Subject:* Parsing large file
>
>     Hi,
>       I am investigating XML parsers for parsing huge XML files (around 2GB each).
>     Each file can have around 500,000-750,000 objects.
>     Due to these numbers it is not practical to store everything in memory.
>     Does XMLBeans provide any facilities for parsing such huge files?
>
>
>     Is there anything that XMLBeans provides to parse a file in stages
>     from an IOStream?
>
>     Any input on this is much appreciated.
>
>     Regards
>     John.
