Most of the time, I try to solve such problems with the JVM option -Xmx.

Depending on the physical memory of the target platform, you could try
setting a value for -Xmx, e.g. -Xmx800m, or say 1 to 2 GB. This usually
works for me.
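
For example (just a sketch; the class and file names below are only
placeholders for however you actually launch your application):

    java -Xmx1024m com.example.MyXmlTool big-input.xml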

BTW, this problem is not specific to Xerces. It can happen with any
Java library that consumes a lot of memory at runtime.
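
If you want to double-check what heap limit the JVM has actually picked
up, you can print it at startup (a quick sanity check, not specific to
Xerces):

    // effective max heap in MB, as seen by the running JVM
    System.out.println("Max heap: "
        + Runtime.getRuntime().maxMemory() / (1024 * 1024) + " MB");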

On Fri, Feb 19, 2010 at 9:21 PM, Torsten <[email protected]> wrote:
> Hi,
>
> I'm using Xerces to parse XML files. However, the mechanism does not work 
> anymore if the XML file grows too large.
>
> For example, when I parse a ~50MB file, the memory consumption of the JVM 
> grows to ~400MB, and after about 10 minutes I get a StackOverflowError inside 
> Xerces / Xalan with no hint of the code position (see below).
>
> Is that a known behaviour, i.e. is it impossible to access files that large, 
> or can this be circumvented by changing the code?
>
> Thanks a lot!


-- 
Regards,
Mukul Gandhi

