Is it possible to apply your transformation using a sequential STX
(Streaming Transformations for XML) processor, as an alternative to Xalan?
Sequential processing is inherently limited, but it requires only one pass
through the XML file, without slurping the entire document into memory
first. STX is much like XSLT but supports only a subset of its features.
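
As far as I know, the usual STX implementation for Java is Joost, which
plugs into the standard JAXP/TrAX API, so the surrounding code barely
changes. A minimal sketch - the factory class name and the file names below
are assumptions you would have to check against the Joost release you
install:

    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class StxToSql {
        public static void main(String[] args) throws Exception {
            // Assumed Joost TrAX factory class; verify against your version.
            System.setProperty("javax.xml.transform.TransformerFactory",
                    "net.sf.joost.trax.TransformerFactoryImpl");

            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource("records-to-sql.stx"));

            // One pass over the input; only the current event is in memory,
            // so a 3-4 GB file never has to fit in the heap.
            t.transform(new StreamSource("huge-input.xml"),
                        new StreamResult("output.sql"));
        }
    }

You would of course have to rewrite the stylesheet itself in STX, since
XSLT stylesheets are not directly usable.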

-----Original Message-----
From: Mikael Jansson [mailto:[EMAIL PROTECTED]
Sent: Thursday, November 20, 2008 12:32 PM
To: xalan-j-users@xml.apache.org
Subject: Transforming huge XML-files - 3-4GB


Hi! I've already posted this, but it does not show up in the history and
there was no response so I'll post it again. Sorry if it's a double post...

I'm trying to transform a set of XML files into SQL code. It works just fine
with XSLT no matter what components I use - stream or SAX or whatever.

But when the files get too big, I run out of memory: OutOfMemoryError -
Java heap space.

If I use the incremental feature I can transform documents of about 200 MB;
without it, only a few MB.
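
For reference, I enable the incremental setting through the Xalan-specific
factory attribute, roughly like this (the attribute URI is from the Xalan
documentation):

    import javax.xml.transform.TransformerFactory;

    TransformerFactory tf = TransformerFactory.newInstance();
    // Xalan-specific: build the source tree incrementally while transforming.
    tf.setAttribute("http://xml.apache.org/xalan/features/incremental",
            Boolean.TRUE);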

Is there any way I can resolve this? I need to transform XML files of up to 4 GB.

I have tried to expand the Java heap with java -XmxXXXM, but it's not
sufficient. There has to be a way for the parser to process only one node at
a time, discarding the old ones.
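
For concreteness, the kind of one-pass, node-at-a-time processing I mean
looks roughly like this with the JDK's StAX pull parser (the record element
and id attribute are made up), though I would much rather keep my XSLT
stylesheet than hand-code the transform:

    import java.io.FileInputStream;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    public class StreamToSql {
        public static void main(String[] args) throws Exception {
            XMLStreamReader r = XMLInputFactory.newInstance()
                    .createXMLStreamReader(new FileInputStream("huge-input.xml"));
            while (r.hasNext()) {
                // Only the current event is held; earlier nodes are discarded.
                if (r.next() == XMLStreamConstants.START_ELEMENT
                        && "record".equals(r.getLocalName())) {
                    String id = r.getAttributeValue(null, "id");  // made-up attribute
                    System.out.println(
                            "INSERT INTO records (id) VALUES ('" + id + "');");
                }
            }
            r.close();
        }
    }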

I'm using the latest version of Xalan-Java, 2.7.1.


-- 
//Mikael Jansson

