Phil,
Do you have any profiling ability to see what objects are accumulating?
If you have streaming mode on (it is on by default), memory consumption should be almost
nil once the first row has been read.
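If you don't have a profiler handy, even a coarse check of heap growth between batches will tell you whether something is being retained per row or per batch. Here is a rough sketch using only the standard java.lang.Runtime calls; HeapWatch and logHeap are just names I made up for illustration, and the gc() call is only a hint to the VM, so treat the numbers as approximate:

public class HeapWatch {
    // Print the approximate heap in use; a steady climb after each batch
    // usually means objects are being retained somewhere.
    public static void logHeap(String label) {
        Runtime rt = Runtime.getRuntime();
        rt.gc();  // request a collection so successive readings are comparable
        long usedKb = (rt.totalMemory() - rt.freeMemory()) / 1024;
        System.out.println(label + ": " + usedKb + " KB in use");
    }
}

Calling something like HeapWatch.logHeap("after batch " + i) after each transform run would at least show the shape of the growth.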
Can you explain a bit more how the batches work? I.e., are you using one XSL file that does
all the work, or are you creating a transformer that runs multiple times?
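For reference, this is the "transformer that runs multiple times" shape I have in mind, in plain JAXP; the file names and the loop count are placeholders, not anything from your setup:

import javax.xml.transform.Templates;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class BatchedTransform {
    public static void main(String[] args) throws Exception {
        TransformerFactory factory = TransformerFactory.newInstance();
        // Compile the stylesheet once; Templates can be reused across runs.
        Templates templates = factory.newTemplates(new StreamSource("query.xsl"));

        for (int batch = 0; batch < 10; batch++) {
            // A fresh Transformer per batch, so nothing from the previous run is retained.
            Transformer transformer = templates.newTransformer();
            transformer.transform(new StreamSource("batch" + batch + ".xml"),
                                  new StreamResult("batch" + batch + ".out"));
        }
    }
}

Knowing which of the two you are doing would help narrow down where the memory is going.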
The other changes should be in CVS soon. I just started using Eclipse as my CVS interface, and
I made the critical mistake of checking out from cvs-public, which is read-only, so now I am in the
process of doing a little code surgery.
-JG
Phil Friedman wrote:
After a bit of testing, we've found that when transforming larger documents (more SQL batches), the JVM requires much more memory for the same SQL than under Xalan 2.0.1. We're using -Xmx1024m but we still get java.lang.OutOfMemoryError, while we can transform the same file (as well as files five times larger) in 2.0.1 and the JVM never grows beyond 128 MB. Any ideas?
Moraine Didier wrote:
Hmm ... you're right Phil, I had a problem with my previous code as well.
This one seems to work better ;-)
public void close(Object sqldoc) throws SQLException {
    if (DEBUG)
        System.out.println("Entering XConnection.close(" + sqldoc + ")");

    // Walk from the node-set passed in by the stylesheet down to the
    // underlying SQLDocument so its resources can be released.
    DTMNodeIterator dtmIter = (DTMNodeIterator) sqldoc;
    XNodeSet xNS = (XNodeSet) dtmIter.getDTMIterator();
    OneStepIterator iter = (OneStepIterator) xNS.getContainedIter();
    DTMManager aDTMManager = (DTMManager) iter.getDTMManager();
    SQLDocument sqlDoc = (SQLDocument) aDTMManager.getDTM(xNS.nextNode());
    sqlDoc.close();
}
--
--------------------------------------
John Gentilin
Eye Catching Solutions Inc.
18314 Carlwyn Drive
Castro Valley CA 94546
Contact Info [EMAIL PROTECTED]
CA Office 1-510-881-4821
NJ Office 1-732-422-4917