Hi,

I have a simple pipeline I'm using for some database -> xml -> xsl -> xml -> client processing. I start off with ESQL creating an XML document, then process it with some XSL, then output it using the XML serializer. I'm processing a lot of data, sometimes 100 MB at a time. Maybe Cocoon is not designed for this.
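For reference, the relevant part of my sitemap looks roughly like this (simplified; the match pattern and file names below are just placeholders, but the generator, transformer, and serializer types are the stock Cocoon 2.0 ones):

  <map:match pattern="report">
    <!-- XSP page that uses the esql logicsheet to pull the rows out of the database -->
    <map:generate type="serverpages" src="docs/report.xsp"/>
    <!-- this is the step that sends memory usage through the roof -->
    <map:transform type="xslt" src="stylesheets/report.xsl"/>
    <map:serialize type="xml"/>
  </map:match>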
Anyway, processing a 60 MB file causes my memory usage to jump by roughly 400 MB each time, and it doesn't go down. Eventually (pretty quickly, actually) my system grinds to a halt. It happens when I add in the XSL step, no matter which XSL stylesheet I use to perform the transformation. I am using non-caching stream and event pipelines and have incremental processing turned on; all the other settings are the defaults.

Two questions:

1) Any suggestions?

2) I've tried, unsuccessfully, to use a different XSL transformer (both Saxon 7 and the latest Xalan) in the hope that the problem lies with the Xalan 2.3.1 that ships with Cocoon 2.0.4. Can someone give me some tips on how to switch transformers?

My setup:

  RedHat 7.3 (kernel 2.4.18-24.7.x)
  Sun JDK 1.4.1_01
  Tomcat 4.1.18_LE_jdk1.4
  Cocoon 2.0.4

Tomcat and Cocoon are binary installs. JAVA_OPTS=-Xmx256M (though I've played with several settings, with no success).

Thanks in advance,
Simon
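P.S. In case it helps anyone spot what I'm doing wrong, this is the kind of change I've been attempting in cocoon.xconf to point the XSLT processor at Saxon 7 (the factory class name comes from the Saxon docs; I'm not certain the transformer-factory parameter is actually honoured in 2.0.4, which may be exactly my problem):

  <xslt-processor class="org.apache.cocoon.components.xslt.XSLTProcessorImpl">
    <parameter name="use-store" value="true"/>
    <!-- Saxon 7's TrAX factory; Saxon 6.5 would be com.icl.saxon.TransformerFactoryImpl -->
    <parameter name="transformer-factory" value="net.sf.saxon.TransformerFactoryImpl"/>
  </xslt-processor>

with saxon7.jar dropped into Cocoon's WEB-INF/lib. Or is the right way to force it globally through the standard JAXP system property instead, i.e. adding -Djavax.xml.transform.TransformerFactory=net.sf.saxon.TransformerFactoryImpl to JAVA_OPTS and making sure the Saxon jar gets picked up ahead of xalan.jar?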