Hi Henry, first let me thank you for your quick and kind response.

Following are my responses and comments:

> Are you able to reproduce the failure if you do not use the incremental 
> processing feature of Xalan-j?
Yes. Actually, if I don't use incremental processing, the memory runs out much faster.

> Are you able to reproduce it using the org.apache.xalan.xslt.Process command?
Yes. This was my first approach. Since I had this memory problem, I thought of 
writing my own program to transform the XML file, which ended up with the same 
result.

> "...it's hard to say for certain without knowing how many elements, 
> attributes and character data are packed into those 200000 lines"
Here is a sample line (a single line in the file; wrapped here for readability):
<EDR><AC>935103</AC><MSDN>967422999</MSDN><SRTATT1>APN6</SRTATT1><SRTATT2/><SRTATT3/><SEQ>28</SEQ>
<ES>268067400022999</ES><CSID>1</CSID><ETID>5</ETID><CDTM>20050417200306</CDTM><EDTM>20050414121051</EDTM>
<ECMNY>0</ECMNY><MFID>3068959</MFID><RNUM>151</RNUM><CCID/><TID>39</TID><E1>967422999</E1><E2>PPP</E2>
<E3>268067400022999</E3><E4>mmsc.tmn.pt</E4><E5>QOSC</E5><E6>30</E6><E7>N - Normal | N - No Time Rate</E7>
<E8>No Time Rate</E8><E9>APN4</E9><E10>APN6</E10><E11>13</E11><E12>12912</E12><E13>820</E13><E14>13732</E14>
<E15>0</E15><E16>2149651315</E16><E17>195.8.28.239</E17><E18>7977</E18><E19>0</E19><E20>0</E20><E21>1</E21>
<E22>0</E22><E23/><E24>20000</E24><LP/><CCMNY/><ICMNY/><ECMNY/><FN/><IMCMNY/><IMCC/><RDUT/><PDCMNY/>
<HPDID/><TOID/><UCID/><UCID/><OACT/><RN/><E25/><E26/><E27/><E28/><E29/><E30/><E31/><E32/><E33/><E34/>
<E35/><E36/><ER>01560F00000029A</ER><MB>F</MB><HPPS/><RCLID>-1</RCLID><RCID>5</RCID><DD/><PER/><F2/></EDR>
Here is the XSL I'm using:
=====================
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:output method="text"/>
<xsl:template match="/EDR[child::ETID='5']">
    <xsl:value-of select="AC"/>|<xsl:value-of select="EV"/>||||<xsl:value-of
        select="E22"/>||<xsl:value-of select="E2"/>|<xsl:value-of
        select="E6"/>|<xsl:value-of select="EDTM"/>|<xsl:value-of
        select="E11"/>||<xsl:value-of select="E18"/>||<xsl:value-of
        select="E9"/>|<xsl:value-of select="E5"/>|||<xsl:value-of
        select="ceiling(E12 div 1000)"/>|<xsl:value-of
        select="ceiling(E13 div 1000)"/>|0|<xsl:value-of
        select="E9"/>|0|0|<xsl:value-of select="E19 div E20"/>|0|<xsl:value-of
        select="ECMNY div 100000"/>|0|<xsl:value-of select="HPDID"/>||<xsl:value-of
        select="HPDID"/>|<xsl:text>&#10;</xsl:text>
</xsl:template>
</xsl:stylesheet>
=====================
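As a sanity check on the ceiling/div arithmetic in the stylesheet, the same XPath 1.0 expressions can be evaluated standalone with the JDK's javax.xml.xpath API. This is only a sketch; the class name and the dummy <r/> context document are illustration, using the E12/E13 values from the sample line above:

```java
import java.io.StringReader;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.xml.sax.InputSource;

public class XPathCheck {
    // Evaluate a context-free XPath 1.0 expression and return its string value.
    static String eval(String expr) throws Exception {
        XPath xpath = XPathFactory.newInstance().newXPath();
        // The expression references no nodes, so any document works as context.
        return xpath.evaluate(expr, new InputSource(new StringReader("<r/>")));
    }

    public static void main(String[] args) throws Exception {
        // ceiling(E12 div 1000) with E12 = 12912 from the sample record
        System.out.println(eval("ceiling(12912 div 1000)")); // prints 13
        // ceiling(E13 div 1000) with E13 = 820
        System.out.println(eval("ceiling(820 div 1000)"));   // prints 1
    }
}
```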

Using this command line:

java -cp xalan.jar org.apache.xalan.xslt.Process -in bigFile.xml -xsl GPRS2TX.xsl

it starts out consuming about 64MB and keeps growing until either it finishes or 
the memory runs out, which on my current system (Sun SPARC) happens at about 
1.3GB.
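As an aside, the heap ceiling the JVM actually allows can be confirmed from inside the process with the standard Runtime API. A minimal sketch (the class name is just illustration):

```java
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // Maximum heap the JVM will attempt to use (the -Xmx ceiling)
        System.out.println("max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
        // Heap currently committed to the process
        System.out.println("total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
        // Unused portion of the committed heap
        System.out.println("free heap:  " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}
```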


Best regards
Higino Silva

-----Original Message-----
From: Henry Zongaro [mailto:[EMAIL PROTECTED] 
Sent: segunda-feira, 2 de Maio de 2005 19:11
To: [EMAIL PROTECTED]
Cc: Higino Silva, Engº
Subject: Fw: Help using the Xalan mailing lists

[Forwarding for Higino Silva, who is having difficulty posting to 
xalan-j-users.]

Hi, Higino.

     I don't see anything wrong with your program.  I do not believe that this 
is a known problem.

     Are you able to reproduce the failure if you do not use the incremental 
processing feature of Xalan-j?  Are you able to reproduce it using the 
org.apache.xalan.xslt.Process command?  200000 lines doesn't sound like it 
should consume 1.34 GB of memory, but it's hard to say for certain without 
knowing how many elements, attributes and character data are packed into those 
200000 lines.

Thanks,

Henry
------------------------------------------------------------------
Henry Zongaro      Xalan development
IBM SWS Toronto Lab   T/L 969-6044;  Phone +1 905 413-6044
mailto:[EMAIL PROTECTED]
----- Forwarded by Henry Zongaro/Toronto/IBM on 2005-05-02 01:39 PM -----

Higino Silva, Engº <[EMAIL PROTECTED]>
2005-05-02 11:59 AM

To
Henry Zongaro/Toronto/[EMAIL PROTECTED]
cc

Subject
Help using the Xalan mailing lists




Hi,
I'm using Xalan to convert big XML files. Here is a simplified version of 
the code I use: 
    StreamResult strResult = null; // Shut up compiler

    if (conv.outFileName == null)
        strResult = new StreamResult(System.out);
    else
        strResult = new StreamResult(new File(conv.outFileName));
    strResult.setSystemId(conv.outFileName);

    // Use a Transformer for output.
    // Use the TransformerFactory to instantiate a Transformer that will
    // work with the stylesheet you specify. This method call also
    // processes the stylesheet into a compiled Templates object.
    TransformerFactory tFactory = TransformerFactory.newInstance();
    tFactory.setAttribute(
        "http://xml.apache.org/xalan/features/incremental",
        Boolean.TRUE);
    Transformer transformer =
        tFactory.newTransformer(new StreamSource(xslFileName));

    BufferedReader br = new BufferedReader(new FileReader(conv.inputFileName));
    InputSource inputSource = new InputSource(br);
    XMLReader gsmReader = new UBEEFileReader();

    // Use the parser as a SAX source for input.
    SAXSource source = new SAXSource(gsmReader, inputSource);
    transformer.transform(source, strResult);

When I apply this code to transform an input file with, let's say, 200,000 lines, 
the process starts out consuming around 64MB of memory and keeps growing to 
1.34GB, until it crashes with an out-of-memory error.
Can someone explain what I'm doing wrong, or how I can prevent this memory 
creep? 
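In case it helps to reproduce the shape of this pipeline without my custom reader, here is a minimal, self-contained sketch using only JDK classes. The stylesheet is trimmed to two fields, UBEEFileReader is replaced by the default parser via StreamSource, and the class and field choices are just illustration:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class MiniTransform {
    // A trimmed-down stylesheet in the style of GPRS2TX.xsl: text output,
    // a template matching EDR records with ETID = 5, pipe-separated fields.
    static final String XSL =
        "<xsl:stylesheet version='1.0' "
      + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "<xsl:output method='text'/>"
      + "<xsl:template match=\"/EDR[ETID='5']\">"
      + "<xsl:value-of select='AC'/>|<xsl:value-of select='E2'/>"
      + "</xsl:template>"
      + "</xsl:stylesheet>";

    static String transform(String xml) throws Exception {
        TransformerFactory tFactory = TransformerFactory.newInstance();
        Transformer transformer =
            tFactory.newTransformer(new StreamSource(new StringReader(XSL)));
        StringWriter out = new StringWriter();
        transformer.transform(new StreamSource(new StringReader(xml)),
                              new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // One record in the shape of the sample line above
        String xml = "<EDR><AC>935103</AC><ETID>5</ETID><E2>PPP</E2></EDR>";
        System.out.println(transform(xml)); // prints 935103|PPP
    }
}
```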


Best regards
Higino Silva 
