Yep, I've experienced it too. The only workaround I can think of is to increase the Xmx parameter further.
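For what it's worth, a minimal sketch of bumping the heap before re-running the goal (the 1024m value is just an example; tune it to what your machine can spare):

```shell
# Raise the JVM heap ceiling for Maven (example value, not a recommendation),
# then re-run the failing goal, e.g.: maven xdoc
export MAVEN_OPTS=-Xmx1024m
echo "$MAVEN_OPTS"
```

Of course this only delays the failure if memory use really grows with file count.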
I hit it not with CVS but with the Checkstyle report... 114,450 errors generated quite a report :). Not very practical... my browser (Mozilla rules!) never managed to open the 55 MB generated HTML file. The rendering engine just had too much to chew on...
I'm not sure whether it is actually leaking memory... but it definitely needs A LOT to run some reports, and this is, of course, directly linked to the size of the XML file used as source by the XSL engine.
Has anybody thought about splitting these big reports into multiple pages, or is that currently in the works? If not, I may contribute my solution once I get it working.
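To illustrate the splitting idea: a rough sketch of transforming each source file to its own output page with plain JAXP, instead of feeding one huge document to the XSL engine. This is not what xdoc actually does internally — just the general shape I have in mind; `transformAll` and all paths are made up for the example:

```java
import javax.xml.transform.Templates;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.File;

public class SplitTransform {
    /** Transform each input file to its own output page instead of one giant report. */
    static void transformAll(File stylesheet, File[] inputs, File outDir) throws Exception {
        TransformerFactory factory = TransformerFactory.newInstance();
        // Compile the stylesheet once up front; a Templates object is reusable.
        Templates templates = factory.newTemplates(new StreamSource(stylesheet));
        outDir.mkdirs();
        for (File in : inputs) {
            // A fresh Transformer per file, so per-document state can be
            // garbage-collected between files rather than accumulating.
            Transformer t = templates.newTransformer();
            File out = new File(outDir, in.getName().replaceAll("\\.xml$", ".html"));
            t.transform(new StreamSource(in), new StreamResult(out));
        }
    }
}
```

Each output page stays small, and peak memory is bounded by the largest single input rather than the whole report.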
Eric.
Joshua Sherwood wrote:
I have a question about xdoc and performance. I've been working with StatCvs to generate my commit log files, and it produces approximately 1000 commit_log*.xml files. When xdoc starts its transformation it gets slower and slower, and after approximately 400 files I run out of memory. If I use MAVEN_OPTS=-Xmx512m I get to about 700 files before running out of memory. None of the XML files exceeds 100 KB in size.
It seems that xdoc is leaking memory, or the XML/XSL parsers are. Instead of iterating over the entire directory at once, would memory behavior be better if the Jelly code looped over the files one at a time? Has anyone else experienced this?
- J
--------------------------------------------------------------------- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
