Adrian Sobotta wrote:

<snip/>

My problem, however, is that it slowly eats up more and more memory. So after 500 PDFs or so it has used more than the 500 MB that I'm allowing the JVM to use, and it dies with the following error:



Exception in thread "main" java.lang.OutOfMemoryError

        <<no stack trace available>>


Have you read through the advice on memory:

http://xml.apache.org/fop/running.html#memory



Has anyone else successfully generated many PDFs in a batch like this? And if so, did you have to do something to keep the memory from creeping upward? Does anyone have any suggestions about what I can do to make sure each iteration of the loop, and hence each PDF, doesn't leave things behind in memory?

Yes, we can successfully generate many thousands of documents without memory blowing up. If memory is gradually increasing, that does point to a memory leak somewhere. Check your own code thoroughly for unclosed output streams, database connections, and other objects you might be holding onto.
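As a minimal sketch of that advice: make sure each iteration's OutputStream is closed before the next document is started, e.g. with try-with-resources. The renderDocument method below is a hypothetical stand-in for whatever actually writes the PDF (in a real FOP batch that would be your transformer/driver call); the point is only the per-iteration close:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class BatchDemo {
    // Hypothetical per-document render step; stands in for the real
    // FOP rendering call that writes the PDF to the stream.
    static void renderDocument(int i, OutputStream out) throws IOException {
        out.write(("document " + i).getBytes());
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("batch");
        for (int i = 0; i < 3; i++) {
            Path pdf = dir.resolve("doc" + i + ".pdf");
            // try-with-resources guarantees the stream is closed (and its
            // buffer released) every iteration, even if rendering throws.
            // Holding streams open across iterations is a classic source
            // of the slow memory creep described above.
            try (OutputStream out = Files.newOutputStream(pdf)) {
                renderDocument(i, out);
            }
        }
        System.out.println("generated 3 files");
    }
}
```

The same pattern applies to database connections and any other per-document resource: scope them to the loop body so nothing accumulates across iterations.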


Chris



---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
