You can read the following two related articles from the archives:

http://marc.theaimsgroup.com/?l=fop-dev&m=100034658526437&w=2
http://marc.theaimsgroup.com/?l=fop-dev&m=100482704631972&w=2

Your XML files sound huge, but if I remember correctly, Mark Lillywhite has
had success with very large XML files.
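
For what it's worth, if I remember those threads right, the gist is that FOP
only frees its internal structures at the end of each fo:page-sequence, so a
document written as many smaller page-sequences (one per chapter or report
section, say) keeps the heap bounded, while one enormous page-sequence does
not. A rough sketch of the shape I mean (the page master and the chapter
breakdown here are made up, not taken from your stylesheet; attribute names
follow the XSL 1.0 spec, and an older FOP may still want master-name on the
page-sequence):

<fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
  <fo:layout-master-set>
    <fo:simple-page-master master-name="A4"
                           page-height="29.7cm" page-width="21cm">
      <fo:region-body/>
    </fo:simple-page-master>
  </fo:layout-master-set>

  <!-- one fo:page-sequence per chapter instead of a single huge one,
       so each sequence can be rendered and discarded in turn -->
  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Chapter 1 ...</fo:block>
    </fo:flow>
  </fo:page-sequence>

  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Chapter 2 ...</fo:block>
    </fo:flow>
  </fo:page-sequence>
</fo:root>

Beyond that, about the only generic knob is raising the JVM heap ceiling
with -Xmx, which obviously won't scale to inputs that would need 2GB.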

Regards,

Mike

-----Original Message-----
From: Maring, Steve [mailto:[EMAIL PROTECTED]]
Sent: 9 November 2001 15:29
To: FOP dev list (E-mail)
Subject: FOP memory usage


I'm using fop-0.20.1.

I started with a 650KB XML file that I transformed into a 4MB XSL:FO file.
Running this file through FOP to generate a PDF used about 90MB of memory.

Initial heap size: 807Kb
Current heap size: 91637Kb
Total memory used: 90829Kb
  Memory use is indicative; no GC was performed
  These figures should not be used comparatively
Total time used: 31265ms
Pages rendered: 17
Avg render time: 1839ms/page


I have XML files in excess of 15MB that need to be converted to PDF.
Assuming that memory use scales linearly with input size, the JVM running
the FOP process would need in excess of 2GB of heap for such a file to
avoid the dreaded java.lang.OutOfMemoryError.
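
For reference, the linear extrapolation works out to roughly:

  15MB / 650KB  ->  ~23 times the original input
  23 x 90MB     ->  ~2.1GB of heap

so the 2GB figure follows directly from that assumption.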

Are there any optimizations that can be done to FOP?

Thanks.
-Steve Maring
Nielsen Media Research

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, email: [EMAIL PROTECTED]
