Jeremias Maerki wrote:
> > And my next problem is to find a way to force memory recycling after this long and hefty FOP processing, but until further investigated this is OT ;-)

> You probably didn't get my hint earlier but with the new image loading
> framework you should actually get away with lower memory settings. In my
> tests I have been able to produce PDF with little text and many images
> with 40MB of VM memory which wasn't possible with 0.94 and earlier.

Well, I got the hint, but it seems to contradict what I observe.
So, to look at the picture from a bit higher up:
- the XSL-FO transformation + FOP generation now works OK.
- this generates 20-30 documents (chapters) for a grand total of about 150 MB, to be bound together by iText.
- the source XML is 1.5 MB
- 1011 PNG images for a total of 151 MB; the largest image is 715 KB.

Now the figures:
- the XML -> XSL-FO transformation + FOP generation takes 15 minutes on a pretty decent Dell server (running Debian 4.0) with as much physical RAM as possible (it's a staging server for several customers); the JVM gets 2000 MB (which is BTW the absolute maximum on this processor/server/OS/JVM architecture)
- only one instance of FOP is launched (one document generation at a time)
- the next step in the publication process (opening the 150 MB of output with iText to add the bookmarks) fails immediately (at file open), saying it cannot allocate memory

If I investigate memory usage using Runtime.getRuntime().freeMemory() and log the figures with log4j, these are the numbers I get:
- before XSLT + FOP: 1900 MB free / 2000 MB total
- end of XSLT + FOP: 241 MB free
- I set the FopFactory instance to null as a desperate hint to the GC that the FOP objects could/should be recycled, and force garbage collection using System.gc() [OK, in an application server this is a brute-force approach, but I could not see a more clever maneuver ATM]
- 350 MB free / 2000 MB total
- bind all chapters with iText
- 250 MB free
- force another GC
- 350 MB free again (so the binding operation has no lasting effect on the available memory).
- the next iText step still fails.
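For reference, the figures above come from something like the following helper (a minimal sketch, with hypothetical names). One caveat worth noting: Runtime.freeMemory() is measured against the heap the JVM has currently claimed (totalMemory()), not against the -Xmx ceiling (maxMemory()), so "free" figures alone can be misleading:

```java
// Minimal sketch of the memory logging described above (class and method
// names are made up for illustration).
public class MemLog {
    private static final long MB = 1024 * 1024;

    static String snapshot(String label) {
        Runtime rt = Runtime.getRuntime();
        long total = rt.totalMemory() / MB; // heap currently claimed from the OS
        long free  = rt.freeMemory()  / MB; // unused part of that claimed heap
        long max   = rt.maxMemory()   / MB; // the -Xmx ceiling
        long used  = total - free;          // live objects + not-yet-collected garbage
        return label + ": used=" + used + "MB free=" + free
                + "MB total=" + total + "MB max=" + max + "MB";
    }

    public static void main(String[] args) {
        System.out.println(snapshot("before XSLT + FOP"));
        System.out.println(snapshot("end of XSLT + FOP"));
    }
}
```

"used" (total minus free) is usually the more telling number than "free" when hunting a leak, since the JVM may simply not have grown the heap yet.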

Now I don't take Runtime.getXXXMemory() for gospel, but at least it "looks like" the Xalan + FOP subsystem hogs 1500 MB of RAM which I cannot recover. I've booked the team member who is competent with a profiler for next week, but I must say that at the moment I'm still stuck :-(
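The "set to null and force GC" step described above can be sketched as follows (a stand-in object replaces the real FopFactory here, since the point is only about references). Note that nulling the field only helps if nothing else still holds a strong reference to the FOP objects, and System.gc() is merely a hint, never a command:

```java
// Sketch of the brute-force cleanup step described above. A byte array
// stands in for the real FopFactory; the mechanics are the same.
public class Cleanup {
    static Object fopFactory = new byte[16 * 1024 * 1024]; // stand-in for the real factory

    static void releaseAndCollect() {
        fopFactory = null;        // drop the last strong reference we control
        System.gc();              // hint to the collector (no guarantee it runs)
        System.runFinalization(); // hint to run pending finalizers
    }

    public static void main(String[] args) {
        releaseAndCollect();
        System.out.println("references dropped, GC requested");
    }
}
```

If memory still cannot be recovered after this, a profiler heap dump showing what keeps the FOP objects reachable is indeed the right next step.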

Of course I've done my homework and read the f...riendly manual before daring to ask.
Did I miss any important indication?
