Hi list,

I have been using FOP for some time now to produce small documents. Now I 
have to generate medium-size (50-500 page) ones and, to my surprise, I have 
a memory footprint problem.

I have split the document into as many page-sequences as I could to reduce 
FOP's memory needs, as this list suggests, and yet for an 80-page document 
with 7 long page-sequences of about 11 pages each, the JVM needs 115 MB...
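
Roughly, my FO has this structure (simplified; the master name, ids and 
measurements are just examples):

  <fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
    <fo:layout-master-set>
      <fo:simple-page-master master-name="A4"
          page-height="29.7cm" page-width="21cm" margin="2cm">
        <fo:region-body/>
      </fo:simple-page-master>
    </fo:layout-master-set>

    <!-- the TOC page-sequence comes first (see below) -->

    <!-- then one page-sequence per chapter, about 11 pages each -->
    <fo:page-sequence master-reference="A4">
      <fo:flow flow-name="xsl-region-body">
        <fo:block id="chap1">Chapter 1 content ...</fo:block>
      </fo:flow>
    </fo:page-sequence>
    <!-- ... six more page-sequences like this one ... -->
  </fo:root>
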
I have a TOC in an earlier page-sequence that uses page-number-citation to 
refer to the start of each of the big page-sequences. Maybe that is where 
my memory goes mad...
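
The TOC looks roughly like this (one entry per chapter; the ids match the 
chapter blocks above):

  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block text-align-last="justify">Chapter 1
        <fo:leader leader-pattern="dots"/>
        <fo:page-number-citation ref-id="chap1"/>
      </fo:block>
      <!-- one block like this per chapter -->
    </fo:flow>
  </fo:page-sequence>

Since the TOC comes before the chapters, every citation is a forward 
reference, so I suppose FOP cannot resolve them until the cited 
page-sequences have been rendered.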

Is there some option I am not using?

I thought there was a buffering option that could do the trick, but -buf 
buff.file did nothing.
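
For reference, this is roughly how I run it (file names are just examples; 
I already raise the JVM heap with -Xmx to get the run to finish):

  fop.sh -buf buff.file -fo doc.fo -pdf doc.pdf
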

I am sure FOP can do better than this.

Could anyone point me to some info about this, please?

Thanks

Cyril Rognon

