Hi Adrian,

yes, seems like the whole content is wrapped in one big
fo:page-sequence. So there is room for improvement :)

I will write to the DocBook XSL mailing list to get some tips on how I
can split it into more fine-grained fo:page-sequence chunks.
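For reference, here is a minimal sketch of what the restructured FO could look like: one fo:page-sequence per chapter instead of a single sequence for the whole document, so FOP can render and release each sequence independently. The master name and block contents are made up for illustration:

```xml
<fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
  <fo:layout-master-set>
    <fo:simple-page-master master-name="A4"
        page-height="29.7cm" page-width="21cm" margin="2cm">
      <fo:region-body/>
    </fo:simple-page-master>
  </fo:layout-master-set>
  <!-- One page-sequence per chapter: FOP finishes laying out and
       serializing each sequence before starting the next, keeping
       only one sequence's area tree in memory at a time -->
  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Chapter 1 content ...</fo:block>
    </fo:flow>
  </fo:page-sequence>
  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Chapter 2 content ...</fo:block>
    </fo:flow>
  </fo:page-sequence>
</fo:root>
```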

Cheers and thanks,
Tobias

On Tue, May 5, 2009 at 5:56 PM, Adrian Cumiskey <[email protected]> wrote:
> Hi Tobias,
>
> Do you use many page sequences in your FO documents?  FOP processes
> documents by page sequence so if you can section the document contents into
> lots of small page sequence segments this can help a lot with memory
> consumption.
>
> Good luck!
>
> Adrian.
>
> Tobias Anstett [k15t.com] wrote:
>>
>> Hi,
>>
>> We are using Apache FOP inside a web-app deployed within Apache
>> Tomcat. When the FO documents get too big (~20MB) we get out-of-memory
>> exceptions (Java heap space overflows) while transforming FO to
>> PDF. We already assigned more than 1024m to the JVM but still cannot
>> get rid of this problem.
>>
>> I already checked the
>> http://xmlgraphics.apache.org/fop/0.94/embedding.html#performance
>> documentation but it seems that we did everything as suggested. Any
>> ideas?
>>
>> Cheers,
>> Tobias
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: [email protected]
>> For additional commands, e-mail: [email protected]
>>
>>
>>
>
>
>
>

