Thanks, Glenn, and thanks, Luis!

A combination of -nocs and -Xmx4096m made the build succeed. The resulting PDF is 5742 pages, which might explain the issues I had. My fop wrapper script looks like this:

#!/bin/sh
# Give the JVM a 4 GiB heap; the stock fop launcher picks up FOP_OPTS.
export FOP_OPTS=-Xmx4096m
# -nocs disables complex-script support, which cuts memory use.
exec /usr/bin/fop -nocs "$@"

BTW, is the -nocs option new to FOP 1.1? It doesn't seem to work with FOP 1.0.

Again, thanks, and have a good day!

Cheers,

Stefan



On 25.10.2012 00:44, Glenn Adams wrote:
You might also try disabling complex script support (see [1]) if you don't require the complex text path. Complex script support consumes somewhat more memory when enabled, and it is enabled by default in 1.1.

[1] http://xmlgraphics.apache.org/fop/1.1/complexscripts.html#Disabling+complex+scripts
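
If you'd rather not touch the command line, the page at [1] also describes a configuration-file switch. A sketch of what that looks like in fop.xconf (the element name and layout here are from memory, so double-check against [1]):

```xml
<!-- fop.xconf sketch: disable complex-script support globally -->
<fop version="1.0">
  <complex-scripts disabled="true"/>
</fop>
```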

On Thu, Oct 25, 2012 at 6:03 AM, Luis Bernardo <lmpmberna...@gmail.com> wrote:

You can control how much memory the JVM uses with the -Xmx flag, so I'd try that first.
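
For example, something like this (assuming the stock fop launcher script reads FOP_OPTS and passes it to the JVM; the input/output file names are made up):

```shell
#!/bin/sh
# Hand the JVM a 4 GiB heap via FOP_OPTS before rendering.
export FOP_OPTS=-Xmx4096m
echo "$FOP_OPTS"                    # prints -Xmx4096m
# fop -fo book.fo -pdf book.pdf    # hypothetical input/output names
```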

The only situation where I know FOP runs out of memory (also on a machine with 8 GB) is when you have a very long paragraph (and I mean a paragraph with 200K+ words). The line-breaking algorithm then has to hold the full paragraph in memory while deciding on the optimal break points, and that can easily use all available memory. Do you know if you have anything like that in your book? Having thousands of pages by itself should not be an issue. It really depends on the content and on how the line breaks happen (if you insert line breaks, FOP uses far less memory than if you don't).
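
A quick way to check for such paragraphs is to look for the longest fo:block in the generated FO file. A rough sketch (assumptions: paragraphs map to fo:block elements, GNU-style awk with a multi-character RS is available, and "book.fo" is a made-up name, fabricated here as a tiny sample):

```shell
#!/bin/sh
# Fabricate a tiny sample FO file so the pipeline below has input.
cat > book.fo <<'EOF'
<fo:block>one two three</fo:block>
<fo:block>just one word word word word word</fo:block>
EOF

# Word count of the longest fo:block: join lines, split records on the
# closing tag, strip remaining markup, track the maximum word count.
tr '\n' ' ' < book.fo \
  | awk -v RS='</fo:block>' '{ gsub(/<[^>]*>/, ""); if (NF > max) max = NF } END { print max }'
# prints 7
```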


On 10/24/12 7:49 PM, Stefan Hinz wrote:
I like FOP 1.1 a lot: unlike previous versions, it tells you which page it's processing, which makes debugging easier and gives you that warm fuzzy feeling of being somewhat in control of things. :-)

However, with really big books, I'm hitting a wall, like this:

Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener processEvent
INFO: Rendered page #2630.
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded

That is, FOP stops at about 2600 pages with an out-of-memory error. On Stack Overflow (http://preview.tinyurl.com/94qute5), there's an explanation of why this happens:

"This message means that for some reason the garbage collector is taking an excessive amount of time (by default 98% of all CPU time of the process) and recovers very little memory in each run (by default 2% of the heap).
This effectively means that your program stops making progress and is busy running only the garbage collector all the time."

Does this mean it's a FOP 1.1 bug, or is there anything I can do to give it/Java more memory and prevent the failure?

The error happens on an 8 GB RAM, 4-core machine. When FOP died, about 2 GB of RAM were still free.



---------------------------------------------------------------------
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org




-- 
Cheers,

Stefan Hinz <stefan.h...@oracle.com>, MySQL Documentation Manager

Phone: +49-30-82702940, Fax: +49-30-82702941, http://dev.mysql.com/doc

ORACLE Deutschland B.V.&  Co. KG
Registered Office: Riesstr. 25, 80992 Muenchen, Germany
Commercial Register: Local Court Of Munich, HRA 95603
Managing Director: Jürgen Kunz

General Partner: ORACLE Deutschland Verwaltung B.V.
Hertogswetering 163/167, 3543 AS Utrecht, Niederlande
Register Of Chamber Of Commerce: Midden-Niederlande, No. 30143697
Managing Directors: Alexander van der Ven, Astrid Kepper, Val Maher