Philippe Pithon wrote:
> By using more fo:page-sequence elements, will I have no more
> problems with crashes or slowness? Even if my XML file is 5 MB?
It depends. FOP is already capable of streaming processing:
after a page has been rendered, the source and intermediate
data necessary for rendering it are discarded.
Of course, these attempts at conserving resources are
thwarted by forward references in the source, in
particular TOCs at the beginning of the document and the
ubiquitous "page N of TOTAL". Obviously, FOP has to re-render
pages containing such constructs after it has resolved the
references, and for simplicity all pages in between are held
in memory. Rendering the document in several page
sequences won't help you if you still refer to the last page
from somewhere near the beginning.
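For illustration, the "page N of TOTAL" construct is usually built
with a page number citation pointing at an id on the very last block;
the id value below is just an example, but any id that is only laid
out at the end of the document has the same effect:

```xml
<!-- Somewhere near the beginning of the document: -->
<fo:block>
  Page <fo:page-number/> of
  <!-- Forward reference: FOP cannot finish these pages until the
       block with id="last-block" has been laid out, so all pages
       in between stay in memory -->
  <fo:page-number-citation ref-id="last-block"/>
</fo:block>

<!-- ... and at the very end of the document: -->
<fo:block id="last-block"/>
```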
If you have problems rendering large documents, check for
forward references first, remove them, and see whether
things improve. Dividing your document into several
page sequences is only a second step, but it is also helpful
in many situations.
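As a sketch, dividing a document means emitting one fo:page-sequence
per chapter (or other natural unit) instead of one big sequence; the
master name and chapter contents here are illustrative:

```xml
<fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
  <fo:layout-master-set>
    <fo:simple-page-master master-name="A4"
        page-height="29.7cm" page-width="21cm" margin="2cm">
      <fo:region-body/>
    </fo:simple-page-master>
  </fo:layout-master-set>
  <!-- One fo:page-sequence per chapter: FOP can discard the data
       for each sequence once its pages are rendered, provided no
       forward references cross sequence boundaries -->
  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Chapter 1 ...</fo:block>
    </fo:flow>
  </fo:page-sequence>
  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Chapter 2 ...</fo:block>
    </fo:flow>
  </fo:page-sequence>
</fo:root>
```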
As a last note, the size of the XML or FO source can be
largely uncorrelated with FOP memory problems, for other
reasons: for example, I once had a 40MB FO document which
consisted mostly of ignorable or collapsible white space
(it rendered to a 3MB PDF).
J.Pietschmann