I expect the original author split up the 1700-page document so that it became
1700 documents of one page each. This is doable if you know where the page
breaks are, but not if you have to predict them yourself.
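For illustration only (this sketch is not from the thread; the master name and
A4 page geometry are assumptions, and older FOP builds may expect master-name
where the XSL-FO 1.0 spec says master-reference), one such single-page chunk
could be a complete little document of its own:

  <?xml version="1.0" encoding="UTF-8"?>
  <fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
    <fo:layout-master-set>
      <!-- assumed A4 page geometry -->
      <fo:simple-page-master master-name="page"
          page-height="29.7cm" page-width="21cm">
        <fo:region-body/>
      </fo:simple-page-master>
    </fo:layout-master-set>
    <fo:page-sequence master-reference="page">
      <fo:flow flow-name="xsl-region-body">
        <!-- the content of exactly one page goes here -->
        <fo:block>Page content</fo:block>
      </fo:flow>
    </fo:page-sequence>
  </fo:root>

Each file is then run through FOP separately, so no single run ever has to
hold more than one page in memory.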
Alex
--- Cyril Rognon <[EMAIL PROTECTED]> wrote:
Do you mean you achieved this with the FOP 0.20.2RC distribution?
I am very much interested by your answer :)
Cyril
At 12:30 07/01/2002 -0800, you wrote:
>I had the same problem. I was able to break the document into one-page
>page-sequences and have been able to produce a 1700+ page document.
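As a sketch of that technique (again illustrative, not code from the thread):
instead of one huge fo:page-sequence holding the whole body, the content is
emitted as many small page-sequences, one per page-sized chunk. Each
fo:page-sequence starts on a fresh page, and the usual rationale is that the
formatter can release a sequence once it has been rendered.

  <!-- inside fo:root, after the fo:layout-master-set -->
  <fo:page-sequence master-reference="page">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Content for page 1</fo:block>
    </fo:flow>
  </fo:page-sequence>
  <fo:page-sequence master-reference="page">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Content for page 2</fo:block>
    </fo:flow>
  </fo:page-sequence>
  <!-- ...and so on, one sequence per page-sized chunk -->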
Sent: Monday, January 07, 2002 7:17 AM
To: [EMAIL PROTECTED]
Subject: Re: How can I optimize memory consumption?
Does it really work?
I have tried with my own documents, but it seems the memory usage still
depends on the total number of pages.
Maybe I use something that triggers a memory bug. I would be glad to hear
if it works for you, Luigi.
Cyril
At 06:52 07/01/2002 -0800, you wrote:
Try breaking your document into multiple page-sequences.
Luigi Savini wrote:
>
> I need to produce a very large PDF document (about 1300 pages!), no images,
> just plain text.
> I set the JVM memory parameters (-Xms and -Xmx), but I can't process this
> document anyway.
>
> Did anyone try to modify so
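For reference, the -Xms and -Xmx flags mentioned above are passed straight to
the JVM on the command line. A sketch, assuming the FOP 0.20.x command-line
entry point org.apache.fop.apps.Fop and arbitrary heap sizes:

  # -Xms sets the initial heap, -Xmx the maximum heap; sizes are examples only
  java -Xms128m -Xmx512m org.apache.fop.apps.Fop -fo input.fo -pdf output.pdf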