users who are processing really big data. Maybe it
should be added to the FOP website under the memory consumption notes? Just
a suggestion.
Thanks again to everyone!
--
View this message in context:
http://apache-fop.1065347.n5.nabble.com/FOP-memory-growing-with-a-lot-of-page-sequence-blocks
aemitic wrote:
Thanks for the suggestion.
This /workaround/ (it's not a solution) cannot be applied. Why:
- internal pdf links would not work
- pdf bookmarks would not work
- page numbering would not be correct
- creating over 15 PDFs and then merging them with an external tool is
unacceptable from a performance point of view
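The page-numbering point is the crux: a page-number citation in one page-sequence may refer to an id defined in a later page-sequence, so numbers and internal links only resolve when FOP formats the whole document in one run. A minimal illustrative fragment (hypothetical ids and master names, not from the thread):

```xml
<!-- TOC entry whose page number lives in a later page-sequence.
     Splitting the document before "body" would leave the citation
     unresolved and would break the corresponding internal link. -->
<fo:page-sequence master-reference="toc">
  <fo:flow flow-name="xsl-region-body">
    <fo:block>Chapter 1 . . . . page <fo:page-number-citation ref-id="ch1"/></fo:block>
  </fo:flow>
</fo:page-sequence>

<fo:page-sequence master-reference="body">
  <fo:flow flow-name="xsl-region-body">
    <fo:block id="ch1">Chapter 1</fo:block>
  </fo:flow>
</fo:page-sequence>
```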
Subject: Re: FOP memory growing with a lot of page-sequences
Hi
e.g. we have a banking customer where account statements are produced and sent
for printing. A single PDF has around 100k pages.
Maruan
On 19.04.2013 at 11:04, Paul Womack supp...@papermule.co.uk wrote:
snip /
no - that's one document with 100k pages
Maruan Sahyoun
On 19.04.2013 at 13:04, Kerry, Richard richard.ke...@atos.net wrote:
Surely that's 100k documents, each with one page (or a small number of pages)?
Kerry, Richard wrote:
Yes I did understand what you wrote, and the earlier correspondent.
I was attempting to ask why.
It seems illogical to treat many small documents as one large one. Especially
if the large one is so large that it is hard to process.
That case sounds (to me) like many small documents.
well - it is one print job, has paper-handling marks, and goes to an envelope
stuffing machine
Maruan Sahyoun
On 19.04.2013 at 14:56, Kerry, Richard richard.ke...@atos.net wrote:
snip /
Hi,
I've created some pretty large PDFs (e.g. 500+ pages) with a lot of graphics.
IMO, although memory usage increases, eventually Java's garbage collection
will free up your memory.
@Bonekrusher
The test application I created and put in the StackOverflow question (linked
in the post above) clearly shows that Java's garbage collection *cannot
and does not* free up the memory.
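Both observations are consistent: the collector reclaims only unreachable objects, so if a long-lived object still holds references to page data (as FOP does while resolving forward references across page-sequences), no amount of garbage collection releases that memory. A stdlib-only sketch of that rule (illustrative; the class and the sizes are made up, this is not FOP code):

```java
import java.util.ArrayList;
import java.util.List;

public class ReachabilityDemo {
    // Long-lived holder, standing in for a formatter that never
    // releases its references to per-page data.
    static final List<byte[]> retained = new ArrayList<>();

    static long usedBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    /** Allocates 'mb' megabytes reachable from 'retained'; returns heap growth in bytes. */
    static long allocateAndMeasure(int mb) {
        System.gc(); // settle the heap before taking the baseline reading
        long before = usedBytes();
        for (int i = 0; i < mb; i++) {
            retained.add(new byte[1024 * 1024]);
        }
        System.gc(); // a hint only; reachable objects survive it regardless
        return usedBytes() - before;
    }

    public static void main(String[] args) {
        // The 64 MB cannot be reclaimed while 'retained' still references it.
        System.out.println("growth while held: " + allocateAndMeasure(64) + " bytes");
    }
}
```

Only once the holder drops its references (retained.clear()) does that memory become eligible for collection again.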
@Glenn Adams-2
Could you please elaborate more?
Transforming multiple XSL-FO files with FOP would mean producing multiple
PDF files, wouldn't it?
I need to produce a single PDF.
Dear All,
I use FOP (1.0) to process a simple two-page FO file with one complex
SVG file on each page.
In my web app (bean) I instantiate the following Producer class once and
then I call its createPDF method several times. I very quickly get Heap Space
errors. In the NetBeans profiler I've
And what happens if you destroy/reconstruct the Producer?
On 02/25/2012 01:46 AM, honyk wrote:
snip /
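A commonly recommended shape for such a Producer (hedged: this follows FOP's general embedding pattern, not honyk's actual code) keeps one long-lived factory but creates every render-scoped object inside createPDF, so the suggestion above to destroy and reconstruct applies only to the cheap per-call objects. A runnable JAXP-only skeleton; the FOP-specific calls are shown as comments, since the FOP jars are not assumed to be on the classpath here:

```java
import java.io.ByteArrayOutputStream;
import java.io.StringReader;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class Producer {
    // Long-lived factory: create once, reuse for every render.
    // With FOP this would be: FopFactory fopFactory = FopFactory.newInstance();
    private final TransformerFactory factory = TransformerFactory.newInstance();

    /** Renders one document; every per-render object is method-local. */
    public byte[] createPDF(String foSource) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // With FOP: Fop fop = fopFactory.newFop(MimeConstants.MIME_PDF,
        //                                       fopFactory.newFOUserAgent(), out);
        Transformer transformer = factory.newTransformer(); // identity transform
        // With FOP the result would be: new SAXResult(fop.getDefaultHandler())
        transformer.transform(new StreamSource(new StringReader(foSource)),
                              new StreamResult(out));
        return out.toByteArray(); // nothing render-scoped escapes this method
    }
}
```

Because the transformer (and, in the real thing, the Fop instance and its output stream) are all local, they become unreachable when createPDF returns, which is what lets the collector actually reclaim them between calls.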
How do I know how much memory it needs?
I wrote a Java program which should generate a PDF of about 10 pages
using FOP 1.0 but with many xsl:text tags per page.
It runs in my 64-bit Eclipse on a Windows Vista machine with 4GB RAM.
It gets an out-of-memory exception in my 32-bit Eclipse on a Windows
If you've installed the JDK (not just the JRE), you have a number of
tools in the bin directory of the JDK installation. With jvisualvm
or jconsole you can attach to a Java process and inspect the memory
consumption. But the values shown there may not be representative since
Java uses garbage collection.
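If attaching a tool is not convenient, the same rough numbers are available from inside the JVM via java.lang.Runtime (a small illustrative helper, with the same caveat: these are snapshots taken between unpredictable collections, not precise figures):

```java
public class HeapReport {
    /** Returns a one-line snapshot of current heap usage. */
    static String snapshot() {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory();
        return "used=" + (used >> 20) + "MB"
             + " committed=" + (rt.totalMemory() >> 20) + "MB"
             + " max=" + (rt.maxMemory() >> 20) + "MB"; // the -Xmx ceiling
    }

    public static void main(String[] args) {
        System.out.println(HeapReport.snapshot());
    }
}
```

Logging such a snapshot before and after each render gives a cheap first indication of whether memory is really being retained across calls.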
On Apr 8, 2008, at 22:01, Andreas Delmelle wrote:
Just noticed a small error, so slight correction:
I have received confirmation in an off-list discussion that one and
the same document, rendered with FOP 0.94:
This should be 0.20.5, actually... There /are/ a few Bugzilla entries
that
tree until I added code to see the intermediate format file. Is there anything else that can
be done to optimize this process?
--
View this message in context:
http://www.nabble.com/FOP-memory-issues-in-servlet-on-Websphere-tp16558704p16558704.html
Sent from the FOP - Users mailing list archive at Nabble.com.
On Apr 8, 2008, at 15:53, egreene wrote:
We are having memory/timeout issues on our production Websphere box
using FOP
as an alternative (in the future the only) process to creating PDFs
from XML
data. The application seemed to have passed load testing in stage
but in
production required
On Apr 8, 2008, at 22:01, Andreas Delmelle wrote:
snip /
24 page sequences (w/ even and odd sequence masters on sequences
other than
blank and cover pages):
- 2 sequences are table of contents
- 1 sequence is a back of the book index
- 2-3 static content sequences
- Other sequences are
On Jan 4, 2007, at 21:34, Cliff wrote:
Hi Cliff,
snip /
for a single report
run by a single user. Maybe I can come up with something else clever in the
interim. Thank you for all of your help.
Cliff
Hollaback...
http://codeforfun.wordpress.com
Hello all,
I'm facing FOP memory issues that I fought with a long time ago. I lost the
battle back then and had to resort to some ugly manual page breaking logic
and now I'm wondering the current status of the FOP project. My immediate
question is:
Do either FOP 0.20.5 or FOP Trunk address
Hi Cliff,
I have to start the JVM with java -Xmx256m ... to generate my
documents. With this option you increase the heap size of the VM.
Bye
Stefan
Cliff wrote:
snip /