Hi,
yes, I know there are workarounds. For me it is important to use the
XSL:FO implementation in as standard a way as possible. At the moment we have decided
not to work with the sources ourselves, for programming-capacity and strategic reasons
(to me it makes no sense if a hundred programmers implement the same feature separately).
The current pre-release needs about 1 MB of RAM per page in our reports. That is indeed
no problem if you have a machine with about 256 MB or more and no other
applications loaded while using FOP.
But we're using FOP as an add-on to some of our applications, and the users' PCs
have 128 MB at most, so we have to tune each machine carefully with the -Xmx
parameter (otherwise FOP seems to hang in what looks like an endless loop). On a
stand-alone machine this adds up: 32 MB for our reporting engine, which produces the
.fo file, + 32 MB for Adobe Acrobat or the FOP preview + n MB for FOP...
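For illustration, a minimal sketch of the kind of invocation we tune per machine. The class name is the FOP 0.20.x command-line entry point; the 64 MB heap cap, file names, and abbreviated classpath are examples only, not recommendations:

```shell
# Cap the JVM heap with -Xmx so FOP fails fast instead of thrashing;
# classpath is abbreviated and paths are hypothetical.
java -Xmx64m \
     -cp fop.jar:lib/batik.jar:lib/xalan.jar:lib/xerces.jar \
     org.apache.fop.apps.Fop -fo report.fo -pdf report.pdf
```

The right -Xmx value depends on the report size and what else runs on the machine, which is exactly why we have to tune it per installation.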
If there were a way to get the same results with about 512 kB per page, that
would be a big advantage.
The fact is that we have to compare (and customers/users simply do compare) the
memory requirements with those of other reporting tools such as Crystal Reports or
commercial XSL:FO implementations, which use much less memory and are faster.
So if there is a way to implement the suggestions for lean tables without refactoring
the whole thing, I would suggest doing so.
It might be a fundamental decision whether FOP is a kind of toolbox for developers or
an out-of-the-box product for nearly everyone - I think there are so many
good ideas in it that everyone should be able to use it.
Thomas Sporbeck
Sent: 08.07.2003 11:34:58
Subject: RE: 0.20.5 release
Hi Fopers,
I can understand your requirements, but I would like to know what memory
limit you are looking for and which filters you two are talking about.
As for me, I have been using FOP for BIG reports (from 100 to 2,000 pages)
with big tables (like you, tables more than 500 pages long).
I have used some iText features to deal with forward references (see the
list archive for more details), and this has given me a nice solution.
I can produce a 1,500-page document on a simple machine with 256 MB in a few
minutes (yes, it swaps), and we use servers with 1 or 2 GB of RAM for huge documents.
Anyway, we would all welcome a new solution to this problem, but surely
you have noticed there have been loads of workarounds on this list?
Can you be more specific about the performance threshold you are looking for?
Regards
Cyril
At 09:02 08/07/2003 +0430, you wrote:
Dear Thomas Sporbeck
It's good to see someone else is using FOP for big reports. I am also using
tables for inventory lists of nearly 600 pages, and my users do not accept
having to use filters. This FOP is killing my users' business, and if I cannot
find a solution, we will throw FOP away for good, forever.
That would be a shame for the FOP open-source developers, since I would go
and buy a non-open, commercial product instead.
I would really appreciate it if you informed me of your ideas.
Regards
Ali Farahani
-Original Message-
From: Thomas Sporbeck [mailto:[EMAIL PROTECTED]
Sent: Thursday, June 19, 2003 3:41 PM
To: [EMAIL PROTECTED]
Subject: RE: 0.20.5 release
I would agree with Ricardo. We're using tables for inventory lists
of about 500 pages. The memory situation in those reports is
really critical, and we cannot force the users to set filters.
On the other hand: to us it doesn't matter whether this enhancement comes
with 0.20.5 or with a later version (0.20.5a?); that of course has to
be decided by the developers and will possibly delay the refactoring.
Thomas Sporbeck
-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, email: [EMAIL PROTECTED]