RE: 0.20.5 release

2003-07-08 Thread Cyril Rognon
Hi Fopers,

I can understand your requirements, but I would like to know what memory 
limit you are looking for and what are the filters you two are talking about.

As for me, I have been using FOP for BIG reports (from 100 to 2000 pages) 
with big tables (like you, tables more than 500 pages long).

I have used some iText features to deal with forward references (see the 
list archive for more details) and this has given me a nice solution.

I can produce a 1500-page document on a simple machine with 256 MB in a few 
minutes (yes, it swaps), and we use servers with 1 or 2 GB of RAM for huge documents.

Anyway, we would all welcome a new solution to this problem, but surely 
you have noticed that plenty of workarounds have been posted on this list?

Can you be more specific about the performance threshold you are looking for?

Regards

Cyril

At 09:02 08/07/2003 +0430, you wrote:
Dear Thomas Sporbeck

It's good to see someone else is using FOP for big reports. I am also using
tables for inventory lists close to 600 pages, and my users do not accept
using filters. This FOP problem is hurting my users' business, and if I cannot
find a solution to it, we will throw FOP away for good, forever. That would
be a shame for FOP's open-source developers, since I would go and buy a
non-open, commercial product.
I would really appreciate it if you could inform me of your ideas.

Regards

Ali Farahani

-Original Message-
From: Thomas Sporbeck [mailto:[EMAIL PROTECTED]
Sent: Thursday, June 19, 2003 3:41 PM
To: [EMAIL PROTECTED]
Subject: RE: 0.20.5 release
I would agree with Ricardo. We're using tables for inventory lists
containing about 500 pages. The memory situation in those reports is
really critical, and we cannot force the users to set filters.
On the other hand: to us it doesn't matter whether this enhancement comes
with 0.20.5 or with a later version (0.20.5a?), which of course has to
be decided by the developers and will possibly delay refactoring.
Thomas Sporbeck


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, email: [EMAIL PROTECTED]


[FW:] RE: 0.20.5 release

2003-07-08 Thread Thomas Sporbeck
Hi,

yes, I know there are workarounds. For me it is important to keep our use of the 
XSL-FO implementation as standard as possible. At the moment we have decided not to work 
with the sources ourselves, for reasons of programming capacity and strategy (to me it 
makes no sense for a hundred programmers to implement the same feature separately).
The current pre-release needs about 1 MB of RAM per page in our reports - that is indeed 
no problem if you have a machine with about 256 MB or more and no other 
applications loaded while using FOP.
But we're using FOP as an add-on to some of our applications, the users' PCs 
have 128 MB at most, and we have to tune each machine carefully with the -Xmx 
parameter (otherwise FOP seems to hang in an endless loop). If you do this on a 
stand-alone machine: 32 MB for our reporting engine, which produces the .fo file, + 32 
MB for Adobe Acrobat or the FOP preview + n MB for FOP...
If there were a way to get the same results with about 512 kB per page, that 
would be a big advantage.
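The sizing arithmetic above can be sketched as a quick calculation. This is a hypothetical helper, not anything from FOP itself: the per-page figures are the estimates quoted in this thread, the 16 MB base overhead is an illustrative assumption, and the command line in the final comment assumes the usual 0.20.x entry point.

```python
# Back-of-the-envelope -Xmx sizing from the per-page estimates above.
# All figures are assumptions taken from this thread, not measurements.

def heap_needed_mb(pages, kb_per_page, base_overhead_mb=16):
    """Rough heap size (MB) for a report of `pages` pages at `kb_per_page` each."""
    return base_overhead_mb + pages * kb_per_page / 1024

# Current pre-release: ~1 MB (1024 kB) per page, on a 600-page inventory list.
current = heap_needed_mb(600, 1024)
# Hoped-for target: ~512 kB per page.
target = heap_needed_mb(600, 512)

print(f"current: ~{current:.0f} MB heap, target: ~{target:.0f} MB heap")
# Neither figure fits a 128 MB workstation that also runs the reporting
# engine (32 MB) and Acrobat (32 MB) - hence the careful -Xmx tuning, e.g.
#   java -Xmx64m org.apache.fop.apps.Fop -fo report.fo -pdf report.pdf
```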
The fact is that we have to (and customers/users simply do) compare the memory 
requirements to other reporting tools such as Crystal Reports or commercial 
XSL-FO implementations, which use much less memory and are faster. 
So if there is a way to implement the suggestions for lean tables without refactoring 
the whole thing, I'd suggest doing so. 
It may be a fundamental decision whether FOP is a kind of toolbox for developers or 
an out-of-the-box product for nearly everyone - I think there are so many 
good ideas in it that everyone should be able to use it.

Thomas Sporbeck

Sent on: 08.07.2003 11:34:58
Subject: RE: 0.20.5 release





Re: [FW:] RE: 0.20.5 release

2003-07-08 Thread Bertrand Delacretaz
On Tuesday, 8 July 2003, at 10:14 Europe/Zurich, Thomas Sporbeck wrote:
...It might be a fundamental decision whether FOP is a kind of toolbox 
for developers or an out-of-the-box product for nearly everyone - 
I think there are so many good ideas in it that 
everyone should be able to use it
I might be wrong, but I think most users of FOP use it 
server-side, where resources (especially memory) are more readily 
available. This might explain your problems: I think little energy has 
been spent on optimizing FOP's memory requirements.

-Bertrand



Re: [FW:] RE: 0.20.5 release

2003-07-08 Thread Felix Breuer
On Tue, 2003-07-08 at 14:31, Bertrand Delacretaz wrote:
 I might be wrong, but I think most users of FOP are using it 
 server-side, where resources (especially memory) are more readily 

I don't know about most users, but I am using FOP client-side since I do
not have a server.

Felix





Re: [FW:] RE: 0.20.5 release

2003-07-08 Thread Thomas Sporbeck
I might be wrong, but I think most users of FOP use it 
server-side, where resources (especially memory) are more readily 
available. This might explain your problems: I think little energy has 
been spent on optimizing FOP's memory requirements.

Yes, I agree. But some companies (banking, insurance, etc.) have quite a high level of 
security on their servers - in fact so high that an external administrator has no 
chance to install any piece of software on a server without some months of 
testing (and those tests may fail because of a memory shortage on the workstations...).

Thomas Sporbeck





Re: 0.20.5 release

2003-07-08 Thread J.Pietschmann
ali farahani wrote:
It's good to see someone else is using FOP for big reports.
I always wonder what poor souls have to sift through this
huge amount of paper... ;-)
I am also using
tables for inventory lists close to 600 pages, and my users do not accept
using filters. This FOP problem is hurting my users' business, and if I cannot
find a solution to it, we will throw FOP away for good, forever. That would
be a shame for FOP's open-source developers, since I would go and buy a
non-open, commercial product.
Well, unfortunately my company has tightened my time budget,
which means I have to do *all* work on FOP in my spare time.
However, if you have a critical bug to fix and can come up
with a bunch of dollars, I'll gladly take a few days off in
order to fix it (for *everyone*).
In the case of the excessive memory consumption caused by
tables, I think I have found a fix which won't break everything
else. It will certainly require some amount of testing, which
means another release candidate and is therefore quite
unpopular with our release manager.
J.Pietschmann



Re: [FW:] RE: 0.20.5 release

2003-07-08 Thread J.Pietschmann
Thomas Sporbeck wrote:
 It might be a fundamental decision whether FOP is a kind of toolbox for
developers or an out-of-the-box product for nearly everyone
It is Open Source. If you find issues and create patches, send
them in. Every contribution is welcome.
J.Pietschmann





RE: 0.20.5 release

2003-07-07 Thread ali farahani
Dear Thomas Sporbeck

It's good to see someone else is using FOP for big reports. I am also using
tables for inventory lists close to 600 pages, and my users do not accept
using filters. This FOP problem is hurting my users' business, and if I cannot
find a solution to it, we will throw FOP away for good, forever. That would
be a shame for FOP's open-source developers, since I would go and buy a
non-open, commercial product.

I would really appreciate it if you could inform me of your ideas.

Regards

Ali Farahani

-Original Message-
From: Thomas Sporbeck [mailto:[EMAIL PROTECTED] 
Sent: Thursday, June 19, 2003 3:41 PM
To: [EMAIL PROTECTED]
Subject: RE: 0.20.5 release

I would agree with Ricardo. We're using tables for inventory lists
containing about 500 pages. The memory situation in those reports is
really critical, and we cannot force the users to set filters.
On the other hand: to us it doesn't matter whether this enhancement comes
with 0.20.5 or with a later version (0.20.5a?), which of course has to
be decided by the developers and will possibly delay refactoring.

Thomas Sporbeck








RE: 0.20.5 release

2003-06-19 Thread Ricardo Amador
Hi,

Sorry to drop in... Just ignore me if you don't see any relevance.
In any case, don't bother answering me.

Considering that tables are currently the only means of controlling
pagination, all my documents tend to include lots of tables
(and they all start with a TOC). I believe I'm not alone. I would say
that any improvement in the memory usage associated with tables is,
IMHO, kind of critical.
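For readers following along, the kind of markup Ricardo is describing can be illustrated with a minimal sketch. Element names follow the XSL-FO recommendation; the column widths and cell content are arbitrary placeholders:

```xml
<!-- Minimal XSL-FO table: the header row is repeated whenever the table
     breaks across a page, which is why long inventory lists end up
     relying on tables to carry their pagination. -->
<fo:table xmlns:fo="http://www.w3.org/1999/XSL/Format"
          table-layout="fixed" width="100%">
  <fo:table-column column-width="4cm"/>
  <fo:table-column column-width="10cm"/>
  <fo:table-header>
    <fo:table-row>
      <fo:table-cell><fo:block font-weight="bold">Item</fo:block></fo:table-cell>
      <fo:table-cell><fo:block font-weight="bold">Description</fo:block></fo:table-cell>
    </fo:table-row>
  </fo:table-header>
  <fo:table-body>
    <fo:table-row>
      <fo:table-cell><fo:block>0001</fo:block></fo:table-cell>
      <fo:table-cell><fo:block>Example row</fo:block></fo:table-cell>
    </fo:table-row>
  </fo:table-body>
</fo:table>
```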

BTW, there are currently 8 proposed patches in Bugzilla. Most of them
look quite simple and innocuous to me, and 4 of them are marked as
enhancements for version 0.20.5 (there is also an older one for version
0.20.3). It would be nice if one of you guys could take a look at those
patches and consider them before issuing the final 0.20.5 release.

Congratulations to you all on the excellent job you are doing,
Ricardo Amador

-Original Message-
From: Christian Geisert [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, June 17, 2003 6:16 PM
To: [EMAIL PROTECTED]
Subject: 0.20.5 release


Ok,

RC3a seems to be rather stable and the changes since then look
non-critical to me. What about doing the release now (read: next days)?
(And maybe 0.20.5a later if we get more hyphenation patterns back.)

Or should we make the changes proposed by Jörg (improved memory usage
with tables - see 
http://marc.theaimsgroup.com/?l=fop-dev&m=105399053227758 ), which would
require another release candidate?

Comments please!

Christian








RE: 0.20.5 release

2003-06-19 Thread Thomas Sporbeck
I would agree with Ricardo. We're using tables for inventory lists
containing about 500 pages. The memory situation in those reports is
really critical, and we cannot force the users to set filters.
On the other hand: to us it doesn't matter whether this enhancement comes
with 0.20.5 or with a later version (0.20.5a?), which of course has to
be decided by the developers and will possibly delay refactoring.

Thomas Sporbeck





Re: 0.20.5 release

2003-06-17 Thread Jeremias Maerki

On 17.06.2003 19:16:23 Christian Geisert wrote:
 RC3a seems to be rather stable and the changes since then look
 non-critical to me. What about doing the release now (read: next days)

+1

 (and maybe 0.20.5a later if we get more hyphenation patterns back)

Don't count on that. :-(

 Or should we make the changes proposed by Jörg (improved memory
 usage with tables - see 
 http://marc.theaimsgroup.com/?l=fop-dev&m=105399053227758 )
 which would require another release candidate.

Still -0.


Jeremias Maerki





Re: 0.20.5 release

2003-06-17 Thread J.Pietschmann
Christian Geisert wrote:
RC3a seems to be rather stable and the changes since then look
non-critical to me. What about doing the release now (read: next days)
(and maybe 0.20.5a later if we get more hyphenation patterns back)
Or should we make the changes proposed by Jörg (improved memory
usage with tables - see 
http://marc.theaimsgroup.com/?l=fop-dev&m=105399053227758 )
which would require another release candidate.
I'd rather close the book on the maintenance code so that
something can be done on HEAD.
Take the footnote space problem I analysed yesterday: while
I know quite precisely what went wrong (two different
approaches to accounting for footnote space working concurrently),
I have no idea what would break if I attempted a fix.
While HEAD has a lot of technical details to fix, the overall
approach seems much more promising.
J.Pietschmann
