Re: OutOfMemoryError by large tables: Alternative solution?

2004-06-10 Thread Thorbjørn Ravn Andersen
Chris Bowditch wrote:
Splitting the table up is the best way to reduce the memory-to-input-size 
ratio. Also, there have been some tweaks made to the maintenance code in 
CVS that help improve the memory usage of tables. To use this latest code, 
you will need to install a CVS client, download the 0.20.5 code from the 
maintenance branch and then re-compile. See the website for further info:
We found that for a complex 23-page print, the memory usage fell from 
100 MB to 75 MB (rough measurement) by using the CVS version.  The 
overall performance is much more influenced by the memory usage of the 
many pages (when rendering to a PageSet, which we do instead of 
rendering to e.g. a PDF) than by the complexity of the tables.

--
 Thorbjoern Ravn Andersen  ...plus...Tubular Bells!
-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]


Re: OutOfMemoryError by large tables: Alternative solution?

2004-06-10 Thread Peter B. West
I would like to answer this question now by saying "Alt-design". 
However, I cannot - yet.  The layout work has just begun, and is aiming 
initially at a Java 2D solution.  When that is complete, unless someone 
cares to work in parallel on PDF rendering from the Java 2D 
representation, that PDF rendering work will have to follow.  The 
initial version will require Java 1.4.2, a serious drawback for many, 
and may move forward from there if the text layout facilities improve in 
1.5 and its sub-releases.

Alt-design starts page layout as soon as an fo:flow is encountered, and 
a page is completed as soon as enough of the flow has been read to fill 
that page.

Obviously, forward references cannot be resolved until the referenced 
target has been laid out in the course of layout (e.g. "Page n of m"), but 
the existing solution of caching the incomplete page to disk will also 
apply in alt-design.
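
(As a sketch of the "Page n of m" case: the total is usually obtained with a 
forward reference such as the following, where the empty block carrying the 
id "last-page" - the id name here is only an example - sits at the very end 
of the flow; the citation cannot be filled in until that block has been 
laid out.)

  <fo:block>
    Page <fo:page-number/> of <fo:page-number-citation ref-id="last-page"/>
  </fo:block>
  ...
  <fo:block id="last-page"/>  <!-- placed at the very end of the flow -->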

Anyone who wishes to contribute to the development of alt-design would 
be most welcome.  Experience has already shown that ideas from 
alt-design can be very fruitfully applied to the main line of 
development.  In saying this I have no desire to distract those, 
especially those new and productive committers, who are dedicated to 
HEAD development.

I mention this here because of the ongoing concern about memory usage.
Peter
--
Peter B. West http://www.powerup.com.au/~pbwest/resume.html
-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]


RE: OutOfMemoryError by large tables: Alternative solution?

2004-06-10 Thread Pascal Sancho
Hi,
I wonder whether you can easily read a table with that many rows, even in 
PDF form. You should probably break your source file up into multiple subtables.
Engines other than FOP could have similar limitations... beginning with the 
human reader ;)

Regards,
Tcho

-----Original Message-----
From: Peter B. West [mailto:[EMAIL PROTECTED] 
Sent: Thursday, 10 June 2004 11:32
To: [EMAIL PROTECTED]
Subject: Re: OutOfMemoryError by large tables: Alternative solution?

<snip/>

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: OutOfMemoryError by large tables: Alternative solution?

2004-06-09 Thread Chris Bowditch
[EMAIL PROTECTED] wrote:

Greetings,
I have an XML file containing data to be formatted as table rows. The number
of rows will exceed 1. As FOP processes the data, the "Exception in
thread "main" java.lang.OutOfMemoryError" occurs.
This is a common problem.
As I've seen in the discussion forums, one possibility is to increase the
memory that the Java environment will use (the -Xmx switch). I have set
it to -Xmx512m. The transformation then succeeds. So what's the problem?
Well, I am not very happy with the possibility that the XML data file
contains, say, 2 rows. What then? Increase the memory limit again?
I understand your concerns, but there is no way to handle an arbitrarily large 
XSL-FO document generically. There are some things you can do to reduce the 
memory-to-input-size ratio.

I would like a more robust solution. Has anyone split a table? Or is it
better to try some inline formatting or even white-space handling, possibly
using some printf extension from PerlScript.
Splitting the table up is the best way to reduce the memory-to-input-size ratio. 
Also, there have been some tweaks made to the maintenance code in CVS that 
help improve the memory usage of tables. To use this latest code, you will 
need to install a CVS client, download the 0.20.5 code from the maintenance 
branch and then re-compile. See the website for further info:

http://xml.apache.org/fop/download.html#source
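
As a rough illustration only - the element names (rows/row/cell) and the 
chunk size are assumptions, not taken from your actual input - an XSLT 1.0 
sketch that emits one fo:table per 1000 source rows instead of a single 
huge table could look like this:

  <xsl:stylesheet version="1.0"
      xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
      xmlns:fo="http://www.w3.org/1999/XSL/Format">

    <xsl:variable name="chunk-size" select="1000"/>

    <!-- every 1000th row starts a new chunk, i.e. a new fo:table -->
    <xsl:template match="rows">
      <xsl:for-each select="row[position() mod $chunk-size = 1]">
        <fo:table table-layout="fixed" width="100%">
          <fo:table-column column-width="5cm"/>
          <fo:table-body>
            <!-- this row plus the next chunk-size - 1 sibling rows -->
            <xsl:for-each select=". | following-sibling::row[position() &lt; $chunk-size]">
              <fo:table-row>
                <fo:table-cell>
                  <fo:block><xsl:value-of select="cell"/></fo:block>
                </fo:table-cell>
              </fo:table-row>
            </xsl:for-each>
          </fo:table-body>
        </fo:table>
      </xsl:for-each>
    </xsl:template>

  </xsl:stylesheet>

Each chunk then becomes an independent fo:table, so FOP can lay the tables 
out one at a time instead of buffering one enormous table. The sketch 
assumes a single column; adapt it to the real row structure.
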
<snip/>
Chris

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]


Re: OutOfMemoryError by large tables: Alternative solution?

2004-06-09 Thread scott gabelhart

[EMAIL PROTECTED] wrote:

Greetings,

I have an XML file containing data to be formatted as table rows. The number
of rows will exceed 1. As FOP processes the data, the "Exception in
thread "main" java.lang.OutOfMemoryError" occurs.

As I've seen in the discussion forums, one possibility is to increase the
memory that the Java environment will use (the -Xmx switch). I have set
it to -Xmx512m. The transformation then succeeds. So what's the problem?

Well, I am not very happy with the possibility that the XML data file
contains, say, 2 "rows". What then? Increase the memory limit again?

I would like a more robust solution. Has anyone split a table? Or is it
better to try some inline formatting or even white-space handling, possibly
using some printf extension from PerlScript.

I have created a test XSL stylesheet and a Perl script that creates the XML 
input. To create the input, invoke table.pl 2 (or some other number of
iterations). You might have to change the first line of table.pl
(#!/usr/local/bin/perl)
to something else if your Perl is installed somewhere else.

Thanks in advance.

Anders Malmborg
(See attached file: table.pl) (See attached file: table.xsl)

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]

XSLFormatter from AntennaHouse