[EMAIL PROTECTED] wrote:

Greetings,

I have an XML file containing data to be formatted as table rows. The number
of rows will exceed 10,000. As FOP processes the data, it fails with
"Exception in thread "main" java.lang.OutOfMemoryError".

This is a common problem.


As I've seen in the discussion forums, one possibility is to increase the
memory available to the Java VM (the -Xmx switch). I have set it to
-Xmx512m, and the transformation then succeeds. So what's the problem?
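For reference, the switch is passed to the JVM that runs FOP. A typical direct invocation might look like the following; the jar names and input files are illustrative, so adjust them to your FOP 0.20.x installation (the fop.sh/fop.bat wrapper scripts can also be edited to add the switch):

```shell
# Raise the JVM heap ceiling to 512 MB when running FOP directly.
# Classpath entries and file names below are examples only.
java -Xmx512m \
     -cp fop.jar:lib/batik.jar:lib/xalan.jar:lib/xercesImpl.jar \
     org.apache.fop.apps.Fop -xml data.xml -xsl table.xsl -pdf out.pdf
```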

Well, I am not very happy with the possibility that the XML data file
might contain, say, 20,000 rows. What then? Increase the memory limit again?

I understand your concerns, but there is no way to generically handle an XSL-FO document of unbounded size. There are, however, some things you can do to reduce the memory-to-input-size ratio.



I would like a more robust solution. Has anyone split a table? Or is it better to try some inline formatting, or even white-space handling, possibly using a printf extension from PerlScript?

Splitting the table up is the best way to reduce the memory-to-input-size ratio. Also, some tweaks have been made to the maintenance code in CVS that improve the memory usage of tables. To use this latest code, you will need to install a CVS client, check out the 0.20.5 code from the maintenance branch, and re-compile. See the website for further info:


http://xml.apache.org/fop/download.html#source
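As a sketch of the splitting idea: instead of emitting one huge fo:table, break the rows into fixed-size batches and emit one small table per batch, so the formatter can lay out and release each table independently. The class and method names below are hypothetical helpers, not part of FOP's API; you would apply the same batching logic in whatever code generates your FO input.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: partitions table rows into fixed-size batches,
// each of which would be written out as its own small fo:table.
public class RowBatcher {

    // Split 'rows' into consecutive batches of at most 'batchSize' rows.
    public static <T> List<List<T>> batch(List<T> rows, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            int end = Math.min(i + batchSize, rows.size());
            batches.add(new ArrayList<>(rows.subList(i, end)));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 20000; i++) {
            rows.add(i);
        }
        // 20,000 rows in batches of 1,000 -> 20 small tables
        // instead of one huge one.
        List<List<Integer>> batches = batch(rows, 1000);
        System.out.println(batches.size());         // 20
        System.out.println(batches.get(19).size()); // 1000
    }
}
```

Because the reader never needs to relate one batch to another during layout, each small table can be formatted and discarded before the next begins, which keeps peak memory roughly proportional to the batch size rather than to the total row count.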

<snip/>

Chris



---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
