Hi -

 

I've read an incredible amount about this and do not think there is
any way around this problem, but I figured I'd send this out to see
if anyone has any ideas.

 

The basic gist of the problem is that I am trying to render a report
with tabular data and running out of memory.  The number of rows can
reach 40,000 records.  From what I've read, I understand that FOP is
running out of memory because the whole table is wrapped in a single
page sequence (spanning a lot of pages), and FOP holds an entire page
sequence in memory until it has been rendered.

 

Because the .fo file for the reports I am generating is produced
dynamically, I was able to batch the table into multiple page
sequences (which removed the memory issue), but this presents a new
problem: FOP automatically starts a new page when a page sequence
ends.  This can make the table look choppy, because a break can leave
only two records on a page.  I tried algorithms to automatically
estimate the batch size for a table, but this is hard because there is
no reliable way to estimate the record height in the table, or to
access the current page number during rendering in order to break
page sequences dynamically.
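For reference, a minimal sketch of what the batched output looks like
(the "report" master name and the row content are placeholders from my
own setup, not anything FOP requires):

```xml
<!-- Sketch only: two batches of table rows, each in its own
     fo:page-sequence.  FOP can release each sequence from memory once
     it is rendered, but it forces a page break between sequences,
     which is what chops the table up. -->
<fo:page-sequence master-reference="report">
  <fo:flow flow-name="xsl-region-body">
    <fo:table>
      <fo:table-body>
        <fo:table-row>
          <fo:table-cell><fo:block>row 1</fo:block></fo:table-cell>
        </fo:table-row>
        <!-- ... rows 2 through 1000 of this batch ... -->
      </fo:table-body>
    </fo:table>
  </fo:flow>
</fo:page-sequence>

<fo:page-sequence master-reference="report">
  <fo:flow flow-name="xsl-region-body">
    <fo:table>
      <fo:table-body>
        <fo:table-row>
          <fo:table-cell><fo:block>row 1001</fo:block></fo:table-cell>
        </fo:table-row>
        <!-- ... next batch ... -->
      </fo:table-body>
    </fo:table>
  </fo:flow>
</fo:page-sequence>
```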

 

I would really like to find a way either to keep FOP's memory usage in
check (though I don't think I can do that without multiple
page sequences) or to have a new page sequence not automatically start
a new page.  Hard-coding a batch size is really not an option, because
the tabular data I am rendering comes in so many shapes and forms.

 

Any ideas? 

 

Ben.

 

-- 
Ben Wuest
Software Engineer, Development
Q1 Labs Inc - The Nexus of Security and Networking
Office: (506)-462-9117 ext 163 Fax: (506)-459-7016
[email protected] | http://www.q1labs.com

 
