Does anyone have a code sample to remove all elements and attributes from a 
TD tag?  I think my memory-usage problem is that I'm creating a new TD for 
every column in every row.  I had hoped to instantiate a single TD object, 
then clear it out and reuse it each time I insert data into the table.  
I've tried the code below, but it never actually removes the element, so 
the loop runs forever.  Any help?

// This loop never terminates -- removeElement() doesn't remove anything.
TD tCol = new TD(true);
String sSql;
while (tCol.elements().hasMoreElements()) {
  sSql = tCol.elements().nextElement().toString();
  tCol.removeElement(sSql);
}

This code never ends because removeElement doesn't seem to be removing the 
element.  Any ideas on what I'm doing wrong?
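For reference, here is the general pattern I'm after, sketched against a 
plain java.util.Hashtable rather than the ECS element registry (the names 
here are illustrative only, not ECS API): snapshot the keys first, then 
remove, instead of removing while enumerating the live table.

```java
import java.util.Enumeration;
import java.util.Hashtable;
import java.util.Vector;

public class DrainTable {
    public static void main(String[] args) {
        // Stand-in for the TD's internal element registry.
        Hashtable registry = new Hashtable();
        registry.put("td1", "cell one");
        registry.put("td2", "cell two");

        // Snapshot the keys first; mutating the table while
        // re-requesting its enumeration is what loops forever.
        Vector keys = new Vector();
        for (Enumeration e = registry.keys(); e.hasMoreElements();) {
            keys.addElement(e.nextElement());
        }
        // Now remove by key from the snapshot.
        for (Enumeration e = keys.elements(); e.hasMoreElements();) {
            registry.remove(e.nextElement());
        }

        System.out.println(registry.isEmpty()); // prints: true
    }
}
```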
                        

-----Original Message-----
From:   Stephan Nagy [SMTP:[EMAIL PROTECTED]]
Sent:   Friday, January 28, 2000 9:57 AM
To:     ECS
Subject:        Re: Streaming a large table

Mike Wilson wrote:

> New question if I may ask the group...
>
> I have built a servlet that returns n rows from a query that is hitting a
> rather large table on my Oracle server.  I'm running out of memory in the
> routine that pumps that data into a <table> to display in the browser
> using ECS.  I can get about 1500 rows out to the browser in one shot, but
> anything over that dumps with an OutOfMemory exception from the JVM on my
> Apache server.  The question is: is there a way, using ECS, to "stream"
> out a table object rather than build the entire page in memory and then
> send it to the browser?

I don't think this can be done, since ECS uses recursion to do its output.
The object has to be complete before you can recurse through it.
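The usual workaround is to skip the in-memory tree for the table body and 
write rows straight to the response writer, flushing as you go.  A minimal 
sketch with a plain PrintWriter (none of this is ECS API; in a servlet you 
would pass in the response's writer instead):

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class StreamTable {
    // Emit the table incrementally: open tag, one row at a time,
    // then close -- only the current row ever lives in memory.
    static void writeTable(PrintWriter out, String[][] rows) {
        out.println("<table>");
        for (int i = 0; i < rows.length; i++) {
            out.print("<tr>");
            for (int j = 0; j < rows[i].length; j++) {
                out.print("<td>" + rows[i][j] + "</td>");
            }
            out.println("</tr>");
            out.flush(); // push the row to the browser right away
        }
        out.println("</table>");
    }

    public static void main(String[] args) {
        StringWriter buf = new StringWriter();
        writeTable(new PrintWriter(buf),
                   new String[][] { { "a", "b" }, { "c", "d" } });
        System.out.print(buf);
    }
}
```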

> I know the problem with this doesn't really lay with ECS but I
> was curious if there was a way to send the page to the browser and then
> complete the table, body, and html tags after the data had been flushed to
> the browser.  Has anyone done this?  Does anyone know if it's possible?
>  Thanks in advance for any assistance.

I'm generating some rather large pages ( 85,000+ elements with 100,000+
attributes ), and the only way I've found to resolve the OutOfMemory error
is to bump up my -Xmx and -Xms settings.  I'm using an XML->XSL->FORCE
paradigm that eats a lot of memory.

FORCE is a project that I will be releasing to the community fairly soon.
It does FO->HTML and FO->RTF transformations based on stylesheets, using
ECS to do the document creation.  So not only do I have the overhead of
ECS to deal with, I have the overhead of DOM as well.  I checked some
memory improvements into cvs that reduce the amount of memory ECS uses;
they were part of the last release, so you might want to pull down a new
version or grab what is in cvs to help with your memory problem.  I've
found that with DOM and ECS, to create a 300-page rtf report ( all tables,
several thousand rows, 5-10 columns ), which translates to about a 3 MB
html file, I need to set my max heap to 164 MB ( granted, that's to
accommodate both DOM and ECS ).
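For reference, those heap settings are passed to the JVM at startup, e.g.
(the class name here is just a placeholder):

```shell
# Raise the initial (-Xms) and maximum (-Xmx) heap; 164 MB max was
# what my DOM + ECS report job needed.
java -Xms64m -Xmx164m MyReportServer
```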

-stephan



--
------------------------------------------------------------
To subscribe:        [EMAIL PROTECTED]
To unsubscribe:      [EMAIL PROTECTED]
Archives and Other:  <http://java.apache.org/main/mail.html>
Problems?:           [EMAIL PROTECTED]

