Hi...
In the meantime I was able to speed it up a lot! But I'm still
looking for more... :)
I'm testing with a list of 12134 rows and I'm loading them in batches
of 1000 rows (13 batches in total).
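For reference, the batch count follows directly from the row count (a trivial sketch; the helper name is mine, not from any library):

```javascript
// Hypothetical helper: how many batches a row count splits into.
function batchCount(totalRows, batchSize) {
  return Math.ceil(totalRows / batchSize);
}
// 12134 rows in batches of 1000 -> 13 batches (12 full + 1 of 134 rows)
```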
The changes I made:
- I was scrolling to the bottom of the page each time a batch was
loaded (document.documentElement.scrollTop =
document.documentElement.scrollHeight;)
This turned out to account for a big part of the time! The reason I
scrolled down was to somehow show the user that something is
happening. Now I have a floating DIV which shows the user some
information about the progress.
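The floating DIV could be driven by something like the following (a sketch; the element id 'progress' and the message format are my assumptions, not the actual code):

```javascript
// Build the progress text shown in the floating DIV (the wording and
// percentage format are assumptions, not the poster's actual markup).
function progressMessage(loadedRows, totalRows) {
  var pct = Math.round((loadedRows / totalRows) * 100);
  return 'Loaded ' + loadedRows + ' of ' + totalRows + ' rows (' + pct + '%)';
}

// In the batch callback (browser/Prototype only; 'progress' is a
// hypothetical element id):
// $('progress').update(progressMessage(loaded, total));
```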
- After every batch is loaded from the server, I need to retrieve the
'tbody' of the table that's already loaded. I was using something
like: $(containerID).down('table tbody');
Changing it to $(containerID).down().down().down() was a great speed-
up.
- The problem that each batch also contained the header row of the
table, I fixed by simply not rendering the header at all. Simple :)
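If the header couldn't be suppressed server-side, a similar effect could be had client-side by stripping the header rows from the returned HTML string before inserting it. This is a regex sketch of mine, not the original fix, assuming (as in the original post) that header rows carry the class 'HeaderRow':

```javascript
// Drop <tr class="...HeaderRow..."> rows from an HTML fragment without
// building any DOM nodes. Assumes well-formed, non-nested rows.
function stripHeaderRows(html) {
  return html.replace(/<tr[^>]*class="[^"]*HeaderRow[^"]*"[^>]*>[\s\S]*?<\/tr>/g, '');
}
```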
- A new batch I can now append to the 'old container' using something
like:
var newContainer = new Element('div', {id:'tmpContainer'});
newContainer.innerHTML = result.controls[controlID];
// same optimization as retrieving it via newContainer.down('table tbody')
var newTBody = newContainer.down().down().down();
oldContainer.insert(newTBody.innerHTML);
newContainer = null; // note: 'delete' is a no-op on local variables
(much, much faster than looping through every row)
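The temporary DIV could even be skipped entirely by pulling the tbody contents straight out of the returned string. This is a sketch of mine, not the original code, assuming the response contains exactly one well-formed tbody:

```javascript
// Grab the inner HTML of the first <tbody> in an HTML string.
function extractTBody(html) {
  var m = html.match(/<tbody[^>]*>([\s\S]*?)<\/tbody>/);
  return m ? m[1] : '';
}

// Then, in the batch callback (Prototype, browser only):
// oldContainer.insert(extractTBody(result.controls[controlID]));
```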
In the first version of the code, loading a batch started at around 3
seconds and gradually grew to about 14 seconds (!!) for the last full
batch (batch nr 12).
Now the retrieving and rendering is quite stable in FF (around 3 - 3.5
seconds).
I'm looking for more optimizations, because on slower computers it's
still (very) bad. I have a few more ideas, thanks to Firebug...
On Nov 21, 10:28 am, TweeZz <[EMAIL PROTECTED]>
wrote:
> Hi...
>
> I got some code running correctly, but I'm quite sure it can be written
> in a way that performs better...
> I implemented a system that loads rows of a table in batches (using
> Anthem ajax). It will be used for print reports with a big amount of
> rows.
>
> On the initial load of the page, the first batch is displayed (let's
> say 100 data rows + 1 header row). Then I make a callback to retrieve
> the next 100 rows. The problem is that each time I also get the header
> row of the table.
> The header row has a class name 'HeaderRow'. The data rows have 'Row'
> as classname.
> This is the code I'm using now:
>
> var newContainer = new Element('div');
> newContainer.innerHTML = result.controls[controlID]; // this contains
> a html table with the header row and data rows
> newContainer.select('table.grid tr.Row').each(function(row) {
>   oldContainer.insert(row);
> });
>
> delete newContainer;
>
> The part that I would like to optimize is the 'select' of the data
> rows and the looping through each of them to insert them at the bottom
> of the datarows already on the page.
>
> I would be happy to hear if and how this can be optimized...
> With kind regards,
>
> Manu aka TweeZz.
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "Ruby
on Rails: Spinoffs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at
http://groups.google.com/group/rubyonrails-spinoffs?hl=en
-~----------~----~----~----~------~----~------~--~---