Hi Alexandre,

Ok, we are talking about a data set that's up to 8Mb, with up to 800Kb of 
mutations every second. That's a lot of data!
The first thing that strikes me is that you are fetching every second. If 
you are fetching the last 5 seconds of data every second, you're basically 
fetching 4 seconds of duplicates every second.
Those duplicates need to be processed too, every second again. Lose this 
overhead! The best thing you can do for performance is to drop down to a 4 
or 5 second interval for getting your data.
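Something along these lines, for example. This is just a sketch: 'myApp', 
the /mutations URL and the dataService are made-up names, and I assume the 
endpoint returns only the last 5 seconds of mutations.

  // Poll every 5 seconds instead of every second, so each response
  // holds only new mutations and there are no duplicates to re-process.
  angular.module('myApp').run(function ($http, $interval, dataService) {
    $interval(function () {
      $http.get('/mutations').then(function (response) {
        dataService.applyChanges(response.data); // see the service sketch further down
      });
    }, 5000);
  });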
The data is in the form of an array, I suppose; is every row an array or an 
object?
Move this data into a service; you can handle the updates in the service too.
You don't want to traverse all 30,000 rows for each of your 300 to 3,000 
new/changed rows. Let me do the math for you: on 30,000 rows with 1% 
changes, a naive scan per changed row means 300 × 30,000 = 9,000,000 
iterations.
You need to create an index on your data, so you can update it easily with 
just one pass over the new data! If you have split your additions and your 
changes apart, this is even easier to do.
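Roughly like this; again just a sketch, assuming every row has an id, and 
the dataService / applyChanges names are made up.

  // Keeps the rows plus an index (id -> position), so applying a batch of
  // changes is one pass over the changed rows, not over all 30,000 rows.
  angular.module('myApp').factory('dataService', function () {
    var rows = [];   // the full data set, bind your view to this
    var byId = {};   // index: row id -> position in rows

    function load(initialRows) {
      rows = initialRows;
      byId = {};
      for (var i = 0; i < rows.length; i++) {
        byId[rows[i].id] = i;
      }
    }

    function applyChanges(changedRows) {
      for (var i = 0; i < changedRows.length; i++) {
        var row = changedRows[i];
        var pos = byId[row.id];
        if (pos === undefined) {   // addition
          byId[row.id] = rows.length;
          rows.push(row);
        } else {                   // change: update in place
          rows[pos] = row;
        }
      }
    }

    return { load: load, applyChanges: applyChanges, getRows: function () { return rows; } };
  });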

Angular will hand you some tools to help with this, but a large part of 
your problem isn't Angular-specific. Just some good old programming skills 
and a bit of knowledge of database handling are needed!

Regards
Sander




