On Thu, Jan 28, 2010 at 5:32 PM, Ashley Sheridan

> You could page through the data and make it look like it's happening all in
> the browser with a bit of clever ajax

Ok, good point.

Maybe JSON transport plus JavaScript parsing just has its limit at just over
100 MB.

Accepting the fact that it would make the server-side code
much more complex and thus harder to port,
the data would IMO need to be shoved into a db, because I doubt the
flat-file storage of $_SESSION would work well.

Anyone know of a piece of code that can pump a 1000-level-deep
array of 2 gigabytes into a MySQL db and
retrieve subtrees (startID, depth, filters?) efficiently?
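One common approach is to flatten the tree into an adjacency list (each row stores its parent's id) and fetch subtrees with a recursive CTE. A minimal sketch below, using Python with SQLite purely so the demo is self-contained; the table layout and function names (`nodes`, `insert_tree`, `subtree`) are made up for illustration, and MySQL 8.0+ supports the same `WITH RECURSIVE` syntax:

```python
import sqlite3

# Hypothetical schema: one row per node, parent_id links to the parent.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE nodes (
    id INTEGER PRIMARY KEY,
    parent_id INTEGER REFERENCES nodes(id),
    value TEXT)""")

def insert_tree(tree, parent_id=None):
    """Recursively flatten {'value': ..., 'children': [...]} into rows.
    Note: for a 1000-level-deep array you would want an iterative
    version (explicit stack) to avoid Python's recursion limit."""
    node_id = db.execute(
        "INSERT INTO nodes (parent_id, value) VALUES (?, ?)",
        (parent_id, tree["value"])).lastrowid
    for child in tree.get("children", []):
        insert_tree(child, node_id)
    return node_id

def subtree(start_id, depth):
    """Fetch the subtree under start_id, limited to `depth` levels
    (the startID/depth part of the question; filters would become
    extra WHERE conditions on the outer SELECT)."""
    return db.execute("""
        WITH RECURSIVE sub(id, parent_id, value, lvl) AS (
            SELECT id, parent_id, value, 0 FROM nodes WHERE id = ?
            UNION ALL
            SELECT n.id, n.parent_id, n.value, s.lvl + 1
            FROM nodes n JOIN sub s ON n.parent_id = s.id
            WHERE s.lvl < ?)
        SELECT id, parent_id, value, lvl FROM sub""",
        (start_id, depth)).fetchall()

root = insert_tree({"value": "root", "children": [
    {"value": "a", "children": [{"value": "a1"}]},
    {"value": "b"}]})
print(subtree(root, 1))  # root plus its direct children only
```

For a 2 GB payload the bulk insert would need batching (multi-row INSERTs inside a transaction), but the retrieval side stays a single query per subtree.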
