Максим Барбул:

My page is actually a CGI/Perl script that generates the XML from a
MySQL query. However, any code pertaining to the Google API (i.e. any
JavaScript) is emitted from my CGI file via "print qq| <javascript
goes here> |;". How would you recommend I chunk the processing of
such data in that setup?

Ray Cromwell:

How would you suggest using setTimeout() intelligently within the
processing of the XML file? This is my current loader; below it is a
rough sketch of the chunked approach I have in mind.

function loadXML() {
    var request = GXmlHttp.create();
    request.open("GET", fileName, true);
    request.onreadystatechange = function() {
        if (request.readyState == 4) {
            var xml = GXml.parse(request.responseText);
            // These are globals read later by initialize().
            idD   = xml.documentElement.getElementsByTagName("data_id");
            piD   = xml.documentElement.getElementsByTagName("pi");
            ciD   = xml.documentElement.getElementsByTagName("cruise");
            dtD   = xml.documentElement.getElementsByTagName("date_time");
            latD  = xml.documentElement.getElementsByTagName("latitude");
            lonD  = xml.documentElement.getElementsByTagName("longitude");
            depD  = xml.documentElement.getElementsByTagName("depth");
            nameD = xml.documentElement.getElementsByTagName("name");
            valD  = xml.documentElement.getElementsByTagName("value");
            initialize();
        }
    };
    request.send(null);
}
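
Here is that sketch, based on your earlier suggestion about yielding
the CPU between batches. CHUNK_SIZE, buildRow(), and drawTable() are
just placeholders for however my own dataTable code ends up organized,
not part of any API. Is this the right idea?

var CHUNK_SIZE = 500;  // small enough that one chunk finishes well
                       // under the slow-script limit

function processRows(start) {
    // Walk a fixed-size slice of the node lists collected in loadXML().
    var total = idD.length;
    var end = Math.min(start + CHUNK_SIZE, total);
    for (var i = start; i < end; i++) {
        // buildRow() stands in for adding one row to the dataTable.
        buildRow(idD[i], piD[i], ciD[i], dtD[i], latD[i],
                 lonD[i], depD[i], nameD[i], valD[i]);
    }
    if (end < total) {
        // Yield the CPU so the browser stays responsive, then continue.
        setTimeout(function() { processRows(end); }, 0);
    } else {
        drawTable();  // placeholder for whatever finally renders the table
    }
}

Inside onreadystatechange I would then call processRows(0) in place of
initialize().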

Also, can you perhaps point me to some more information about JSON? I
would love to have a much more efficient page, as this is data that I
am trying to make available to the scientific community.
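
For what it's worth, my current understanding of the JSON route is
something like the sketch below: the same response sent as JSON and
read without any DOM parsing. The field names are only my guess at a
layout, and json2.js from json.org is just one parser I have seen
mentioned.

// Hypothetical JSON response from the CGI script (layout is my guess):
// {"rows": [ {"data_id": 1, "pi": "...", "cruise": "...",
//             "date_time": "...", "latitude": 36.5, "longitude": -122.1,
//             "depth": 10, "name": "...", "value": 14.2}, ... ]}

request.onreadystatechange = function() {
    if (request.readyState == 4) {
        // eval() works on trusted output from my own server; a parser
        // such as json2.js (json.org) would be the safer choice.
        var data = eval("(" + request.responseText + ")");
        var rows = data.rows;
        // rows[i].latitude, rows[i].value, etc. could then be fed to
        // the dataTable, chunked with setTimeout() as sketched above.
    }
};

Would that be closer to what you meant?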

I had a chance to look at your product and your screencast. Could you
share a bit more information about it? Feel free to e-mail me.


------------

Thanks to you both for your answers.

Best,

Dan

On Feb 12, 2:38 pm, Ray Cromwell <[email protected]> wrote:
> You can use setTimeout() to process the XML in chunks; however, I
> would recommend using JSON or emitting script from the server, which
> will be much more efficient. Basically, you can avoid the slow-script
> warning by yielding the CPU after processing a certain number of rows
> via setTimeout(). However, you will still face the issue that overall
> page load and startup will be slow.
>
> It also depends on the visualization you plan to use. Most charts
> won't even render more than a few hundred datapoints without choking.
> Chronoscope/Timescope (http://timefire.com) is a highly specialized
> chart specifically engineered to scale to millions of points.
>
> -Ray
>
> On Thu, Feb 12, 2009 at 11:33 AM, Максим Барбул <[email protected]> wrote:
> > Hi, I think you'll have problems displaying such a big data store.
> > Try to split it into pages or throw out some points. You can treat
> > such data with PHP, Perl, C, Java, ... but NOT JavaScript.
>
> > 2009/2/12 p00kie <[email protected]>
>
> >> Hi all,
>
> >> I am looking for a way to load a large xml file into a dataTable.
>
> >> I am using XMLHttpRequest to load the XML file.
>
> >> My table has 8 columns and about 300,000 rows. When I try to load it,
> >> I will usually get a non-responsive script error.
>
> > --
> > Best regards,
> > Максим Барбул