I have a large database where a user could select a range of data that 
could return millions of datapoints in X (time). What would be the best 
way to handle this? I'm thinking along the lines of averaging chunks 
down to the number of pixels the chart is wide (i.e., for a chart 1000 
pixels across that is plotting 1 year's worth of data, I'd average the 
data into 1000 points, each covering 1/1000th of a year).
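
Something along these lines is what I have in mind, a rough TypeScript 
sketch of the bucket-averaging idea (the type and function names here 
are just illustrative, not from any particular library):

    // One { time, value } pair per raw sample, assumed sorted by time.
    interface DataPoint {
      time: number;   // e.g. a Unix timestamp in milliseconds
      value: number;
    }

    // Reduce `points` to at most `width` points by averaging every
    // point that falls in each fixed-size time bucket (one bucket
    // per horizontal pixel of the chart).
    function downsampleToWidth(points: DataPoint[], width: number): DataPoint[] {
      if (points.length <= width) return points;

      const start = points[0].time;
      const end = points[points.length - 1].time;
      const bucketSpan = (end - start) / width;

      const result: DataPoint[] = [];
      let i = 0;
      for (let b = 0; b < width; b++) {
        const bucketEnd = start + (b + 1) * bucketSpan;
        let sumTime = 0, sumValue = 0, count = 0;
        // Consume every point in this bucket; the last bucket also
        // absorbs the final point, whose time equals `end`.
        while (i < points.length && (points[i].time < bucketEnd || b === width - 1)) {
          sumTime += points[i].time;
          sumValue += points[i].value;
          count++;
          i++;
        }
        if (count > 0) {
          result.push({ time: sumTime / count, value: sumValue / count });
        }
      }
      return result;
    }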

Any advice?
