Oh, I should have said that I'd do the data reduction on the *server* side, 
so that I only shove 1000 data points up to the client.
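
For what it's worth, here is a rough sketch of the bucket averaging I had in mind, written in TypeScript (the Point shape and the downsample/targetPoints names are just placeholders I made up, not anything from the Visualization API; it assumes the rows are already sorted by time):

interface Point {
  t: number; // timestamp, e.g. ms since epoch
  v: number; // measured value
}

// Average runs of consecutive samples so that at most targetPoints
// points come back -- roughly one per horizontal pixel of the chart.
function downsample(points: Point[], targetPoints = 1000): Point[] {
  if (points.length <= targetPoints) return points;

  const bucketSize = points.length / targetPoints; // > 1 here
  const out: Point[] = [];

  for (let i = 0; i < targetPoints; i++) {
    const start = Math.floor(i * bucketSize);
    const end = Math.min(Math.floor((i + 1) * bucketSize), points.length);

    let sumT = 0;
    let sumV = 0;
    for (let j = start; j < end; j++) {
      sumT += points[j].t;
      sumV += points[j].v;
    }
    const n = end - start;
    // Represent each bucket by its mean time and mean value.
    out.push({ t: sumT / n, v: sumV / n });
  }
  return out;
}

Something like downsample(rows, chartWidthInPixels) on the server, so only the averaged points go over the wire. Depending on the database, you might even be able to push the averaging into the query itself (a GROUP BY on a truncated timestamp) so the millions of raw rows never leave the database at all.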

On Monday, November 27, 2017 at 8:20:10 PM UTC-10, Tom Cumming wrote:
>
> I have a large database where a user could select a range of data that 
> could return millions of datapoints in X (time). What would be the best 
> way to handle this? I'm thinking along the lines of averaging chunks 
> down to the number of pixels the chart is wide (i.e., for a chart 1000 
> pixels across that is plotting 1 year's worth of data, I'd average the 
> data into 1000 data points of 1/1000th of a year each).
>
> Any advice?
>
