I'm using Warp 10 to store data. It's at per-second resolution, but to keep 
the data volume sensible it's stored by exception - meaning that a recording 
is only made when the value changes.

The trouble is that I'm then running into issues when retrieving that data, 
because of the way various WarpScript functions work.

I know I can use the map version of FETCH to include pre- and post-boundary 
datapoints (thank goodness!), which lets me get the data I need, but I'm 
running into problems when it comes time to BUCKETIZE.
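
For context, the fetch itself looks roughly like this (the token variable, 
class name and time range are placeholders, and 'boundary.pre' / 
'boundary.post' are the map keys as I understand them from the docs):

{
  'token' $readToken               // placeholder token variable
  'class' 'sensor.value'           // placeholder class name
  'labels' {}
  'start' '2024-01-01T01:00:00Z'   // placeholder time range
  'end'   '2024-01-01T03:00:00Z'
  'boundary.pre' 1                 // one extra datapoint before 'start'
  'boundary.post' 1                // one extra datapoint after 'end'
}
FETCH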

When I BUCKETIZE my sparse data, I get results that don't represent the 
real-world conditions unless I'm extraordinarily lucky with my timing. Take 
the following data...

1:00  20
1:20  20
1:21  30
2:30  30
2:31  20
2:40  20

...bucketized into 5-minute mean buckets: more often than not I'll end up 
with this...

1:00  20
1:25  25
2:35  25
2:40  20

Because there are no additional data points in the hour-long gap, the 
bucketized output is very misleading.
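
For reference, the bucketize step I'm running is just the standard mean 
bucketizer, roughly this (give or take the exact span and alignment):

// $gts holds the sparse, change-only series returned by FETCH
[ $gts              // GTS (or list of GTS) to bucketize
  bucketizer.mean   // mean of whatever values fall in each bucket
  0                 // lastbucket: 0 = derive from the data
  5 m               // bucketspan: 5 minutes
  0 ]               // bucketcount: 0 = as many buckets as needed
BUCKETIZE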

Is there any function or combination of functions - something like a 
FILLNEXT - that I can use before I bucketize my GTS? Or is there a variant of 
INTERPOLATE that will interpolate between the first and last values of the 
GTS in my buckets, instead of between the calculated bucket values (i.e. 
instead of interpolating from the '25' to the '25' in the example above, it 
would take the '30' from the 1:21 sample in the 1:25 bucket and interpolate 
it out to the '30' of the first sample in the 2:35 bucket)?

That last option would be the absolute ideal - it would amount to a much 
more accurate variant of BUCKETIZE for sparse data.
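
To make the question concrete, the only workaround I can picture - assuming 
FILLPREVIOUS does what I think it does, i.e. fills empty buckets with the 
last recorded value - is the two-pass sketch below, but at 1-second 
resolution over long ranges that seems very expensive, which is why I'm 
hoping there's a better way:

// 1. Bucketize down to the raw 1 s resolution, keeping the last value per bucket
[ $gts bucketizer.last 0 1 s 0 ] BUCKETIZE
// 2. Carry the last recorded value forward into every empty 1 s bucket
FILLPREVIOUS
// 3. Re-bucketize the now-dense series into 5-minute means
[ SWAP bucketizer.mean 0 5 m 0 ] BUCKETIZE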
