I've been trying to figure out how to deal with the following (elementary) 
problem:

   1. I have a time series X(t) whose values are monotonically increasing 
   (jiffies, actually).
   2. Consider the differenced series Y(t) = (X(t) - X(t-a))/a, i.e. the 
   average rate of increase over the last a seconds.
   3. I want to be able to compute the mean of Y(t) over a specified time 
   period.
   4. However, SELECT mean(value) FROM X WHERE time > now() - 10m fails with 
   the error "expected field argument in mean".
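
Concretely, this is what I have been trying (assuming a measurement named X 
with a field named "value" -- that field-name assumption may well be where I 
am going wrong):

   -- fails with "expected field argument in mean"
   SELECT mean(value) FROM X WHERE time > now() - 10m

   -- what I am after conceptually: the mean rate over the window.
   -- I suspect derivative() is involved, but I am not sure this is
   -- the right form:
   SELECT mean(rate) FROM
       (SELECT derivative(value, 1s) AS rate
        FROM X WHERE time > now() - 10m)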

What is the appropriate way to compute this?

thanks

VK
