Hello,

My use case is the following:
* Every two minutes, I have 10 points whose value is the network latency in 
ms (for example, one point at 00:00, another at 00:13, another at 00:15, and 
so on)
* Currently, I aggregate them over 5 minutes with a continuous query, keeping 
three values per point: the min, mean, and max
* My goal is to get the jitter, i.e. the mean of the absolute differences 
between successive values
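To make the definition concrete, here is a small Python sketch of the jitter computation as I mean it (the sample values below are hypothetical, not from my data):

```python
# Hypothetical latency samples in ms (illustrative values only).
latencies = [12.0, 15.5, 11.2, 18.3, 14.1]

# Jitter as defined above: the mean of the absolute differences
# between successive values.
diffs = [abs(b - a) for a, b in zip(latencies, latencies[1:])]
jitter = sum(diffs) / len(diffs)

print(jitter)  # 4.775 for the sample values above
```

This is what I would like the continuous query to produce over each 5-minute window of raw points.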

Can you confirm whether this is possible, and if so, how?

* If I use the non-negative derivative transformation and multiply it by 300, 
I get the jitter between either the mean, the max, or the min of my values 
aggregated over 5 minutes, which gives a value far below the "real" jitter
* I tried to use the non-negative derivative in continuous queries, but of 
course I can't do that without first aggregating my data.

I do not see another way to do it, but maybe you have an idea.

Thank you,
Regards,
Grégoire


--- 
You received this message because you are subscribed to the Google Groups 
"InfluxData" group.