On Thu, 7 May 2015, jb wrote:
I am working on a multi-location jitter test (sorry, PDV!) and it is
showing a lot of promise. For the purposes of reporting jitter, what
kind of time measurement horizon is acceptable, and what is the +/-
output actually based on, statistically?
For example, is one minute or more of jitter measurements, with the +/-
being the 2nd standard deviation, reasonable, or is there some generally
accepted definition?
ping reports an "mdev" which is
SQRT(SUM(RTT*RTT) / N - (SUM(RTT)/N)^2)
but I've seen jitter defined as the maximum and minimum RTT around the
average; however, that seems heavily influenced by a single outlier measurement.
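(For reference, the mdev formula quoted above is the population standard
deviation of the RTT samples, computed as sqrt(mean of squares minus
square of the mean). A minimal sketch in Python, assuming RTTs are given
in milliseconds; the function name is mine, not ping's:)

```python
import math

def mdev(rtts):
    """ping-style mdev: sqrt(SUM(RTT*RTT)/N - (SUM(RTT)/N)^2),
    i.e. the population standard deviation of the RTT samples."""
    n = len(rtts)
    mean = sum(rtts) / n
    mean_sq = sum(r * r for r in rtts) / n
    return math.sqrt(mean_sq - mean * mean)

# A single outlier inflates this figure noticeably:
# compare mdev([10.0, 11.0, 12.0]) against mdev([10.0, 11.0, 12.0, 60.0]).
```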
There is no single PDV definition; all of the ones you listed are
perfectly valid.
If you send one packet every 20 ms (simulating a G.711 VoIP call with fairly
common characteristics) at the same time as you send other traffic, and
then present the max, 99th percentile, 95th percentile and average PDV, I
think all of those values are valuable. For a novice user, I would probably
choose the 99th and/or 95th percentile PDV value from baseline.
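(A sketch of the percentile-from-baseline approach described above, in
Python. Here "baseline" is taken to be the minimum observed delay and the
percentile uses the nearest-rank method; both are my assumptions, not a
standardized PDV definition:)

```python
import math

def pdv_stats(delays, percentiles=(95, 99)):
    """PDV relative to baseline: subtract the minimum observed delay,
    then report max, average, and the requested percentiles
    (nearest-rank method) of the resulting variation, in ms."""
    baseline = min(delays)                      # assumed baseline: best-case delay
    pdv = sorted(d - baseline for d in delays)  # per-packet variation from baseline
    n = len(pdv)
    stats = {"max": pdv[-1], "avg": sum(pdv) / n}
    for p in percentiles:
        rank = max(1, math.ceil(p / 100 * n))   # 1-indexed nearest rank
        stats[p] = pdv[rank - 1]
    return stats
```

With one 20 ms sample sent every 20 ms, a minute of measurement yields
about 3000 samples, so the 99th percentile still rests on roughly 30
samples rather than a single outlier.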
--
Mikael Abrahamsson email: [email protected]
_______________________________________________
Bloat mailing list
[email protected]
https://lists.bufferbloat.net/listinfo/bloat