I was playing with an Agilent 53132 counter, and noticed that
it measures "standard deviation" but doesn't seem to offer
what everyone really wants, i.e., Allan deviation.  According
to the textbooks, standard deviation won't work for oscillators
because the mean is not fixed and the deviation goes to infinity.

Correct. But the key here is "to infinity". If all you want to
analyze is tens or hundreds of points, and if the oscillator
doesn't drift much during that time span, then the standard
deviation gives you most of what you need to know.

I think this works because in this sampling region the noise is
mostly white, so the textbook fear about non-convergent
statistics doesn't apply.
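As a sketch of the point above (my own illustration, not from the post): if the readings really are dominated by white frequency noise over the span, the plain sample standard deviation converges and gives a sensible number. The noise level (2e-11 RMS) and N=100 are taken from the experiment described below; the data here are simulated.

```python
import random
import statistics

random.seed(1)

# Simulate 100 one-second fractional-frequency readings that are pure
# white noise with an RMS of about 2e-11 (the level reported below).
y = [random.gauss(0.0, 2e-11) for _ in range(100)]

# Classical sample standard deviation (N-1 in the denominator).  For
# white-noise-dominated data this is a well-behaved, convergent statistic.
sdev = statistics.stdev(y)
print(f"standard deviation ~ {sdev:.2e}")
```

With drift (a non-stationary mean) added to y, this number would grow with the length of the record, which is exactly the textbook objection.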

However, I tried it anyway on a high-quality oscillator for
100 measurements of one second each (N=100) and it seemed to
basically work, giving 2E-11 for the deviation.  The drift
over 100 seconds may be small enough that the mean didn't
move significantly.  I have a 53230 on order that does
actually measure Allan deviation, but am trying to get some
work done in the meantime with what I currently have.
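For comparison, here is a minimal sketch (my own, assuming evenly spaced fractional-frequency readings at tau = 1 s) of the non-overlapping Allan deviation next to the classical standard deviation. For white FM noise the two should agree closely, which is why the 2e-11 result above "basically works":

```python
import math
import random

def adev(y):
    """Non-overlapping Allan deviation at tau = tau0 from a list of
    fractional-frequency readings y, one reading per tau0 seconds:
    sqrt( 0.5 * mean( (y[i+1] - y[i])^2 ) )."""
    diffs = [(y[i + 1] - y[i]) ** 2 for i in range(len(y) - 1)]
    return math.sqrt(0.5 * sum(diffs) / len(diffs))

def stdev(y):
    """Classical sample standard deviation (N-1 denominator)."""
    m = sum(y) / len(y)
    return math.sqrt(sum((v - m) ** 2 for v in y) / (len(y) - 1))

random.seed(2)
# Simulated white-FM data at the ~2e-11 level, N=100 as in the experiment.
y = [random.gauss(0.0, 2e-11) for _ in range(100)]
print(f"stdev = {stdev(y):.2e}, adev(1 s) = {adev(y):.2e}")
```

Add a linear drift term to y and the standard deviation inflates while the Allan deviation stays put; that is the practical difference being asked about.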

Note that the 53132A will bring its own noise into the equation,
regardless of how good your two (DUT and REF) oscillators
are. Looking at a 53132A of mine I get an ADEV limit of about
2.8e-10 / tau in time interval mode.
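A quick back-of-the-envelope on that floor (a sketch, using only the 2.8e-10 figure quoted above for one particular 53132A, which is an observation, not a published spec): a counter's single-shot timing resolution contributes a white-PM floor that averages down as 1/tau, so the limit only bites at short tau.

```python
def counter_adev_floor(tau, k=2.8e-10):
    """ADEV contribution of the counter itself at averaging time tau,
    modeled as k/tau (white-PM resolution limit); k is the quoted
    2.8e-10 at tau = 1 s for one sample 53132A."""
    return k / tau

for tau in (1, 10, 100):
    print(f"tau = {tau:4d} s  ->  counter floor ~ {counter_adev_floor(tau):.1e}")
```

So at tau = 1 s the 2e-11 measured above is comfortably above a 2.8e-10 floor only if the measurement was not made in time-interval mode, or the floor of that particular unit is lower; either way, the counter's own contribution has to be checked before trusting results near it.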

Can anyone comment on the relationship between the two
types of measurements in the lab?  (We know how they
differ mathematically, but what is the practical implication?)

Rick

/tvb


_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.