At 12:00 PM 3/12/2011, Said Jackson wrote...
The average will approach 0.0 as the number of samples is increased, but the standard deviation will not. The value displayed by their unit is the standard deviation.
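A quick simulation illustrates the point. The numbers below (zero-mean Gaussian jitter, sigma = 1 ns) are hypothetical, not from any particular instrument: the sample mean shrinks toward 0.0 as N grows, while the sample standard deviation settles at the true jitter rather than shrinking.

```python
import random
import statistics

random.seed(1)

sigma_ns = 1.0  # assumed RMS jitter, purely illustrative

for n in (100, 10_000, 1_000_000):
    # zero-mean Gaussian jitter samples
    samples = [random.gauss(0.0, sigma_ns) for _ in range(n)]
    # mean -> 0.0 with more samples; stdev -> sigma_ns, not 0
    print(n, statistics.fmean(samples), statistics.stdev(samples))
```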

If you're measuring jitter of an external signal, accuracy is obviously much worse than the internal jitter alone. But if you're measuring TI, both the instrument and signal jitter get averaged down with more measurements, leaving a more accurate TI, don't they?
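That is how it works for the averaged TI value: the scatter of the averaged estimate shrinks roughly as 1/sqrt(N), even though the jitter itself does not. A sketch, with made-up numbers (true TI of 5 ns, 1 ns of combined instrument-plus-signal jitter, assumed Gaussian):

```python
import random
import statistics

random.seed(2)

true_ti_ns = 5.0   # hypothetical true time interval
jitter_ns = 1.0    # assumed combined instrument + signal jitter (RMS)

def averaged_ti(n):
    # average of n jittery TI readings
    return statistics.fmean(random.gauss(true_ti_ns, jitter_ns) for _ in range(n))

for n in (1, 100, 10_000):
    # repeat the averaged measurement 200 times and look at its scatter;
    # the scatter shrinks roughly as 1/sqrt(n)
    estimates = [averaged_ti(n) for _ in range(200)]
    print(n, statistics.stdev(estimates))
```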

Since it is designed and sold as a TI analyzer, that seems right. It seems the jitter test is mostly to make sure the unit is operating properly.

If I recall correctly, internal jitter is affected by tweaking the 200 MHz multiplier. Lacking a proper spectrum analyzer, that's the only calibration I have been unable to do.

_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.