On Sunday, 29 November 2015 at 06:18:13 UTC, Jonny wrote:
Um, come on. You sit here and preach that I don't know what I'm talking about, yet you can't even get the first sentence right?

Jitter is the standard deviation of the timings. Do you know what standard deviation is? It is the square root of the mean of the squared deviations from the mean.
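For instance, a minimal sketch in Python (the timestamps are made-up values, purely to illustrate the computation):

import statistics

# Hypothetical packet arrival timestamps in milliseconds (made-up values).
arrivals = [0.0, 10.2, 19.8, 30.5, 39.9, 50.1]

# Inter-arrival intervals between consecutive arrivals.
intervals = [b - a for a, b in zip(arrivals, arrivals[1:])]

# Jitter as the (population) standard deviation of the intervals:
# the square root of the mean of the squared deviations from the mean.
jitter = statistics.pstdev(intervals)

print(f"mean interval: {statistics.mean(intervals):.3f} ms, jitter: {jitter:.3f} ms")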

Jitter is any deviation in, or displacement of, the signal pulses in a high-frequency digital signal. The deviation can be in terms of amplitude, phase timing or the width of the signal pulse.

The units of jitter measurement are picoseconds peak-to-peak (ps p-p), RMS, and percent of the unit interval (UI).
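To make those units concrete, here is a rough sketch (Python again, with made-up edge-timing errors; the 100 ps unit interval is an assumption, corresponding to a 10 Gb/s signal):

# Hypothetical timing errors of signal edges relative to their ideal
# positions, in picoseconds (made-up values).
errors_ps = [-3.1, 1.8, 0.4, -2.2, 2.9, -0.7]

ui_ps = 100.0  # assumed unit interval: 100 ps, i.e. a 10 Gb/s signal

# Peak-to-peak jitter: spread between the extreme deviations.
p2p = max(errors_ps) - min(errors_ps)

# RMS jitter: square root of the mean of the squared errors.
rms = (sum(e * e for e in errors_ps) / len(errors_ps)) ** 0.5

print(f"{p2p:.2f} ps p-p, {rms:.2f} ps RMS, {p2p / ui_ps:.2%} of UI (p-p)")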

See Google.
