--------
In message <[email protected]>, Daniel Mendes writes:

>Can someone explain in very simple terms what this graph means?
>
>My current interpretation is as follows:
>
>"For a 100Hz input, if you look to your signal in 0.1s intervals, 
>there's about 1.0e-11 frequency error on average (RMS average?)"

Close: To a first approximation, MDEV (the square root of MVAR) is
the standard deviation of the fractional frequency, as a function of
the time interval you measure the frequency over.
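To make that concrete, here is a minimal sketch of how MVAR (and its
square root, MDEV) can be computed from equally spaced phase samples.
This is an illustrative implementation of the standard textbook
formula, not anything from the thread; the names mvar, mdev, x, m and
tau0 are my own choices.

```python
# Sketch: Modified Allan variance from phase samples x[i] (seconds),
# taken every tau0 seconds, at averaging time tau = m * tau0.
# Illustrative only -- names and interface are assumptions.

def mvar(x, m, tau0):
    """Modified Allan variance at tau = m * tau0.

    Uses the standard definition: the inner sum averages m overlapping
    second differences of the phase before squaring.
    """
    N = len(x)
    if N < 3 * m + 1:
        raise ValueError("need at least 3*m + 1 phase samples")
    tau = m * tau0
    total = 0.0
    for j in range(N - 3 * m + 1):
        # Average of m second differences of the phase, starting at j.
        inner = sum(x[i + 2 * m] - 2 * x[i + m] + x[i]
                    for i in range(j, j + m))
        total += inner * inner
    return total / (2.0 * m * m * tau * tau * (N - 3 * m + 1))

def mdev(x, m, tau0):
    """Modified Allan deviation: the square root of MVAR."""
    return mvar(x, m, tau0) ** 0.5
```

A quick sanity check: a purely linear phase ramp (a constant frequency
offset) has zero second differences, so its MVAR and MDEV are exactly
zero -- a frequency offset is stable, even if it is wrong.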

-- 
Poul-Henning Kamp       | UNIX since Zilog Zeus 3.20
[email protected]         | TCP/IP since RFC 956
FreeBSD committer       | BSD since 4.3-tahoe    
Never attribute to malice what can adequately be explained by incompetence.
_______________________________________________
time-nuts mailing list -- [email protected]
