At 01:51 PM 4/18/2011, unruh wrote...
Since you can measure the time to usec, in 1 sec you can measure rate
offsets of 1PPM and offsets of 1usec.

That's the thing. You can't do that.

It's not a matter of time precision on a single machine; it's a matter of comparing times across two or more machines. Network jitter is unpredictable, especially when the hosts NTP is syncing with are remote, so a single reading can't be trusted at the us level (a full-size Ethernet frame at 1 Gbps takes ~12 us on the wire), or often even at the ms level. There can also be jitter within the host you're syncing to (irq latency, etc.). Using a longer time constant averages out the jitter. I'm sure there are papers covering the math behind it somewhere.
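A quick back-of-the-envelope sketch of why averaging helps (this is just a toy simulation, not NTP's actual clock filter; the offset and jitter figures are made-up assumptions):

```python
import random
import statistics

random.seed(42)

TRUE_OFFSET_US = 50.0  # hypothetical true clock offset, in microseconds
JITTER_SD_US = 500.0   # hypothetical network jitter (std dev), in microseconds

def measure_offset():
    # One NTP-style reading: the true offset plus random network jitter.
    return TRUE_OFFSET_US + random.gauss(0.0, JITTER_SD_US)

single = measure_offset()
averaged = statistics.mean(measure_offset() for _ in range(1024))

# Averaging N independent readings shrinks the jitter by ~sqrt(N),
# so 1024 samples cut a 500 us sigma down to roughly 16 us.
print(f"single reading error : {abs(single - TRUE_OFFSET_US):8.1f} us")
print(f"1024-sample avg error: {abs(averaged - TRUE_OFFSET_US):8.1f} us")
```

In practice NTP doesn't do a plain mean, of course, but the same principle is why a longer time constant beats trusting any one packet exchange.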

_______________________________________________
questions mailing list
[email protected]
http://lists.ntp.org/listinfo/questions