On 17 Oct, 2011, at 15:14 , Hal Murray wrote:
> In article <[email protected]>,
>  unruh <[email protected]> writes:
>
>> In fact it is really hard for a computer to keep to 1us accuracy because
>> of the delays in interrupt processing.
>
> It's not the delay that is the problem. It's simple to correct for a
> fixed delay (at least in theory). The problem is variations in the delay.
>
> Jitter can easily be caused by cache faults or jitter in finishing the
> processing of the current instruction, or having interrupts disabled.
> (There are probably other sources.)
I actually would have said just the opposite. Jitter is less of a problem because you can see it. That is, if you take a series of samples you can see the variation in delay reflected in the variations of the offsets you compute, and you have some basis for filtering out the samples which are most severely affected (assuming, for example, that those with less delay, and hence a more positive offset, are "better" than those with more delay). Even if you can't eliminate the effect of jitter entirely, you at least have the data to develop an expectation of the residual error it is causing.

The problem with a constant delay is that it is invisible. You can't see any sign of it in the data, so while it would be easy to correct for it if you knew how big it was, the only way to determine its magnitude is to calibrate it by some entirely independent measurement. In this case it is hard to see how one would do that.

To see why that is the bigger problem, note that the 1 us being talked about here must in fact be a measure of jitter, or delay variation, since that is what ntp can see. Assuming the time error must be on the order of 1 us because the jitter is of that order is, however, unjustified, because that ignores the effect on your time of the constant delay that ntp can't see, and hence can't even estimate. If the constant latency were in fact on the order of 10 us (that may be unlikely, but without actually determining a number by some means other than ntp there is no way to preclude the possibility) then worrying about the 1 us jitter isn't going to improve the accuracy of your time much. The fact that you could correct the constant delay by subtracting a constant doesn't help if you have no idea of that constant's value.

Dennis Ferguson

_______________________________________________
questions mailing list
[email protected]
http://lists.ntp.org/listinfo/questions
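[Editor's note: the argument above can be illustrated with a small simulation. The numbers below (a 10 us fixed interrupt latency, up to 1 us of jitter) are hypothetical values chosen to match the discussion, not measurements; the "keep the sample with the least delay" filter is a simplified stand-in for what ntpd's clock filter does.]

```python
import random

random.seed(1)

TRUE_OFFSET = 0.0          # the clock is actually correct
CONST_LATENCY = 10e-6      # hypothetical fixed interrupt latency: 10 us
JITTER_MAX = 1e-6          # up to 1 us of variable delay on top of it

def take_sample():
    # A late timestamp makes the clock look slow, so each measured
    # offset is pushed negative by the whole processing delay.
    jitter = random.uniform(0.0, JITTER_MAX)
    return TRUE_OFFSET - (CONST_LATENCY + jitter)

samples = [take_sample() for _ in range(64)]

# Filter: keep the sample with the least delay, i.e. the most
# positive offset -- the "better" sample in the text above.
best = max(samples)

spread = max(samples) - min(samples)   # visible: bounded by the jitter
residual = TRUE_OFFSET - best          # invisible: ~ the constant latency

print(f"sample spread (visible jitter):  {spread * 1e6:.2f} us")
print(f"residual error after filtering: {residual * 1e6:.2f} us")
```

The spread of the samples reveals (and bounds) the jitter, so filtering can drive its contribution well below 1 us; but the residual error after filtering is still essentially the full constant latency, and nothing in the samples themselves hints that it is there.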
