Richard,

I didn't mean to point the finger at you in particular; I was just trying to quell the massive misinformation that seems to be flying across these wires. We are in violent agreement.

Dave

Richard B. Gilbert wrote:
David L. Mills wrote:

Richard,

I really and truly do believe that interpreting stratum as quality of service is a bright red herring. It was never intended for that purpose; its primary purpose is avoiding timing loops. There is an absolutely wonderful metric with which to interpret quality: a combination of maximum error (synchronization distance) and estimated error (system jitter). There are explicit provisions in the reference clock interface that allow the driver to adjust these values with respect to whatever statistics the clock provides. The PDOP commonly provided by GPS receiver firmware would seem to be a prime candidate.
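To make the "maximum error" half of that metric concrete: in RFC 5905 terms, the root synchronization distance is rootdelay/2 + rootdisp, and this (not stratum) is what the selection logic compares. A minimal sketch, with illustrative numbers that are not from this thread:

```python
# Sketch of the synchronization-distance metric from RFC 5905:
# root synchronization distance = rootdelay/2 + rootdisp (seconds).
# The server names and figures below are made up for illustration.

def sync_distance(rootdelay: float, rootdisp: float) -> float:
    """Root synchronization distance, in seconds."""
    return rootdelay / 2.0 + rootdisp

# A stratum-1 server reached over a long, congested path...
far_stratum1 = sync_distance(rootdelay=0.120, rootdisp=0.030)  # ~0.090 s

# ...scores worse than a nearby stratum-2 server on a quiet LAN.
near_stratum2 = sync_distance(rootdelay=0.002, rootdisp=0.005)  # ~0.006 s

print(far_stratum1 > near_stratum2)  # True: distance, not stratum, decides
```

This is why stratum alone says little about quality: the distance accumulates delay and dispersion over the whole path back to the reference clock.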

Dave

Richard B. Gilbert wrote:

<snip>


I don't think of stratum as an "estimate" of goodness. I think it's purely a designation of position in the hierarchy. A stratum one server is stratum one because it gets its time from a primary standard; e.g. an atomic clock. A server that gets its time from a WWV receiver is technically stratum one and can be several milliseconds off because of the vagaries of HF radio propagation. The "goodness" of a server also depends on the path through which you receive time from it. A client that is three thousand miles away from a stratum one server and receiving time over a heavily used network is probably getting time that is an order of magnitude poorer than a client three hundred feet away.

Fudging a server to a higher stratum than it would normally have should make it appear less desirable to any client that has a choice of servers.
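That kind of fudging is typically done with the `fudge` directive in ntp.conf. A minimal sketch using the local clock driver (the driver and stratum value here are illustrative, not something this thread prescribes):

```conf
# Illustrative ntp.conf fragment: the local clock driver would otherwise
# advertise a low stratum; fudging it to stratum 10 makes this server a
# last-resort choice for any client that has better sources available.
server 127.127.1.0              # local clock (undisciplined) driver
fudge  127.127.1.0 stratum 10   # advertise a deliberately high stratum
```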


I hope I didn't create the impression that I was arguing the opposite!!

_______________________________________________
questions mailing list
[email protected]
https://lists.ntp.isc.org/mailman/listinfo/questions
