I keep seeing claims that the Precision Time Protocol (IEEE 1588-2008)
can achieve sub-microsecond to nanosecond-level synchronization over
Ethernet (given the right hardware, to be sure).
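As I understand it, those numbers rest on the usual four-timestamp
exchange, with everything hinging on where t1..t4 are captured.  A
back-of-envelope sketch of the arithmetic (illustrative C, made-up
timestamp values, symmetric path assumed):

  #include <stdio.h>
  #include <stdint.h>

  /* PTP Sync / Delay_Req exchange, all times in nanoseconds:
   *   t1: master sends Sync          (master clock)
   *   t2: slave receives Sync        (slave clock)
   *   t3: slave sends Delay_Req      (slave clock)
   *   t4: master receives Delay_Req  (master clock)  */
  int main(void)
  {
      /* made-up numbers: slave 250 ns ahead, 500 ns one-way path */
      int64_t t1 = 1000, t2 = 1750, t3 = 2000, t4 = 2250;

      int64_t offset = ((t2 - t1) - (t4 - t3)) / 2;  /* slave minus master */
      int64_t delay  = ((t2 - t1) + (t4 - t3)) / 2;  /* one-way, if symmetric */

      printf("offset = %lld ns, delay = %lld ns\n",
             (long long)offset, (long long)delay);
      return 0;
  }

Note the symmetry assumption: any path asymmetry lands in the computed
offset at half its value, and the protocol cannot see it, which already
makes me doubt blanket nanosecond claims over real networks.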

I've been reading IEEE 1588-2008, and the standard does speak of one
nanosecond, but that's just the standard: an aspirational paper
specification is not practical hardware running in a realistic system.
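On the practical-hardware side, at least on Linux, the question comes
down to whether the NIC can timestamp packets itself; the socket-level
hook is SO_TIMESTAMPING.  A minimal sketch (the NIC and driver must
also be put into hardware-timestamping mode, e.g. via the SIOCSHWTSTAMP
ioctl, which I've left out):

  #include <stdio.h>
  #include <sys/socket.h>
  #include <linux/net_tstamp.h>   /* SOF_TIMESTAMPING_* flags */

  #ifndef SO_TIMESTAMPING
  #define SO_TIMESTAMPING 37      /* from asm/socket.h, for older libcs */
  #endif

  int main(void)
  {
      int fd = socket(AF_INET, SOCK_DGRAM, 0);
      if (fd < 0) { perror("socket"); return 1; }

      /* Ask for NIC hardware RX/TX timestamps, reported raw from the
       * NIC clock, rather than kernel software stamps. */
      int flags = SOF_TIMESTAMPING_TX_HARDWARE |
                  SOF_TIMESTAMPING_RX_HARDWARE |
                  SOF_TIMESTAMPING_RAW_HARDWARE;
      if (setsockopt(fd, SOL_SOCKET, SO_TIMESTAMPING,
                     &flags, sizeof(flags)) < 0)
          perror("SO_TIMESTAMPING");  /* driver lacks HW timestamping */
      else
          printf("hardware timestamping requested\n");
      return 0;
  }

Without that NIC support you fall back to kernel (or worse, user-space)
timestamps, and the nanosecond talk is off the table.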

I've seen some papers reporting average sync errors in the tens to
hundreds of nanoseconds, but over datasets of perhaps 100 points, and
even then with many outliers.

I'm getting questions about PTP from hopeful system designers.  Their
systems already run NTP and achieve millisecond-level sync errors.


Anyway, how much truth is there to all this?  Are there any papers I
should read?


Thanks,

Joe Gwinn

