Time transfer over USB can be improved by timestamping on both ends, then
using a robust estimator for the clock offset.  For example, imagine the
USB device is a small microcontroller peripheral.  It has a free-running
local timer driven by its local clock.  When a USB interrupt arrives from
the host, the device latches the timer state, and the USB message carries
the host's timestamp.
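A minimal sketch of that exchange (all names here are hypothetical; the
real device side would be an interrupt handler reading a hardware counter,
simulated below with a monotonic clock):

```python
import time

def host_send():
    """Host side: capture a timestamp right before the USB write."""
    return time.monotonic_ns()  # stand-in for the host clock

class Device:
    """Device side: a free-running local timer, latched on USB interrupt."""
    def __init__(self, tick_hz=48_000_000):  # assumed 48 MHz timer clock
        self.tick_hz = tick_hz
        self._t0 = time.monotonic_ns()

    def timer_now(self):
        # Stand-in for reading the free-running hardware counter.
        return (time.monotonic_ns() - self._t0) * self.tick_hz // 1_000_000_000

    def usb_interrupt(self, host_timestamp_ns):
        # Latch the local timer the moment the interrupt fires; the pair
        # (host timestamp, device timer reading) is one clock comparison.
        return (host_timestamp_ns, self.timer_now())

dev = Device()
pair = dev.usb_interrupt(host_send())
```

Each such pair is one point on the clock-comparison plot; collecting many
of them is what makes the robust estimation below possible.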

This is enough information for a single-shot clock comparison.  It may be
contaminated with operating-system latency or any number of other host
latencies (bus, cache, etc.).  But generally, with a lightly loaded host,
the USB transaction completes about as fast as it possibly can.  A plot of
the clock-pair points will show a heavy line of best-case transfers and a
smattering of latency events.  A robust estimator will ignore the chaff.
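One way to "ignore the chaff" follows from latency being one-sided: delays
only ever make the message arrive late, so the smallest apparent offsets
are the best-case transfers, and a low quantile rejects the spikes.  A
synthetic sketch (the offset, base latency, and spike statistics are all
made-up numbers, and in practice the best-case latency remains as a bias
unless you calibrate it out):

```python
import random

random.seed(1)

TRUE_OFFSET = 123_456.0   # ns, device minus host clock (synthetic)
BASE_LATENCY = 30_000.0   # ns, assumed best-case USB transfer time

apparent = []
for _ in range(1000):
    latency = BASE_LATENCY + random.uniform(0, 2_000)   # small jitter
    if random.random() < 0.1:                           # OS/bus latency spike
        latency += random.uniform(50_000, 2_000_000)
    # Each sample is the apparent offset, biased by that transfer's latency.
    apparent.append(TRUE_OFFSET + latency)

# Naive mean: dragged upward by the spikes.
naive = sum(apparent) / len(apparent)

# Robust: take a low quantile (the heavy line of best-case transfers),
# then subtract the assumed-known best-case latency.
apparent.sort()
floor = apparent[len(apparent) // 20]   # 5th percentile
robust = floor - BASE_LATENCY

print(f"naive error:  {naive - BASE_LATENCY - TRUE_OFFSET:9.0f} ns")
print(f"robust error: {robust - TRUE_OFFSET:9.0f} ns")
```

With these numbers the quantile estimate lands within a few hundred
nanoseconds of the true offset while the mean is off by roughly the
average spike contribution, which is the "heavy line vs. chaff" picture
in numerical form.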

So I don't see a problem with submicrosecond time transfer over USB.  (I
tried this some time back as a quick hack, with a Teensy board.)

Cheers,
Peter
_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
