Hi everyone, I'm new to the list, and have been reading the recent threads on Arduino-based GPSDOs and the pros/cons of 10-kHz vs 1-Hz time pulses with interest.
As I understand it, there are a couple of reasons why one needs a time-interval / phase measurement implemented outside the MCU:

1) Time resolution inside the MCU is limited by its clock period, which is much too coarse; the GPSDO would ping-pong within a huge dead zone.
2) Software tends to inject non-determinism into the timing.

Are there others?

I have no background or experience with PLLs/DLLs, so I'm really just feeling my way blindly here. That said, I find myself wondering about the following. Suppose we count OCXO cycles (at, say, 10 MHz) using one of the MCU's timer/counter peripherals, and periodically sample the counter value with an interrupt triggered on the rising edge of the GPS 1PPS. Assume this interrupt has the highest priority in the system, so that the measurement is fully deterministic, with only the ±1-count ambiguity inherent in counting. Also assume we keep the counter running continuously.

At this point the time measurement is quite crude, with 100 ns resolution. But because the counter runs continuously, the unknown residuals keep accumulating, and we should be able to average out this "quantization noise" in the long run. That is, we can measure any T-second interval to within 100 ns, so the effective resolution on a per-second basis becomes 100 ns / T. Is there any reason why this sort of processing cannot attain performance equivalent to the more conventional analog phase-detection approach?

Thanks,
Mark KJ6PC

_______________________________________________
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
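To make the averaging argument above concrete, here is a small Python sketch (not MCU code; the names F_NOM, TRUE_OFFSET, and counter_at are invented for illustration). It models a free-running counter clocked by an OCXO with an assumed fractional frequency error, samples the integer count at GPS-timed 1PPS epochs, and shows that the frequency estimate over a T-second window is bounded by the 1-count (100 ns) quantization divided by T:

```python
import math

F_NOM = 10_000_000      # nominal OCXO frequency, Hz
TRUE_OFFSET = 3.7e-9    # assumed fractional frequency error (hypothetical)

def counter_at(t):
    """Integer count of OCXO cycles elapsed at GPS time t.

    The floor() models the quantization: the 1PPS edge samples a whole
    number of counts, never a fraction of a cycle.
    """
    return math.floor(F_NOM * (1 + TRUE_OFFSET) * t)

def estimate_offset(T):
    """Fractional frequency offset estimated from counts T seconds apart."""
    cycles = counter_at(T) - counter_at(0)
    return cycles / (F_NOM * T) - 1.0

for T in (1, 10, 100):
    err = abs(estimate_offset(T) - TRUE_OFFSET)
    bound = 1 / (F_NOM * T)   # 1-count ambiguity -> 100 ns / T, fractionally
    print(f"T={T:4d} s  estimate error={err:.2e}  quantization bound={bound:.2e}")
```

With a 1-second window the 100 ns quantization (1e-7 fractional) swamps the assumed 3.7e-9 offset; by T = 100 s the bound has shrunk to 1e-9 and the offset is resolvable, which is the "resolution improves as 100 ns / T" claim in simulation form.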