Has anyone studied jitter as a function of frequency? I think
Don Gaffney did some serious jitter studies at one time ...
I have a periodic IRQ triggering an RTLinux routine. I am compiling
statistics using one of the DAQ card's onboard 20 MHz counters (taking
some care to compensate for the actual delay incurred by reading the
counters in the first place).
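For context, the per-cycle timestamping looks roughly like the sketch
below (placeholder names only, not the actual module): read_daq_counter()
stands in for the card-specific register reads, record_sample() for the
bookkeeping, and READ_OVERHEAD_TICKS is an assumed value standing in for
the separately measured cost of one counter read.

    /* Sketch of the per-cycle timestamping in the RT handler.
       The 20 MHz time base gives 50 ns ticks. */
    #define READ_OVERHEAD_TICKS 8UL  /* assumed: measured cost of one read */

    extern unsigned long read_daq_counter(void);     /* card-specific register access */
    extern void record_sample(unsigned long ticks);  /* bookkeeping, sketched further down */

    static unsigned long last_count;
    static int have_last;

    void periodic_handler(void)
    {
        /* Read the counter first thing, then back out the measured cost of
           the read itself so the timestamp approximates handler entry. */
        unsigned long now = read_daq_counter() - READ_OVERHEAD_TICKS;

        if (have_last)
            record_sample(now - last_count);   /* ticks since the previous IRQ */
        last_count = now;
        have_last = 1;
    }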
I collect statistics for 1000 cycles (I need to increase this; the
bookkeeping is sketched after the observations below) and have done this
at 1, 2, 3, 5, and 10 kHz. Here are a few observations:
The standard deviation of the jitter does not change much as a function
of frequency (typically 0.5 usec).
However, the worst-case jitter appears to increase with frequency, from
about 1.1-1.3 usec at 1 kHz up to 7 usec at 10 kHz.
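The per-cycle bookkeeping behind those numbers is roughly the following
(a plain-C sketch using the record_sample() placeholder from above,
nothing card-specific): accumulate sum and sum-of-squares, track min and
max, and convert from 50 ns ticks to microseconds at report time.

    #include <math.h>
    #include <stdio.h>

    #define TICK_US 0.05                  /* one 20 MHz counter tick = 50 ns */

    static double sum, sumsq;
    static unsigned long nsamples, max_ticks, min_ticks = ~0UL;

    void record_sample(unsigned long ticks)
    {
        sum   += ticks;
        sumsq += (double)ticks * ticks;
        if (ticks < min_ticks) min_ticks = ticks;
        if (ticks > max_ticks) max_ticks = ticks;
        nsamples++;
    }

    /* After the 1000-cycle run: standard deviation of the intervals and the
       worst-case deviation from the nominal period, both in microseconds. */
    void report(double period_ticks)
    {
        double mean  = sum / nsamples;
        double sd    = sqrt(sumsq / nsamples - mean * mean);
        double above = max_ticks - period_ticks;
        double below = period_ticks - min_ticks;
        double worst = above > below ? above : below;

        printf("n=%lu  sd=%.2f us  worst=%.2f us\n",
               nsamples, sd * TICK_US, worst * TICK_US);
    }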
Why does the worst-case jitter grow with frequency like this? I have
tried to rule out artifacts of the measurement itself (although I won't
be 100% sure until I have access to a decent storage scope in a few
months). Could it have to do with whatever goes on in RTL after the RT
process has completed its cycle? Or could it be a result of the system
getting less "non-RT" time at 10 kHz? (The task takes about 10 us, so in
theory the non-RT work still gets about 90% of the CPU at 10 kHz versus
99% at 1 kHz; the arithmetic is spelled out below. My instinct is that
this should not matter.) Any opinions are appreciated.
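Spelling out the arithmetic behind those percentages (assuming a flat
10 us of RT work per cycle):

    non-RT share = 1 - f * t_task
      at  1 kHz:  1 -  1000/s * 10 us = 1 - 0.01 = 0.99  (99%)
      at 10 kHz:  1 - 10000/s * 10 us = 1 - 0.10 = 0.90  (90%)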
Thanks.
--
Rob Butera, Postdoctoral Fellow http://mrb.niddk.nih.gov/butera/
Laboratory for Neural Control, NINDS
National Institutes of Health, Bethesda, MD USA