> > Look at the measurement example from V1.1 of RTL. It shows a possible
> > technique for reducing jitter down to a couple of microseconds.
> 
> Not a chance; again, a pure illusion. With that one you can reduce the
> average jitter but not the maximum. In fact, while you are running that
> ... skipped...
> However, if you need no jitter on a PC that has the most powerful
> Pentium CPU, and are willing to lose a lot of performance, an idea
> pointed out to me by Tomasz Motyleski can be the right one: schedule
> the interrupt at a time earlier than the needed one, by the max jitter
> you have measured; then, keeping interrupts disabled, keep reading the
> CPU clock (TSC) till the very instant you are waiting for. In that way
> the jitter is less than 1 us. After Tomasz's suggestion I tried it
> under RTAI; it works nicely but sucks a lot.
> 

If you had actually taken the trouble to look at the code I was
referring to, you would see that it does exactly that.

As for "sucks", I think most people are willing to sacrifice some CPU
time to get stable periodicity.
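
For reference, since the argument hinges on it: here is a minimal
sketch in C of the "interrupt early, then busy-wait on the TSC"
technique. This is only an illustration assuming an x86 CPU with a
usable time-stamp counter; it is not the actual code from the RTL V1.1
measurement example, and the names rdtsc() and spin_until() are made
up here.

#include <stdint.h>

/* Read the CPU time-stamp counter (x86). */
static inline uint64_t rdtsc(void)
{
        uint32_t lo, hi;
        __asm__ __volatile__("rdtsc" : "=a"(lo), "=d"(hi));
        return ((uint64_t)hi << 32) | lo;
}

/*
 * Entered from a timer interrupt programmed to fire earlier than the
 * real deadline by the maximum jitter you have measured. Interrupts
 * stay disabled, so nothing can preempt the spin; we burn CPU until
 * the TSC reaches the exact target instant, then do the real work.
 */
static void spin_until(uint64_t target_tsc)
{
        while (rdtsc() < target_tsc)
                ;       /* busy-wait: the price of sub-microsecond jitter */
}

The CPU time burned per period is bounded by the maximum jitter, which
is exactly the trade-off being discussed above.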

Michael.