Bas Laarhoven wrote:
> Ah, thanks John, you made it very clear! So in my situation the input
> sampling moment is fixed (at the start of the thread) and only the
> update of the output varies with the variations in processing time. You
> state this will probably have no noticeable effect, but as this
> variation is a significant part of the PID cycle time, wouldn't it be
> better to reduce the variation (noise) and work with a fixed, although
> increased, delay? Another option is to lower the loop update frequency,
> as that reduces the noise in more than one way. Do you know of a rule of
> thumb for the cycle time in relation to the system response time?

The "pros" usually have a setup where a hardware clock or interrupt timer both samples the encoder position and updates the DAC velocity values at the same instant. The velocity output is thus delayed until the NEXT servo cycle, but the timing has very much reduced jitter.

You might be able to obtain a similar result by holding your velocity update back in the driver, and writing it to the device registers just after reading the encoder data. Then, the computation would follow these actions, and the result would be held until the next servo cycle.
Jon

_______________________________________________
Emc-developers mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/emc-developers
