Bas Laarhoven wrote:
> A question for a guru who has in depth knowledge of the HAL scheduling 
> mechanism:
> 
> I'm developing a driver for my servo motor controller. The interface 
> is via a PC parallel port. Only one RT thread is used; it processes the 
> position feedback from the controller and calculates the speed signal 
> for the controller. The thread runs every 1 ms and also contains the 
> PID and motion controllers.
> 
> I've written the HAL file so that all components are listed in the 
> order the data flows through them, expecting this to give the best 
> results. Thus the thread starts with a call to the driver for feedback 
> values and ends with another call setting the output. Due to some 
> interference, the actual duration of the input and output calls varies 
> a lot (e.g. +100 us). If my assumptions are correct, this generates 
> variation (jitter) in the exact moment that the motion controller gets 
> called, and this may be undesired behavior.
> 
> Assumption 1. The components get processed on the RT thread in the 
> order they are added to the thread (I think I read that somewhere and 
> it seems logical).
> Assumption 2. The thread is scheduled at a fixed interval, so the 
> first component starts with the least jitter relative to the thread 
> frequency.
> Assumption 3. Once one component has been processed, the processing of 
> the next starts immediately, and so on, until all components are done.
> 

All assumptions are correct.
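For reference, the execution order is simply the order of the addf commands in the HAL file. A minimal sketch of the pattern Bas describes (the driver function names here are hypothetical; pid.0.do-pid-calcs is the standard HAL pid function):

```
# Functions run each period in the order they were addf'd to the thread:
addf my-driver.read       servo-thread   # capture feedback first
addf pid.0.do-pid-calcs   servo-thread   # PID uses the fresh feedback
addf my-driver.write      servo-thread   # send the new command last
```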

> If all these assumptions are correct, the moment that the motion 
> controller gets called will thus vary, depending on variations in 
> processing time of the individual components. As mentioned above, in my 
> situation there is a huge variation in duration of the driver call and 
> I'm wondering if this is either bad, very bad, or not bad at all 
> (because it's somehow compensated for).

The timing of the motion controller call doesn't matter; what matters
is when the driver reads the feedback and when the driver sends the
new command.  The former is easy - as you pointed out, the driver read
is the first thing in the thread.

The motion controller is using data provided by the driver, so its
perception of the "state of the world" is the state that existed when
the driver captured its inputs.  It doesn't matter when the motion
controller actually runs.

However, the end result of the controller and the PID is a command to
the hardware - the controller and PID are assuming that the output will
be applied "immediately" and be in effect for a full servo period before
the next run of the thread.  That is never actually true - there is
always some delay, and there is always some jitter in that delay.

In most cases, that delay (and jitter) has no meaningful effect on the
machine, especially if you are doing "normal" machining.  If you are
working on the bleeding edge of servo performance (very high speeds,
very high accuracy, very fast acceleration, etc) you might see the effects.
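A rough back-of-the-envelope check (the numbers are assumptions, not measurements): when the new command is applied late, the old command stays in effect for the extra time, so the position error is roughly the velocity *change* per servo period times the jitter:

```python
# Rough estimate of position error caused by output jitter.
# The stale command persists for the jitter interval, so the error
# is (velocity change per period) * (jitter).

accel = 1.0       # m/s^2 - assumed acceleration
period = 0.001    # s     - 1 ms servo period
jitter = 100e-6   # s     - 100 us output jitter, as in Bas's measurement

dv = accel * period        # velocity change per period: 0.001 m/s
pos_error = dv * jitter    # extra distance from the stale command

print(pos_error)           # 1e-07 m, i.e. 0.1 um
```

Even at ten times that acceleration the per-period error is only 1 um, which is why the delay is invisible in normal machining.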

Regards,

John Kasunich


> 
> Do I need to work on the driver, so that it uses a fixed period of time 
> (probably increasing the CPU load), or is this not an issue?
> 
> -- Bas
> 
> 
> -------------------------------------------------------------------------
> This SF.net email is sponsored by: Microsoft
> Defy all challenges. Microsoft(R) Visual Studio 2008.
> http://clk.atdmt.com/MRT/go/vse0120000070mrt/direct/01/
> _______________________________________________
> Emc-developers mailing list
> [email protected]
> https://lists.sourceforge.net/lists/listinfo/emc-developers

