On Wed, Mar 24, 2021 at 2:53 PM Gregory Nutt <spudan...@gmail.com> wrote:
>
>
> > The way the logic in clock_nanosleep() is written, the minimum delay
> > ends up being 2 such ticks. I don't remember why and I can't seem to
> > find it in the code right now, but I know this because I checked into
> > it recently and found out that that's how it works.
>
> See https://cwiki.apache.org/confluence/display/NUTTX/Short+Time+Delays
>
> This is a translation (a shift of the whole delay distribution).  It
> does not affect the accuracy, it affects the mean delay.  The
> accuracy is still 10 ms.  The quantization error will lie in the
> range of 0 to +10 ms.  If you did not add one tick, the error would
> lie in the range of -10 to 0 ms, which is unacceptable.

Thanks
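
For anyone else following the thread, here is my reading of that wiki
page as a minimal sketch (the function below is mine, not the actual
NuttX source, and it assumes the default 10 ms tick):

    #define USEC_PER_TICK 10000  /* assumed default: 10 ms system tick */

    /* Convert a requested delay in microseconds to system ticks.
     * Round up to cover any partial tick, then add one more: the
     * first tick interrupt can arrive at any point within the
     * current tick period, so a count of N ticks can elapse in as
     * little as (N-1) tick periods.  The extra tick guarantees the
     * actual delay is never shorter than requested, moving the
     * quantization error from [-10 ms, 0] into [0, +10 ms].
     */
    unsigned int usec_to_ticks(unsigned long usec)
    {
      unsigned int ticks = (usec + USEC_PER_TICK - 1) / USEC_PER_TICK;
      return ticks + 1;
    }

That would also explain the 2-tick minimum I mentioned: even a 1 us
request rounds up to 1 tick and then gets the +1, so nothing shorter
than 2 ticks is possible.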

> > It does not make sense to change the tick interval to a higher
> > resolution (a shorter period) because the OS would then spend
> > significantly more time servicing useless timer interrupts.
>
> Unless you use Tickless mode, in which case it is easy to get very
> high resolution (1 µs range) with no CPU overhead.

Is the Tickless mode considered stable enough for production use now?
IIRC it had some caveats when I last looked into it and I haven't had
a chance to study it again.
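
If it is, then for reference (and please correct me if this is stale),
my understanding is that enabling it is mostly a matter of
configuration plus arch-level timer support, along these lines:

    CONFIG_SCHED_TICKLESS=y        # replace the periodic tick with an
                                   # interval timer
    CONFIG_USEC_PER_TICK=1         # clock resolution, e.g. 1 us
    CONFIG_SCHED_TICKLESS_ALARM=y  # optional: use an absolute alarm
                                   # instead of an interval timer, on
                                   # archs that support it

with the board/arch providing up_timer_gettime() and either the
up_timer_start()/up_timer_cancel() pair or the alarm variants.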

Nathan
