On 4/20/07, Steve McKown <[EMAIL PROTECTED]> wrote:

Hi Phil,

On Thursday 19 April 2007 10:24, Philip Levis wrote:
> On Apr 19, 2007, at 7:37 AM, Steve McKown wrote:
> > H
> >
> > In case you're wondering why it's set up this way: converting
> > between powers of 2 needs only fast bit shifts.  Far more efficient
> > than, say, dividing by 1000.
>
> Actually, the reason has more to do with error and the effort/cost it
> would take to handle it. If you want 1kHz rather than 1.024kHz units,
> then you need to convert.  Here's a modified version of a post I sent
> to the EmStar list a while back, when someone suggested just doing
> this ((timer * 1024) / 1000):
>
> The issue is whether you want precise 1024Hz timers or imprecise
> 1000Hz timers. You can get reasonably precise 1000Hz timers, but only
> if you are willing to do a bit of bookkeeping to keep track of
> partial ticks.
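
For concreteness, here's a minimal sketch in plain C of the partial-tick
bookkeeping being described (the function and variable names are made up for
illustration, not anything from TinyOS or EmStar):

#include <stdint.h>

static uint32_t frac_1024ths;   /* fractional true-ms carried between calls */

/* Convert a delta of 1024Hz "binary ms" ticks into whole true (1000Hz)
 * milliseconds, keeping the remainder so rounding error never accumulates. */
uint32_t binary_ticks_to_true_ms(uint32_t delta_ticks)
{
    /* 1 tick = 1000/1024 true ms, so accumulate in units of 1/1024 ms. */
    uint64_t acc = (uint64_t)frac_1024ths + (uint64_t)delta_ticks * 1000;
    frac_1024ths = (uint32_t)(acc & 0x3FF);   /* acc % 1024 */
    return (uint32_t)(acc >> 10);             /* acc / 1024 */
}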

This assumes a certain source clock frequency, right?  In one project, we
used a true 8MHz (8*10^6, not 8*2^20) crystal to get true microsecond
granularity out of Timer A (SMCLK/8).  The problem is if you want true ms:
true ms = true us / 1000, which is a far more expensive operation than
pseudo ms = pseudo us >> 10.  This is why I assume (and there's my problem
of course...) that the original design decided to use 2^10Hz for
"milliseconds" and 2^20Hz for "microseconds" and then selected the clock
sources accordingly.

Are we talking from different sides of the same point, or am I still
missing it (the point)?

All the best,
Steve


It sounds like you're both talking about the same point.  Though, to be
precise, according to the TEP the Timer<TMilli> interface does refer to true
milliseconds, only with potentially large, unspecified error bars.

And as you note, in practice most TinyOS platforms to date measure time more
conveniently with 1/32768-second precision, from which 1/1024 of a second
derives easily.  So the keen TinyOS user will assume 1024 ticks per second
when using the millisecond timer interface -- though that is really slightly
platform-dependent code, because it is in no way guaranteed to hold on every
platform.  Beyond that, there are many other sources of variation --
temperature, voltage, component tolerances, etc. -- that make a clock deviate
from a precise 1024 or 1000 ticks per second.  So the expert TinyOS user who
*really* cares about true milliseconds will use some form of time
synchronization against a more reliable, external clock source.
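
As a concrete illustration of why 1/1024 of a second falls out so easily
(plain C sketch, not taken from any TinyOS source): 32768 / 32 = 1024, so a
binary-millisecond count is just the 32kHz counter shifted right by 5.

#include <stdint.h>

/* Convert a 32768Hz counter value into 1024Hz "binary millisecond" ticks. */
uint32_t ticks_32khz_to_binary_ms(uint32_t ticks_32khz)
{
    return ticks_32khz >> 5;   /* divide by 32 */
}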

Cory
_______________________________________________
Tinyos-help mailing list
[email protected]
https://mail.millennium.berkeley.edu/cgi-bin/mailman/listinfo/tinyos-help
