Hi everyone,

I think everyone agrees that when a relative timer is set, it is expected to wait *at least* the amount of time specified as the interval to timer_set().
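To make the "at least" guarantee concrete, here is a minimal sketch of the target computation under that rule. The names (now(), timer_target(), current_ticks) are hypothetical placeholders for illustration, not the actual RIOT API:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical free-running tick counter, truncated to whole ticks. */
uint32_t current_ticks;

uint32_t now(void)
{
    return current_ticks;
}

/* Absolute trigger tick for a relative timer.
 * now() truncates, so the real time is anywhere in [now(), now()+1).
 * Adding one extra tick therefore guarantees the timer waits at
 * least `interval` full ticks before triggering. */
uint32_t timer_target(uint32_t interval)
{
    return now() + interval + 1;
}
```

With current_ticks == 0 and interval == 1, this yields a target of 2 ticks, matching the argument below.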
That means if now() is e.g. 0 ticks and the timer is set to wait 1 tick, the timer must trigger at 2 ticks: even if now()==0, the actual ("real") time is anywhere between ticks 0 and 1. If ticks were 1ms each and now_us() is actually 999 (one us before now()==1), triggering at now()==1 would mean triggering only 1us later, not 1ms. Thus it needs to trigger at 2 ticks (2ms) in order to wait "at least 1ms". Correct so far?

Now for the case where a 1ms timer uses a 1us timer as backend. If a 1ms timer is set at now_ms()==0 and now_us()==500 to trigger after 1ms, it would be possible to trigger the timer at now_ms()==1 and now_us()==1500, i.e. at a "half tick" of now_ms(). Should that be done, or should the conversion be implemented so that the 1ms timer behaves the same regardless of whether the lower-level timer has 1ms or 1us precision?

Kaspar

_______________________________________________
devel mailing list
devel@riot-os.org
https://lists.riot-os.org/mailman/listinfo/devel
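For reference, the two conversion strategies the question contrasts can be sketched as follows. These are illustrative helpers with made-up names, not the actual RIOT implementation; they assume a 1us backend counter and compute the absolute trigger time in us:

```c
#include <assert.h>
#include <stdint.h>

/* Naive conversion: fire exactly interval_ms later in us time.
 * Set at now_us==500 with interval 1ms, this fires at 1500us,
 * i.e. at a "half tick" of the ms clock. */
uint32_t target_us_naive(uint32_t now_us, uint32_t interval_ms)
{
    return now_us + interval_ms * 1000;
}

/* ms-aligned conversion: first round the start time up to the next
 * whole ms boundary, then add the interval. This reproduces the
 * behavior of a native 1ms backend (trigger at whole-ms boundaries,
 * waiting at least the requested interval). */
uint32_t target_us_aligned(uint32_t now_us, uint32_t interval_ms)
{
    uint32_t next_ms_boundary = (now_us / 1000 + 1) * 1000;
    return next_ms_boundary + interval_ms * 1000;
}
```

Set at now_us==500 with a 1ms interval, the naive version fires at 1500us, while the aligned version fires at 2000us, which is the same instant a native 1ms timer following the "+1 tick" rule would pick.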