> I'll always know what frequency I'm at when I call a delay() function.
>
> Is there a way of doing:
>
>     _delay_us( double __us, uint32_t f_cpu )
>
> and still have the compiler generate code that does not invoke
> floating point at run time?
>
> I could make a delay function for each clock frequency, but that
> seems less than optimal.
You could set F_CPU to a "normalized" value, say 10 MHz, and then scale the value you pass to _delay_*() by the ratio of the current speed to the normalized speed.

Best regards,

Stu Bell
DataPlay (DPHI, Inc.)

_______________________________________________
AVR-GCC-list mailing list
AVR-GCC-list@nongnu.org
http://lists.nongnu.org/mailman/listinfo/avr-gcc-list