Follow-up Comment #12, bug #30363 (project avr-libc): Bill Perry:

>> Backward compatibility: ...
>> For the short end, there are a few options:
>> 2) Force it to be 1 cycle if it falls to 0

Forcing it to 1 might introduce special handling, which I think is
unnecessary.

>> 3) have another define that sets the MIN cycles so user
>> can create some backward compatibility by defining it to 3 (or 1)
>> if necessary. So if this define is not set, it defaults
>> to allowing 0 (no delay).

Setting MIN cycles does not create complete backward compatibility,
because the two implementations also differ in granularity. If x
represents _tmp, i.e. the requested delay expressed in clock cycles
(the value handed to the delay-cycles builtin), then the new
implementation delays for trunc(x) clock cycles, while the old
implementation delays for trunc(x/4)*4 clock cycles, so the two map
input ranges differently. For example, at a 1 MHz F_CPU (one cycle per
microsecond):

In the older implementation, every request in the range
[0.004, 0.008) ms produces 4 clock cycles of delay.
In the newer implementation, only requests in the range
[0.004, 0.005) ms produce 4 clock cycles of delay.

Introducing a MIN cycles define will not address this issue (a small
numeric sketch is appended below).

>> 4) have a define to return to old behavior.

My concern is that the old behavior is patchy: code that only works by
accident cannot be maintained in the long run, so it is better to make
the change sooner rather than later. The old behavior could still be
brought back behind a macro (#ifdef __OLD_DELAY_CYCLE_) that is
deprecated in a future version; a sketch of that is also appended
below.

Thoughts?
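
To make the granularity difference concrete, here is a small host-side
sketch in plain C (illustration only, not avr-libc code); the helper
names old_cycles/new_cycles, the sample tick values, and the 1 MHz
clock are my assumptions:

/* Illustration only: compares the old and new _delay_ms() cycle
 * counts. x is _tmp, the requested delay expressed in clock cycles;
 * at an assumed F_CPU of 1 MHz, a request of 0.004 ms gives x = 4. */
#include <math.h>
#include <stdio.h>

/* Old implementation: trunc(x/4) iterations of a 4-cycle loop. */
static unsigned long old_cycles(double x)
{
    return (unsigned long)trunc(x / 4.0) * 4;
}

/* New implementation: trunc(x) cycles via the delay-cycles builtin. */
static unsigned long new_cycles(double x)
{
    return (unsigned long)trunc(x);
}

int main(void)
{
    /* Sample points from the range in the example above: x in [4, 8). */
    const double ticks[] = { 4.0, 4.9, 5.0, 6.5, 7.9 };
    for (unsigned i = 0; i < sizeof ticks / sizeof ticks[0]; i++)
        printf("x = %.1f -> old: %lu cycles, new: %lu cycles\n",
               ticks[i], old_cycles(ticks[i]), new_cycles(ticks[i]));
    return 0;
}

All five values yield 4 cycles under the old formula, but only
x in [4, 5) yields 4 cycles under the new one, which is exactly the
[0.004, 0.005) ms range quoted above.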
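
And here is a minimal sketch of how the opt-in fallback might look in
a _delay_ms()-style helper. Only the __OLD_DELAY_CYCLE_ name comes
from the proposal above; the function name sketch_delay_ms and the
surrounding structure are assumptions, not the actual <util/delay.h>
source:

/* Sketch only, not the actual avr-libc source. _delay_loop_2() is the
 * real 4-cycles-per-iteration busy loop from <util/delay_basic.h>;
 * __builtin_avr_delay_cycles() is the real avr-gcc builtin and needs
 * a compile-time-constant argument (so __ms must be a constant and
 * optimization must be on, as with the real <util/delay.h>). */
#include <stdint.h>
#include <util/delay_basic.h>

#ifndef F_CPU
#define F_CPU 1000000UL  /* assumed 1 MHz clock for this sketch */
#endif

static __inline__ void sketch_delay_ms(double __ms)
{
    double __tmp = ((F_CPU) / 1e3) * __ms;  /* requested delay, in cycles */
#ifdef __OLD_DELAY_CYCLE_
    /* Old behavior: trunc(x/4)*4 cycles via the 4-cycle loop. */
    _delay_loop_2((uint16_t)(__tmp / 4.0));
#else
    /* New behavior: trunc(x) cycles via the builtin. */
    __builtin_avr_delay_cycles((uint32_t)__tmp);
#endif
}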
>> Backward compatibility: ... >> For the short end, there are a few options: >> 2) Force it to be 1 cycle if it falls to 0 Forcing it to 1 might introduce special handling which I think is unnecessary. >> 3) have another define that sets the MIN cycles so user >> can create some backward combility by defining it to 3 (or 1) >> if necessary. So if this define is not set, it defaults >> to allowing 0 (no delay). Setting MIN cycles does not create complete backward compatibility. For instance, see the following representation: If x represents _tmp with delay cycles builtin, then we get a delay of trunc(x) clock cycles as per new implementation, (trunc(x/4))*4 clock cycles as per old implementation which results in differences in ranges. For e.g., one instance of granularity difference is: In older implementation, for the range in milliseconds [0.004, 0.008), 4 clock cycles of delay is produced In newer implementation, for the range in milliseconds [0.004, 0.005), 4 clock cycles of delay is produced. Introducing a MIN cycles will not address this issue. >> 4) have a define to return to old behavior. My concern is that old behavior is patchy. Any code working by accident cannot be maintained in the long run. It is best advised to make changes earlier than later. Old behavior can be brought back with a macro #ifdef __OLD_DELAY_CYCLE_ which will be deprecated in future version. Thoughts ? _______________________________________________________ Reply to this item at: <http://savannah.nongnu.org/bugs/?30363> _______________________________________________ Message sent via/by Savannah http://savannah.nongnu.org/ _______________________________________________ AVR-libc-dev mailing list AVR-libc-dev@nongnu.org http://lists.nongnu.org/mailman/listinfo/avr-libc-dev