> This was indeed interesting. I set out to prove that floating point would be the better choice from a performance perspective, but I can't do that, at least not judging from the data from my systems!
in most typical systems, the conversion between integer and FP formats happens once per interrupt. this cost, whatever it is, is *dwarfed* by the cycles used between interrupts for actual processing of that data.

this is one of the motivations behind JACK's architecture, where the conversion is guaranteed to happen only once, and you, the programmer of a JACK client, don't bother with int<->float conversion anywhere (see the sketch below). if you're writing software where you do this conversion more than once per interrupt, i think it probably needs a redesign. if the cost of the conversion is an appreciable fraction of the overall cycles/interrupt, then and only then does it seem to make sense to me to consider *not* using float. --p
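ps: a minimal sketch of what this looks like on the client side. the port names, client name and the 0.5 gain are made up for illustration, but the point is that the process callback only ever sees jack_default_audio_sample_t (float) buffers; any int<->float conversion for integer hardware happens once, in the server/driver, before and after this callback runs:

#include <jack/jack.h>
#include <unistd.h>

static jack_port_t *in_port, *out_port;

/* called once per interrupt/period by the JACK server */
static int process(jack_nframes_t nframes, void *arg)
{
    /* buffers are already float; no conversion in client code */
    jack_default_audio_sample_t *in  = jack_port_get_buffer(in_port, nframes);
    jack_default_audio_sample_t *out = jack_port_get_buffer(out_port, nframes);

    for (jack_nframes_t i = 0; i < nframes; i++)
        out[i] = in[i] * 0.5f;      /* all per-interrupt work stays in float */

    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_open("float-only", JackNullOption, NULL);
    if (client == NULL)
        return 1;

    in_port  = jack_port_register(client, "in",  JACK_DEFAULT_AUDIO_TYPE,
                                  JackPortIsInput, 0);
    out_port = jack_port_register(client, "out", JACK_DEFAULT_AUDIO_TYPE,
                                  JackPortIsOutput, 0);

    jack_set_process_callback(client, process, NULL);
    jack_activate(client);

    for (;;)
        sleep(1);                   /* real work happens in process() */
}

(build with something like "gcc client.c -o client -ljack".)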