Timothy Normand Miller wrote:
> How important are denormalized floats in GPUs? For graphics, I would
> expect very little. For GPGPU, it might matter.
> The reason I ask pertains to floating-point multiplies. If you want
> to support denormalized operands, the multiplier may have to process
> more significand bits.
I think this came up once before and I'm going to give the same
answer as before because nothing has changed.
For graphics, denormalized floats are not at all important. Any
really small number rounds to zero. Any really large number
is infinity. No exceptions, no special handling. Forget about
being IEEE compliant: graphics programmers don't want it. Heck,
we don't really care about double precision.
Just build something that does graphics. GPGPU can be added later.
--
Hugh Fisher
_______________________________________________
Open-graphics mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-graphics
List service provided by Duskglow Consulting, LLC (www.duskglow.com)