I remember that, and I was in favor of the same argument then. I was mostly asking about GPGPU. Anyhow, I'll simply accept your restated argument for the general case and move on. :)
On Thu, Aug 25, 2011 at 10:19 PM, Hugh Fisher <[email protected]> wrote:
> Timothy Normand Miller wrote:
>>
>> How important are denormalized floats in GPUs? For graphics, I would
>> expect very little. For GPGPU, it might matter.
>>
>> The reason I ask pertains to floating point multiplies. If you want
>> to multiply by denormalized numbers, then the number of bits you have
>> to process may be more.
>
> I think this came up once before and I'm going to give the same
> answer as before because nothing has changed.
>
> For graphics, denormalized floats are not at all important. Any
> really small number rounds to zero. Any really large number
> is infinity. No exceptions, no special handling. Forget about
> being IEEE compliant: graphics programmers don't want it. Heck,
> we don't really care about double precision.
>
> Just build something that does graphics. GPGPU can be added later.
>
> --
> Hugh Fisher

--
Timothy Normand Miller
http://www.cse.ohio-state.edu/~millerti
Open Graphics Project
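
For anyone following along, here is a minimal sketch (not from the thread, and not how any particular GPU does it) of what "flush denormals to zero" means in practice. The helper names flush_to_zero and ftz_mul are made up for illustration; the point is that a multiplier without denormal support can simply treat any subnormal input or result as zero:

    /* Sketch of flush-to-zero multiply semantics. Assumes C99 <math.h>. */
    #include <stdio.h>
    #include <math.h>
    #include <float.h>

    /* Hypothetical helper: flush subnormal values to zero, keeping the sign. */
    static float flush_to_zero(float x)
    {
        if (fpclassify(x) == FP_SUBNORMAL)
            return copysignf(0.0f, x);
        return x;
    }

    /* Multiply as a denormal-free unit might: flush inputs and the result. */
    static float ftz_mul(float a, float b)
    {
        float r = flush_to_zero(a) * flush_to_zero(b);
        return flush_to_zero(r);
    }

    int main(void)
    {
        float tiny = FLT_MIN / 4.0f;  /* subnormal on IEEE-754 hosts */
        printf("IEEE result: %g\n", tiny * 0.5f);         /* keeps a denormal */
        printf("FTZ  result: %g\n", ftz_mul(tiny, 0.5f)); /* rounds to zero */
        return 0;
    }

That "FTZ result: 0" line is exactly the behavior Hugh is describing for graphics: really small numbers just become zero, no special handling in the datapath.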
