Yes, I agree that we need to take some measurements to see the benefits of floats.
But it's not only about the performance and memory gains from floats: I would also like to add ARM SIMD optimizations to speed up the transformation functions, and ARM NEON only works with single-precision floats, which is why I want to change the doubles. I would prefer a typedef solution to make this configurable, as Ryosuke suggested, so that each port can choose whether it needs float, double, or long double.
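To make it concrete, here is a rough sketch of what I have in mind (the macro and type names below are just placeholders, not existing WebKit identifiers): a port-level typedef for the element type, plus the kind of single-precision NEON path it would enable.

    // Hypothetical names; nothing here exists in WebKit yet.
    // A port that wants the NEON path would define TRANSFORM_USE_SINGLE_PRECISION.
    #if defined(TRANSFORM_USE_SINGLE_PRECISION)
    typedef float TransformValue;
    #else
    typedef double TransformValue;
    #endif

    #if defined(TRANSFORM_USE_SINGLE_PRECISION) && defined(__ARM_NEON__)
    #include <arm_neon.h>

    // NEON vector arithmetic (e.g. vmulq_f32) is single precision,
    // so this path only works when TransformValue is float.
    static void scaleRow(TransformValue* row, TransformValue factor)
    {
        float32x4_t v = vld1q_f32(row);        // load 4 floats
        v = vmulq_f32(v, vdupq_n_f32(factor)); // multiply each lane by factor
        vst1q_f32(row, v);                     // store the result back
    }
    #endif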

- Gabor
On Oct 10, 2012, at 9:00 AM, Gabor Rapcsanyi <rga...@inf.u-szeged.hu> wrote:

> That was a long time ago and there were no objections.
> Unless there's something in the spec requiring double precision, it makes
> sense to move away from double precision throughout WebKit.
I’m a little concerned about this.

The programming interface to the graphics system on OS X uses doubles, so it's 
likely we'll be introducing float-to-double conversions, at least on that 
platform, if we change things inside WebKit that are currently double to float 
en masse.
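To illustrate (this is just a sketch, not actual WebKit code): CoreGraphics takes CGFloat, which is double on 64-bit OS X, so a float value stored inside WebKit would get widened at every call like this one.

    #include <CoreGraphics/CoreGraphics.h>

    // If WebKit stored these offsets as float, the call into CoreGraphics
    // would implicitly convert float -> double (CGFloat) on 64-bit OS X.
    static void translateContext(CGContextRef context, float offsetX, float offsetY)
    {
        CGContextTranslateCTM(context, offsetX, offsetY);
    }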

Similarly, the native data type for numbers in JavaScript is double. So any 
time we pass a number from JavaScript to WebKit, it's likely that additional 
double-to-float conversions will be required as we change internals to use 
floats instead of doubles.
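For example (hypothetical names, not an actual binding): whatever code receives the value from JavaScript would have to narrow it on the way in.

    // JavaScript numbers arrive as double; if the internal member becomes
    // float, every setter like this one performs a double -> float conversion.
    struct HypotheticalTransformState {
        float translateX; // previously double
    };

    static void setTranslateXFromScript(HypotheticalTransformState& state, double jsNumber)
    {
        state.translateX = static_cast<float>(jsNumber); // narrowing conversion
    }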

On the other hand, there may also be performance boosts or memory gains from 
using float instead of double.

If we do make this change, I think we need to do performance testing as we go, 
perhaps on more than one platform, to see whether these changes actually lead 
to benefits.

I also think that “we should always use float unless a spec mandates double” is 
probably not the right rule of thumb. There are many factors other than “a 
spec” at play here.

-- Darin
