On 11/15/10 7:46 PM, Aryeh Gregor wrote:
> On Mon, Nov 15, 2010 at 1:19 AM, Boris Zbarsky <[email protected]> wrote:
>> 2) Casting an array of integers to an array of bytes will give you
>> different results on different hardware. For example, the integer
>> 0xffff0080 when viewed as imagedata bytes is either rgba(255, 255, 0, 0.5)
>> (half-opaque yellow) or rgba(128, 0, 255, 1) (fully opaque purplish blue),
>> depending on your endianness.
>
> That's evil.
It's compatible with how GL works, though, as I understand.
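For concreteness, here's roughly what that reinterpretation looks like with typed arrays (just a sketch; the variable names are illustrative):

  var words = new Uint32Array(1);
  words[0] = 0xffff0080;
  var bytes = new Uint8Array(words.buffer);  // view the same memory as bytes
  // Little-endian hardware: bytes are [0x80, 0x00, 0xff, 0xff]
  //   -> as image data: rgba(128, 0, 255, 1), fully opaque purplish blue
  // Big-endian hardware:    bytes are [0xff, 0xff, 0x00, 0x80]
  //   -> as image data: rgba(255, 255, 0, 0.5), half-opaque yellow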
> Isn't JavaScript meant to conceal machine details like endianness?
Well, usually yes. ;)
> Couldn't we mandate that the conversion here must be little-endian?
What does that mean in practice? That every non-byte read from a typed array would need to swap byte order on big-endian hardware? I guess that's doable; mail the authors of https://cvs.khronos.org/svn/repos/registry/trunk/public/webgl/doc/spec/TypedArray-spec.html ?
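To sketch what that swapping amounts to: a mandated little-endian read on big-endian hardware would have to assemble each value from individual bytes rather than reinterpret memory directly, something along these lines (the helper name is purely illustrative):

  function readUint32LE(bytes, offset) {
    // Assemble a 32-bit value from 4 bytes, least-significant byte first.
    return (bytes[offset] |
            (bytes[offset + 1] << 8) |
            (bytes[offset + 2] << 16) |
            (bytes[offset + 3] << 24)) >>> 0;  // >>> 0 keeps the result unsigned
  }

That per-element work is the cost you'd be mandating on big-endian machines.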
> Granted that it'd be slower on ARM and such
Only some ARM; other ARM is little-endian. It depends on the motherboard, and on some ARM devices it may be a boot-time option, as I understand.
> Or has this already become such a big and general problem
It's only a problem for this particular API, which is all about raw memory access. I mean... this is an API that lets you take an array of doubles and then examine the bytes.
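For example (again, just a sketch of the kind of access involved):

  var doubles = new Float64Array([1.0]);
  var raw = new Uint8Array(doubles.buffer);  // the raw IEEE-754 bytes of 1.0
  // Little-endian hardware: [0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xf0, 0x3f]
  // Big-endian hardware:    [0x3f, 0xf0, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00]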
-Boris
