Consider the following snippet, along with the spec for Array.prototype.unshift
(https://people.mozilla.org/~jorendorff/es6-draft.html#sec-array.prototype.unshift):

var arr = [];
arr[0xfffffffe] = 10;
try {
  arr.unshift(1);
} catch(e) {
  // e is a RangeError, since the unshift operation will try to set the
  // length to 0x100000000 (2^32), one past the maximum array length
}

What should the state of arr be after this botched operation? The way the
spec is written, all the element manipulation happens before the RangeError
occurs, so arr[0xffffffff] should be 10 and arr[0xfffffffe] should have
been deleted.
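For concreteness, here is the length arithmetic involved (2^32 - 1 is the
maximum array length per the spec; the variable names are just illustrative):

```javascript
var MAX_LENGTH = 0xffffffff;          // 2^32 - 1, the maximum array length
var len = 0xfffffffe + 1;             // arr.length after arr[0xfffffffe] = 10
console.log(len === MAX_LENGTH);      // true: the array is already at the cap
console.log(len + 1 > MAX_LENGTH);    // true: unshift(1) needs length 2^32
```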

But it's not clear this behavior is worthwhile. I tested this in V8,
SpiderMonkey, and JSC:

- V8 properly throws a RangeError, but fails to move the element up one
  index.
- SpiderMonkey hangs (presumably because it has no special logic to deal
  with very large, sparse arrays).
- JSC throws an "Out of memory" Error and also fails to move the element up
  one index.

A similar thing happens if one attempts to splice items into the array.

Is there any reason not to specify some precondition checking in these
algorithms, so that they throw the RangeError without touching the elements
of the array when it's known a priori that the resulting length would be
> 2^32-1?
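A minimal sketch of what that precondition might look like, written as a
helper (the name checkNewLength is mine, not the spec's; the deleteCount
parameter covers the splice case):

```javascript
var MAX_LENGTH = 0xffffffff; // 2^32 - 1, the maximum array length

// Hypothetical precondition check, run before any elements are touched.
function checkNewLength(len, insertCount, deleteCount) {
  var newLen = len - deleteCount + insertCount;
  if (newLen > MAX_LENGTH) {
    // Throw before unshift/splice has moved or deleted anything,
    // leaving the array in its original state.
    throw new RangeError("Invalid array length");
  }
  return newLen;
}
```

For the snippet above, checkNewLength(0xffffffff, 1, 0) would throw, and arr
would keep its single element at index 0xfffffffe.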

In V8 and JSC this nearly matches existing behavior.  In Firefox it would
cause a change in behavior, but any code depending on the existing behavior
would already have been hanging for a long time.

- Adam
_______________________________________________
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss
