Allen Wirfs-Brock wrote:
On Jun 25, 2012, at 8:17 AM, David Herman wrote:
Implicit coercions suck, and IMO it would be better if destructuring didn't add
new ones to JS. In the current draft spec, I believe if you match an object
destructuring pattern against a non-object or an object that doesn't have one
or more of the properties, it quietly proceeds -- masking what is quite likely
an error:
What it currently does is:
If value is neither null nor undefined, then
Let obj be ToObject(value)
Else, let obj be undefined
Does the Else clause lead to obj=undefined being dereferenced resulting
in a TypeError?
obj is then the value that is destructured. This happens recursively at each
level of the destructuring.
There are really three separable issues that Dave talks about:
1) The ToObject conversion shown above
2) The handling of null/undefined shown above.
3) The handling of missing properties
1) ToObject conversion is used pervasively throughout the ES semantics to
coerce primitive values to object values. It is an important part of the
illusion that for most purposes Number, String, and Boolean primitive values
can be treated as-if they were objects.
In particular, I don't see any reason why it isn't perfectly reasonable to
say things like:
let {concat, indexOf, lastIndexOf} = ""; //get some string methods
let {length, 10: char10} = "this is a long string and I don't want to count chars";
let {toLocaleString: oldToLocale} = 1.0;
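Under ToObject semantics, destructuring a primitive behaves like property access on its wrapper object, so examples along these lines are directly runnable in any engine that applies ToObject in destructuring (a sketch, not spec text):

```javascript
// Destructuring a primitive with ToObject semantics: the primitive is
// wrapped, so property lookups hit the wrapper (String.prototype etc.).
const { concat, indexOf } = "";        // pulls string methods off the wrapper
const { length, 0: first } = "hello";  // own properties of the String wrapper
console.log(typeof concat);            // "function"
console.log(length, first);            // 5 "h"
```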
There is a tension between fail-fast and the internal consistency of the
language. In the case of primitive value conversion I don't think we should
deviate from the existing consistent behavior of the language semantics. It
just makes it much more difficult for a programmer to develop a conceptual model
of how the language works, short of memorizing all the special cases. What is
the explanation for why:
let oldToLocale = 1.0.toLocaleString;
would work, yet
let {toLocaleString: oldToLocale} = 1.0;
doesn't. The latter feels like somebody made an arbitrary change to the
general rules that the language follows.
Yes, and that's why Lars Hansen specified destructuring (and first
implemented it for array patterns in Opera) the fail-soft way that JS1.7
implements. This was in ES4 days.
The wiki'ed harmony proposal:
http://wiki.ecmascript.org/doku.php?id=harmony:destructuring
specifies that var {foo} = null; and the like throw on T.foo for T = null.
On your point: by making destructuring not use ToObject, but rather throw
on any primitive RHS, we would be splitting the language between pre-ES6
"convert implicitly" and ES6 "require explicit ToObject" worlds.
I agree this is a split that we should consider carefully, not just
dogmatically assert "EIBTI" and forge ahead. Still, I'm not sure it's
worse to keep the implicit ToObject given the new syntax. I used to feel
more strongly about it, but less so as time has worn on.
Vesper Lynd (to James Bond): "You've got a choice, you know. Just
because you've done something doesn't mean you have to keep doing it."
("Casino Royale", 2006)
(Not that I am comparing JS's implicit conversions to killing people!)
2) This is certainly debatable, as it is a deviation from the ToObject rules that
are applied in most other contexts of the language. I believe it is currently
specified that way because that is what Firefox did.
No:
js> var {foo, bar} = null
typein:1: TypeError: null has no properties
But I'm unclear on the purpose of the ES6 draft's If/Else you show
(cited first above). Why shouldn't we just use ToObject? That is what
SpiderMonkey has done lo these many years (2006, IIRC).
I don't have any particular issue with eliminating that deviation and applying
the standard ToObject rules, which means that:
let {a,b} = null;
would throw.
The deviation is solely in ES6 (if anywhere; doesn't obj in the spec get
set to undefined by the Else clause, and then undefined.P or
undefined[I] evaluated, throwing?). It certainly should be fixed.
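Either way, under the standard rules the null/undefined case is observable behavior; a runnable sketch of what both parties expect here:

```javascript
// Destructuring null or undefined throws a TypeError, because
// ToObject(null) / ToObject(undefined) (or dereferencing them) throws.
function tryDestructure(value) {
  try {
    const { a, b } = value;
    return "ok";
  } catch (e) {
    return e instanceof TypeError ? "TypeError" : "other";
  }
}
console.log(tryDestructure(null));     // "TypeError"
console.log(tryDestructure({ a: 1 })); // "ok"
```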
3) Again we have an internal consistency issue:
let myFoo = {a:0,b:1}.foo;
vs
let {foo: myFoo} = {a:0,b:1};
why should one work and the other fail?
Lars' design intention, exactly.
Here I just wrote in reply that I could go along with throwing, provided
we add ?foo. But that was because I know Mozilla-targeted JS that counts
on pulling out undefined. It's not a bug to count on this; real code
does it, and I suspect there's Opera-specific code doing the same with
array patterns.
So my concern was to make the fix for such code, if we create a
throw-on-missing-property destructuring standard, as short and simple to
patch as possible.
But perhaps I was wrong to concede this point too. It's another example
of splitting destructuring from structuring (object literals composed
with . as you show) and with "pre-ES6" JS. We should consider the split,
in this specific form and in general, more carefully -- I mean, with
more discussion and ideally some evidence that throwing rather than
imputing undefined is better at fighting bugs and not just frustrating
"optional" destructuring patterns.
The general rule of JS is that accessing a missing property returns
undefined. Whether or not you think it is a good rule, it's the way the core of
the language works and it can't change. It's also an easy rule for anybody to
learn. Making exceptions to that rule just makes the language less internally
consistent, harder to learn, and more complex. I don't think we should do it.
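The rule Allen describes, and the fail-soft "optional" idiom Brendan mentions above, can be illustrated with a small runnable sketch (the draw function is a hypothetical example, not from the thread):

```javascript
// The core rule: reading a missing property yields undefined, no throw.
const point = { x: 1, y: 2 };
console.log(point.z); // undefined

// Fail-soft destructuring follows the same rule; real code relies on it
// for "optional" patterns, imputing a default when the property is absent.
function draw({ x, y, color }) {
  color = color || "black"; // default for the missing property
  return x + "," + y + ":" + color;
}
console.log(draw(point)); // "1,2:black"
```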
Ok, we should discuss this at the July meeting. It would help to get some
evidence-based arguments adduced here first.
(b) Allow a shorthand "?" prefix for properties in an object destructuring
pattern. This is simply shorthand for defaulting to undefined. IOW, the following are all
equivalent:
var { ?foo } = getFoo();
var { foo = undefined } = getFoo();
var { foo: foo = undefined } = getFoo();
var { ?foo: foo } = getFoo();
But
var ?foo = getFoo(); would be syntactically illegal!
Yes, the ? prefix would be only inside array and object patterns. That's
yet more "inconsistency" by one way of measuring consistency.
If we really wanted to address this problem, I think a better approach that
would have internal consistency would be to allow a ? before a binding
identifier, meaning throw if undefiled.
LOL, "undefined" (which some might say is defiled ;-).
let foo = getFoo(); // assigns undefined if that is what is returned from getFoo()
let ?foo = getFoo(); //Throws if getFoo() returns undefined.
let {foo} = getFoo(); // initializes foo to undefined if obj returned from getFoo() does not have a foo property or its value is undefined
let {?foo} = getFoo(); // Throws if obj returned from getFoo() does not have a foo property or its value is undefined
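Since ?foo is proposed syntax rather than anything an engine runs, its throw-if-undefined semantics can be approximated today with a hypothetical helper (requireDefined and the getFoo stub are illustrations, not part of any proposal):

```javascript
// Hypothetical helper approximating the proposed `let ?foo = getFoo()`:
// throw if the value is undefined, otherwise pass it through.
function requireDefined(value, name) {
  if (value === undefined) {
    throw new TypeError(name + " is undefined");
  }
  return value;
}

function getFoo() { return 42; } // stand-in for the thread's getFoo
const foo = requireDefined(getFoo(), "foo");
console.log(foo); // 42
// requireDefined(undefined, "bar") would throw a TypeError
```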
The sense of ? as "maybe" is backwards here, though. The typical trope
when insisting on something "succeeding" is ! not ?:
let !foo = getFoo(); // throws if getFoo() returns undefined
But you've talked me out of the ? shorthand, so I am going to back off
to not supporting Dave's proposal.
The fundamental question is: should we add new forms that are similar to
(or duals of) existing forms but that throw on missing properties, where
existing forms impute undefined?
/be
_______________________________________________
es-discuss mailing list
[email protected]
https://mail.mozilla.org/listinfo/es-discuss