On Jun 25, 2012, at 8:17 AM, David Herman wrote:

> Implicit coercions suck, and IMO it would be better if destructuring didn't 
> add new ones to JS. In the current draft spec, I believe if you match an 
> object destructuring pattern against a non-object or an object that doesn't 
> have one or more of the properties, it quietly proceeds -- masking what is 
> quite likely an error:

What it currently does is:

       If value is neither null nor undefined, then
             let obj be ToObject(value)
       Else, let obj be undefined

obj is then the value that is destructured.  This happens recursively at each 
level of the destructuring.
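
A rough model of those two steps in plain JavaScript (toDestructuringObject is 
an illustrative name, not an actual spec abstract operation):

```javascript
// Rough model of the draft-spec step quoted above. In this draft,
// null/undefined quietly become undefined instead of throwing.
function toDestructuringObject(value) {
  if (value !== null && value !== undefined) {
    return Object(value);  // ToObject: wraps primitives in their object types
  }
  return undefined;
}
```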

There are really three separable issues that Dave talks about:
    1)  The ToObject conversion shown above
    2)  The handling of null/undefined shown above.
    3)  The handling of missing properties
    

1) ToObject conversion is used pervasively throughout the ES semantics to 
coerce primitive values to object values. It is an important part of the 
illusion that, for most purposes, Number, String, and Boolean primitive values 
can be treated as if they were objects.

In particular, I don't see any reason why it isn't perfectly reasonable to say 
things like:

   let {concat, indexOf, lastIndexOf} = "";  // get some string methods
   let {length, 10: char10} = "this is a long string and I don't want to count chars";

   let {toLocaleString: oldToLocale} = 1.0;

There is a tension between fail-fast and the internal consistency of the 
language. In the case of primitive value conversion, I don't think we should 
deviate from the existing consistent behavior of the language semantics. It 
just makes it much more difficult for a programmer to develop a conceptual model 
of how the language works, short of memorizing all the special cases. What is 
the explanation for why:
   let oldToLocale = 1.0.toLocaleString; 
would work, yet 
   let {toLocaleString: oldToLocale} = 1.0;
wouldn't? The latter feels like somebody made an arbitrary change to the 
general rules that the language follows.
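
For what it's worth, with the standard ToObject coercion applied during 
destructuring, the two forms agree; a minimal check:

```javascript
// Both forms should resolve to the same method on Number.prototype.
const viaAccess = 1.0.toLocaleString;        // plain property access
const {toLocaleString: viaPattern} = 1.0;    // destructuring a number primitive
console.log(viaAccess === viaPattern);       // true: same function object
```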

2)  This is certainly debatable, as it is a deviation from the ToObject rules 
that are applied in most other contexts of the language. I believe it is 
currently specified that way because that is what Firefox did. I don't have any 
particular issue with eliminating that deviation and applying the standard 
ToObject rules, which means that:
    let {a,b} = null;
would throw.
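
A sketch of what that behavior looks like (this is how engines applying the 
standard ToObject rules behave):

```javascript
// With standard ToObject rules applied, destructuring null throws a TypeError.
let caught = null;
try {
  const {a, b} = null;
} catch (e) {
  caught = e;
}
console.log(caught instanceof TypeError);  // true
```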

3) Again, we have an internal consistency issue:

   let myFoo = {a:0,b:1}.foo;
vs
   let {foo: myFoo} = {a:0,b:1};

why should one work and the other fail?  The general rule of JS is that 
accessing a missing property returns undefined. Whether or not you think it is 
a good rule, it's the way the core of the language works and it can't change.  
It's also an easy rule for anybody to learn.  Making exceptions to that rule 
just makes the language less internally consistent, harder to learn, and more 
complex. I don't think we should do it.
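
The rule is the same in both spellings, which is easy to check:

```javascript
// A missing property reads as undefined under plain access and under a pattern.
const source = {a: 0, b: 1};
const viaAccess = source.foo;       // undefined
const {foo: viaPattern} = source;   // also undefined
console.log(viaAccess === undefined && viaPattern === undefined);  // true
```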

> 
>    var { foo, bar } = getFooAndBar();
> 
> If getFooAndBar() doesn't produce an object with foo and bar properties, 
> there's probably something going wrong! And most of the time when you do want 
> to be forgiving (say, in a function signature, or when reading a 
> configuration file), you have a specific default value you want to provide, 
> which can be specified with the default syntax:
> 
>    var { foo = defaultFooValue, bar = defaultBarValue } = getFooAndBar();
> 
> I'd like to propose the following changes:
> 
> (a) Throw when matching null, undefined, booleans, or strings against an 
> object pattern. (I don't propose throwing when matching a string against an 
> array pattern, however, since strings behave structurally like read-only 
> arrays.)

String values should behave consistently in all situations. What's wrong with:
   let {3:char3, length} = str;
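
Both object and array patterns already work uniformly over a string value:

```javascript
// Strings behave structurally like read-only arrays under either pattern kind.
const str = "hello";
const {3: char3, length} = str;  // object pattern over a string primitive
const [c0, c1] = str;            // array pattern (strings are iterable)
console.log(char3, length, c0, c1);  // "l" 5 "h" "e"
```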

> 
> (b) Allow a shorthand "?" prefix for properties in an object destructuring 
> pattern. This is simply shorthand for defaulting to undefined. IOW, the 
> following are all equivalent:
> 
>    var { ?foo } = getFoo();
>    var { foo = undefined } = getFoo();
>    var { foo: foo = undefined } = getFoo();
>    var { ?foo: foo } = getFoo();

But
    var ?foo = getFoo();
would be syntactically illegal!

If we really wanted to address this problem, I think a better approach that 
would have internal consistency would be to allow a ? before a binding 
identifier, meaning throw if undefined.

  let foo = getFoo();    // assigns undefined if that is what is returned from getFoo()
  let ?foo = getFoo();   // throws if getFoo() returns undefined
  let {foo} = getFoo();  // initializes foo to undefined if the object returned from getFoo() does not have a foo property or its value is undefined
  let {?foo} = getFoo(); // throws if the object returned from getFoo() does not have a foo property or its value is undefined
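
Since ? is not real syntax, the throw-if-undefined semantics can be 
approximated today with a small helper (requireDefined is a made-up name for 
illustration):

```javascript
// Made-up helper approximating the proposed ? prefix: throw if undefined.
function requireDefined(value, name) {
  if (value === undefined) {
    throw new TypeError(name + " is undefined");
  }
  return value;
}

// let ?foo = getFoo();    roughly  let foo = requireDefined(getFoo(), "foo");
// let {?foo} = getFoo();  roughly  let foo = requireDefined(getFoo().foo, "foo");
```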

I think this would be a better approach.  I'm not particularly excited about 
including it in ES6 without significantly more thought.  It could always be 
added later.

Allen




> 
> Note, however, that (b) is not strictly necessary to get the fail-soft 
> behavior. You can always write the explicit default. But it makes it 
> considerably more convenient to get the fail-soft behavior when you want it 
> -- only one character more expensive than fail-fast.
> 
> Dave
> 
> _______________________________________________
> es-discuss mailing list
> [email protected]
> https://mail.mozilla.org/listinfo/es-discuss
> 
