Re: An array destructing specification choice

2011-11-07 Thread Lasse Reichstein
On Sat, Nov 5, 2011 at 10:41 PM, Brendan Eich bren...@mozilla.com wrote:
 We have:

 1. Should an array pattern always query 'length'?

 2. If the answer to (1) is no, then should ... in an array pattern query 
 'length'?

 On reflection and at this point in the thread, with your reply in mind, my 
 prefs in order: [no, yes], [no, no]. In no case do I favor [yes]. I'm 
 refutably matching [no, _] :-P.

My initial intuition was [no, ?], since that was without considering
rest-matching. With rest-matching, and me being a sucker for
consistency, I'm now leaning towards [yes, yes].

The possibilities are
 [no, no]
 [no, yes]
 [yes, yes]
(since [yes, no] makes absolutely no sense).

[no, no] is definitely possible. It needs to be defined which
properties are included by the ... in, say, [x,y,...r], but since the
result in r must be an array, it would seem that any array-index
property of the RHS that is still an array index after subtracting 2
is a candidate.

[no, yes] is also possible, but seems inconsistent if
  [x,y,z] = {0:0,1:1,2:2, length:2}
makes z be 2, but
  [x,y,...z] = {0:0,1:1,2:2, length:2}
doesn't make z be [2].

[yes, yes] means always treating the RHS as an array-like object,
respecting the length property.


Both [no, no] and [yes, yes] are consistent. Either the result always
depends on the array-like-ness of the RHS or it never does, i.e., either
it reads the length and converts it to a UInt32 (or Integer), or it
doesn't.
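
Concretely, for an almost-array-like RHS, here is my reading of the two
consistent options (a sketch of the intended outcomes, not spec text):

  var obj = {0: 0, 1: 1, 2: 2, length: 2};

  // Under [no, no] (length is never consulted):
  //   [x, y, z]    = obj   ->  x = 0, y = 1, z = 2
  //   [x, y, ...r] = obj   ->  r = [2]   (index properties from 2 upward)

  // Under [yes, yes] (length, here 2, is always consulted):
  //   [x, y, z]    = obj   ->  x = 0, y = 1, z = undefined
  //   [x, y, ...r] = obj   ->  r = []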

If the object being destructured is in fact a plain Array, with no
inherited elements above the length, then there is no difference.
This is most likely the (very) common use case. This is what the ES
programmers' intuition will be based on.

So the question is which behavior we would want for something that
breaks the array-like contract: treat it as a plain object (ignore
length), or treat it as an array-like object (ignore properties
above length). We must do one of them, and not both, because they are
mutually exclusive.

Both can cause errors if the programmer gets it wrong and ends up with
an almost-array-like object. Which behavior is correct depends on
what was *intended*: to be array-like or not.

The original question was what an ES programmer would expect.
I think they will probably expect array-like destructuring patterns to treat the
RHS as an array(-like object).
I.e., [yes,yes].

That also has the advantage of actually providing otherwise
unavailable functionality.
You can write {0:x, 1:y, 2:z} instead of [x,y,z] if you want
object behavior, but if they are the same, you can't get array-like
behavior.

Arrays are just an abstraction in ECMAScript, as all the intentionally
generic Array.prototype methods prove. If it quacks like an Array and
swims like an Array, we allow ourselves to treat it like an Array.

I.e., I think the most easily comprehensible behavior is to make array
destructuring treat the RHS as an Array.
It matches the common use-case (actual arrays), it is consistent (does
the same whether you use ... or not), and is easily explainable.

/L


Re: An array destructing specification choice

2011-11-07 Thread Andreas Rossberg
On 5 November 2011 17:44, Brendan Eich bren...@mozilla.com wrote:
 Destructuring is irrefutable in that it desugars to assignments from 
 properties of the RHS. It is not typed; it is not refutable

I don't think that's true, at least not in the usual sense of
irrefutable pattern. Because you can write

  let {x} = 666

which will be refuted, by raising a TypeError.

Of course, the real question is, what does this do:

  let {} = 666

/Andreas


Re: An array destructing specification choice

2011-11-07 Thread Andreas Rossberg
On 5 November 2011 19:55, Brendan Eich bren...@mozilla.com wrote:
 On Nov 5, 2011, at 9:38 AM, Allen Wirfs-Brock wrote:

 In a similar vein, what is the value of r in:

 let [z,y,...r] = {0:0, 1:1, 2:2, length: 3, 3:3,4:4};

 should it be [2] or [2,3,4]  (and if the latter how is that determined)?

 The inspiration for ... in the past came from (among other sources) Successor 
 ML:

 http://successor-ml.org/index.php?title=Functional_record_extension_and_row_capture

Since I actually wrote half of that, I feel obliged to say that it
does not answer the questions raised here. ML is a typed language, and
contrary to popular belief, many language design problems are much
easier to solve in a typed setting.

However, there is some inspiration in the way SML treats tuples as
special cases of records, very much like arrays are a special case of
objects in JS. In particular, all of SML's pattern matching rules for
tuples follow just from the way they desugar into records with numeric
labels.

For Harmony, this kind of equivalence would imply that

  let [x, y, z] = e

is simply taken to mean

  let {0: x, 1: y, 2: z} = e

and the rest follows from there. The only problem is rest patterns.
One possible semantics could be treating

  let [x, y, z, ...r] = e

as equivalent to

  let {0: x, 1: y, 2: z, ..._r} = e
  let r = [].slice.call(_r, 3)

where I assume the canonical matching semantics for object rest
patterns that would make _r an ordinary object (not an array)
accumulating all properties of e not explicitly matched (even if e
itself is an array, in which case _r includes a copy of e's length
property). Of course, engines would optimize properly.
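
A rough executable sketch of that desugaring, under the assumptions above
(copyRest is a hypothetical stand-in for the object rest pattern, not a
proposed API):

  // Copy every own property of e except the explicitly matched keys.
  function copyRest(e, matched) {
    var rest = {};
    Object.getOwnPropertyNames(e).forEach(function (key) {
      if (matched.indexOf(key) < 0) rest[key] = e[key];
    });
    return rest;
  }

  var e = {0: 0, 1: 1, 2: 2, 3: 3, 4: 4, length: 5};

  // let [x, y, z, ...r] = e;   becomes, roughly:
  var x = e[0], y = e[1], z = e[2];
  var _r = copyRest(e, ['0', '1', '2']); // ordinary object; keeps e's length, if any
  var r = [].slice.call(_r, 3);          // array-like slice from index 3 up to _r.length
                                         // here: r is [3, 4]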

(But yes, row capture for objects introduces a form of object cloning,
as Allen points out.)

/Andreas


Re: An array destructing specification choice

2011-11-07 Thread Till Schneidereit
 I.e., I think the most easily comprehensible behavior is to make array
 destructuring treat the RHS as an Array.
 It matches the common use-case (actual arrays), it is consistent (does
 the same whether you use ... or not), and is easily explainable.

I agree with the consistency argument. The reason I'm in favor of [no,
no] is that otherwise
[x,y,z] = {0:0, 1:1, 2:2}
would result in
x=undefined,y=undefined,z=undefined

That doesn't seem desirable to me.


[Proxies] Refactoring prototype climbing in the spec

2011-11-07 Thread Tom Van Cutsem
Hi,

I wrote up an initial (but fairly complete) draft of a proposed refactoring
of the ES5 [[Get]], [[Put]] and [[HasProperty]] algorithms to change the
way in which these algorithms climb the prototype chain: 
http://wiki.ecmascript.org/doku.php?id=strawman:refactoring_put

This is mainly beneficial for proxies, as the prototype walking strategy
becomes observable to proxies-used-as-prototypes.

IMHO, the refactored algorithms interoperate with proxies in a much saner
way, finally restoring the intuitive semantics of the get and set traps
that MarkM and I had in mind from the beginning, but which Sean Eagan
pointed out were flawed given the ES5 spec algorithms.
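
To give a feel for the shape of the change, here is a toy plain-JS model (not
the spec algorithm, and eliding proxies entirely): the lookup climbs the
prototype chain but threads the original receiver through, so whatever object
ends up answering -- including a proxy used as a prototype -- still sees the
object on which the access started.

  // Toy model of a receiver-threading get: obj.[[GetP]](P, Receiver)
  function getP(obj, name, receiver) {
    var desc = Object.getOwnPropertyDescriptor(obj, name);
    if (desc === undefined) {
      var proto = Object.getPrototypeOf(obj);
      return proto === null ? undefined
                            : getP(proto, name, receiver); // climb, keep the receiver
    }
    if ('value' in desc) return desc.value;                 // data property
    return desc.get ? desc.get.call(receiver) : undefined;  // accessor sees the receiver
  }

  // getP(o, 'foo', o) models what an ES5-style o.[[Get]]('foo') does today.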

The biggest change is in the [[Put]] algorithm. For those not into ES spec
language, I wrote up the behavior for ES5 [[Put]] and my proposed ES.next
[[Put]] in JS itself:
ES5 [[Put]]: 
http://code.google.com/p/es-lab/source/browse/trunk/src/es5adapt/setProperty.js#115

Proposed ES.next [[Put]]: 
http://code.google.com/p/es-lab/source/browse/trunk/src/es5adapt/setProperty.js#68


The refactored version also fixes the anomalies resulting from the ES5
[[CanPut]] vs. [[Put]] split that Andreas Rossberg pointed out earlier on
this list.

When I say "refactoring" here, I really do intend for these new algorithms
to be equivalent to the ES5 algorithms for non-proxy objects. To test
whether these algorithms are indeed equivalent, I wrote up a little
test-suite that runs in the browser: 
http://es-lab.googlecode.com/svn/trunk/src/es5adapt/testSetProperty.html

The results look promising (success on Firefox 7, one failure on Chrome/Safari
because these allow overriding of non-writable inherited data props; I
haven't tested other browsers yet). Still, the more es-discuss eyeballs
that can scrutinize these algorithms, the better.

Cheers,
Tom


Re: An array destructing specification choice

2011-11-07 Thread Allen Wirfs-Brock

On Nov 7, 2011, at 2:18 AM, Andreas Rossberg wrote:

 On 5 November 2011 17:44, Brendan Eich bren...@mozilla.com wrote:
 Destructuring is irrefutable in that it desugars to assignments from 
 properties of the RHS. It is not typed; it is not refutable
 
 I don't think that's true, at least not in the usual sense of
 irrefutable pattern. Because you can write
 
  let {x} = 666
 
 which will be refuted, by raising a TypeError.

No.

It does ToObject(666) and then looks for the x property of the resulting 
wrapper object.  Assuming it does find one (it could, for example, because 
Number.prototype.x = 42), it assigns that value to x.  If it doesn't 
find the property, it assigns undefined.
For 
 let {x=5} = 666;

It would assign 5 to x if the x property of the wrapper was not found.
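
For concreteness, a sketch of that behavior (assuming the ToObject-based
semantics; the prototype extension is contrived, just to make the wrapper
lookup visible):

  Number.prototype.x = 42;

  let {x} = 666;       // ToObject(666), then read .x     ->  x === 42
  let {y} = 666;       // no 'y' found on the wrapper     ->  y === undefined
  let {z = 5} = 666;   // 'z' not found, default applies  ->  z === 5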

 
 Of course, the real question is, what does this do:
 
  let {} = 666

It does ToObject(666)



 
 /Andreas
 



Re: An array destructing specification choice

2011-11-07 Thread Andreas Rossberg
On 7 November 2011 17:07, Allen Wirfs-Brock al...@wirfs-brock.com wrote:
  let {x} = 666

 which will be refuted, by raising a TypeError.

 No,

 It does ToObject(666) and then looks for the x property of the resulting 
 wrapper object.

Ouch, really?  I don't see that in the proposal
(http://wiki.ecmascript.org/doku.php?id=harmony:destructuring), and to
be honest, it sounds like a horrible idea.  It is just another way to
silently inject an `undefined' that is tedious to track down.  We
already have too many of those...

When would this ever be useful behaviour instead of just obfuscating bugs?

/Andreas


Re: An array destructing specification choice

2011-11-07 Thread Andreas Rossberg
On 7 November 2011 17:34, Allen Wirfs-Brock al...@wirfs-brock.com wrote:
  It is just another way to
 silently inject an `undefined' that is tedious to track down.  We
 already have too many of those...

 It is how the language currently behaves in all situations where an object is 
 needed but a primitive value is provided.
 We want consistency in language design, not a hodgepodge of special cases and 
different rules.

Hm, I don't quite buy that. There are plenty of places in ES today
where we don't convert but throw, e.g. in, instanceof, various
methods of Object, etc.  Destructuring arguably is closely related to
operators like in.  Implicit conversion would violate the principle
of least surprise for either, IMHO.

I agree that consistency is a nice goal, but it seems like that train
is long gone for ES. Also, if consistency implies proliferating an
existing design mistake then I'm not sure it should have the highest
priority.


 When would this ever be useful behaviour instead of just obfuscating bugs?

 let {toFixed, toExponential} = 42;

OK, I guess useful is a flexible term. Would you recommend using
that style as a feature?

/Andreas


Re: An array destructing specification choice

2011-11-07 Thread Axel Rauschmayer
How about:

let {length} = "abc";

I think the conversion keeps the illusion alive that every value in JS is an 
object.
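
(A sketch of what the ToObject-based semantics gives here, assuming the usual
String wrapper behavior:)

  let {length} = "abc";          // ToObject("abc") is a String wrapper  ->  length === 3
  let [first, second] = "abc";   // indexed access on the wrapper        ->  "a", "b"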

On Nov 7, 2011, at 18:21 , Andreas Rossberg wrote:

 On 7 November 2011 17:34, Allen Wirfs-Brock al...@wirfs-brock.com wrote:
  It is just another way to
 silently inject an `undefined' that is tedious to track down.  We
 already have too many of those...
 
 It is how the language currently behaves in all situations where an object 
 is needed but a primitive value is provided.
  We want consistency in language design, not a hodgepodge of special cases 
 and different rules.
 
 Hm, I don't quite buy that. There are plenty of places in ES today
 where we don't convert but throw, e.g. in, instanceof, various
 methods of Object, etc.  Destructuring arguably is closely related to
 operators like in.  Implicit conversion would violate the principle
 of least surprise for either, IMHO.
 
 I agree that consistency is a nice goal, but it seems like that train
 is long gone for ES. Also, if consistency implies proliferating an
 existing design mistake then I'm not sure it should have the highest
 priority.
 
 
 When would this ever be useful behaviour instead of just obfuscating bugs?
 
 let {toFixed, toExponential} = 42;
 
 OK, I guess useful is a flexible term. Would you recommend using
 that style as a feature?
 
 /Andreas
 

-- 
Dr. Axel Rauschmayer
a...@rauschma.de

home: rauschma.de
twitter: twitter.com/rauschma
blog: 2ality.com





Re: An array destructing specification choice

2011-11-07 Thread Allen Wirfs-Brock

On Nov 7, 2011, at 9:32 AM, Axel Rauschmayer wrote:

 How about:
 
 let {length} = "abc";

or
   let [first,second] = "abc";


 
 I think the conversion keeps the illusion alive that every value in JS is an 
 object.
 
 On Nov 7, 2011, at 18:21 , Andreas Rossberg wrote:
 
 On 7 November 2011 17:34, Allen Wirfs-Brock al...@wirfs-brock.com wrote:
  It is just another way to
 silently inject an `undefined' that is tedious to track down.  We
 already have too many of those...
 
 It is how the language currently behaves in all situations where an object 
 is needed but a primitive value is provided.
  We want consistency in language design, not a hodgepodge of special cases 
 and different rules.
 
 Hm, I don't quite buy that. There are plenty of places in ES today
 where we don't convert but throw, e.g. in, instanceof, various
 methods of Object, etc.  Destructuring arguably is closely related to
 operators like in.  Implicit conversion would violate the principle
 of least surprise for either, IMHO.
 
 I agree that consistency is a nice goal, but it seems like that train
 is long gone for ES. Also, if consistency implies proliferating an
 existing design mistake then I'm not sure it should have the highest
 priority.
 
 
 When would this ever be useful behaviour instead of just obfuscating bugs?
 
 let {toFixed, toExponential} = 42;
 
 OK, I guess useful is a flexible term. Would you recommend using
 that style as a feature?
 
 /Andreas
 
 
 -- 
 Dr. Axel Rauschmayer
 a...@rauschma.de
 
 home: rauschma.de
 twitter: twitter.com/rauschma
 blog: 2ality.com
 
 
 



Re: An array destructing specification choice

2011-11-07 Thread Allen Wirfs-Brock

On Nov 7, 2011, at 9:21 AM, Andreas Rossberg wrote:

 On 7 November 2011 17:34, Allen Wirfs-Brock al...@wirfs-brock.com wrote:
  It is just another way to
 silently inject an `undefined' that is tedious to track down.  We
 already have too many of those...
 
 It is how the language currently behaves in all situations where an object 
 is needed but a primitive value is provided.
  We want consistency in language design, not a hodgepodge of special cases 
 and different rules.
 
 Hm, I don't quite buy that. There are plenty of places in ES today
 where we don't convert but throw, e.g. in, instanceof, various
 methods of Object, etc.  Destructuring arguably is closely related to
 operators like in.  Implicit conversion would violate the principle
 of least surprise for either, IMHO.

True, in and instanceof don't follow the rules.  I note that they were 
added in ES3 and I have to wonder if they aren't another case of features being 
added without sufficient thought being given to maintaining consistency of 
behavior throughout the language. I don't know, I wasn't there.

 The same can be said for a few cases in the Object functions that were added 
for ES5.  If I had the same depth of understanding of the internals of the 
language as I do now, I probably would have objected to those variances.

 
 I agree that consistency is a nice goal, but it seems like that train
 is long gone for ES. Also, if consistency implies proliferating an
 existing design mistake then I'm not sure it should have the highest
 priority.

Perhaps not the highest priority, but still a priority.

As the specification writer, I have in my head (yes, it would be good to write 
them down) a set of routine and consistent behaviors that I apply as I compose 
the specification algorithms.  I think this is similar to the conceptual 
understanding of the language that an expert JS programmer uses as they write 
code. Whenever something deviates from that norm, it has to be given special 
consideration.  For a new feature, my starting assumption is always that it 
will follow the norm.  Increasing the number of deviations from the norm 
doesn't necessarily make the language better, but it certainly makes it less 
internally consistent and harder to reason about.

Whether or not a particular consistent behavior was a design mistake is 
usually a subjective evaluation, and I'm not sure it is particularly 
relevant.  The core language is what it is, and that is what we have to work 
with.  Most such mistakes can't be pervasively fixed.  It isn't at all clear 
to me that spot-fixing only new occurrences of such mistakes makes JS a 
better language.

Allen


Re: Are some es-discuss submissions automatically blocked?

2011-11-07 Thread David Bruant
On 06/11/2011 15:37, Axel Rauschmayer wrote:
 Claus Reinke could not submit his js-tools discussion group
 announcement (interestingly, I could do it for him). And the email I
 appended underneath my signature never got through. Can someone
 explain the blocking criteria?
I have experienced similar problems at some point. I don't know what the
blocking criteria are. Maybe an anti-spam filter trying to be smarter than it is?

David


 Thanks!

 Axel

 -- 
 Dr. Axel Rauschmayer
 a...@rauschma.de mailto:a...@rauschma.de

 home: rauschma.de http://rauschma.de
 twitter: twitter.com/rauschma http://twitter.com/rauschma
 blog: 2ality.com http://2ality.com

 
 Subject: Fixing one last quirk

 With Allen's decoupling [ ] and property access for collections [1],
 all of the JavaScript quirks that I can think of will be fixed in
 ECMAScript.next (including, hopefully, typeof null). Except for one:
 solving dynamic `this` is still in limbo (as far as I can tell).

 It would be really nice if it could be fixed for ES.next, it is
 surprisingly easy to get it wrong.

 Any thoughts? For me, lambda blocks would do the trick. Will those be
 in ES.next? Could functions adopt their semantics of picking up the
 `this` of the surrounding scope when not invoked as methods? It seems
 like that could work in strict mode where no one expects `this` to
 have a value.






Re: Are some es-discuss submissions automatically blocked?

2011-11-07 Thread Felipe Gasper

On 11/7/11 2:48 PM, David Bruant wrote:

On 06/11/2011 15:37, Axel Rauschmayer wrote:

Claus Reinke could not submit his js-tools discussion group
announcement (interestingly, I could do it for him). And the email I
appended underneath my signature never got through. Can someone
explain the blocking criteria?

I have experienced similar problems at some point. I don't know what the
blocking criteria are. Maybe an anti-spam filter trying to be smarter than it is?


I’ve had this happen, too.

-FG


Re: Are some es-discuss submissions automatically blocked?

2011-11-07 Thread Axel Rauschmayer
 Claus Reinke could not submit his js-tools discussion group announcement 
 (interestingly, I could do it for him). And the email I appended underneath 
 my signature never got through. Can someone explain the blocking criteria?
 I have experienced similar problems at some point. I don't know what the 
 blocking criteria are. Maybe an anti-spam filter trying to be smarter than it is?


That would be my guess, too.

-- 
Dr. Axel Rauschmayer
a...@rauschma.de

home: rauschma.de
twitter: twitter.com/rauschma
blog: 2ality.com





Re: Are some es-discuss submissions automatically blocked?

2011-11-07 Thread Brendan Eich
We use Google Postini in concert with Mailman. Postini needs to be told 
sometimes. If you don't see mail get through, mail es-discuss-ow...@mozilla.org 
about it.

/be

On Nov 7, 2011, at 12:48 PM, David Bruant wrote:

 On 06/11/2011 15:37, Axel Rauschmayer wrote:
 
 Claus Reinke could not submit his js-tools discussion group announcement 
 (interestingly, I could do it for him). And the email I appended underneath 
 my signature never got through. Can someone explain the blocking criteria?
 I have experienced similar problems at some point. I don't know what the 
  blocking criteria are. Maybe an anti-spam filter trying to be smarter than it is?
 
 David
 
 
 Thanks!
 
 Axel
 
 -- 
 Dr. Axel Rauschmayer
 a...@rauschma.de
 
 home: rauschma.de
 twitter: twitter.com/rauschma
 blog: 2ality.com
 
 
 Subject: Fixing one last quirk
 
 With Allen’s “decoupling [ ] and property access for collections” [1], all 
 of the JavaScript quirks that I can think of will be fixed in 
 ECMAScript.next (including, hopefully, typeof null). Except for one: solving 
 dynamic `this` is still in limbo (as far as I can tell).
 
 It would be really nice if it could be fixed for ES.next, it is surprisingly 
 easy to get it wrong.
 
 Any thoughts? For me, lambda blocks would do the trick. Will those be in 
 ES.next? Could functions adopt their semantics of picking up the `this` of 
 the surrounding scope when not invoked as methods? It seems like that could 
 work in strict mode where no one expects `this` to have a value.
 
 
 
 



Re: An array destructing specification choice

2011-11-07 Thread Brendan Eich
On Nov 7, 2011, at 12:59 AM, Lasse Reichstein wrote:

 If the object being destructured is in fact a plain Array, with no
 inherited elements above the length, then there is no difference.
 This is most likely the (very) common use case. This is what the ES
 programmers' intuition will be based on.

Agreed.


 The original question was what an ES programmer would expect.
 I think they will probably expect array-like destructuring patterns to treat the
 RHS as an array(-like object).
 I.e., [yes,yes].

js> a = [0,1]
[0, 1]
js> Array.prototype[2] = 2
2
js> a.length
2
js> a[2]
2

We do not filter Array [[Get]] of an index that happens to name an inherited 
property based on length.

This still doesn't mean a whole lot, as you say. It's the very uncommon case. 
But it means that with inherited indexed properties, length is not always 
overriding.


 That also has the advantage of actually providing otherwise
 unavailable functionality.
 You can write {0:x, 1:y, 2:z} instead of [x,y,z] if you want
 object-behavior, but if they are the same, you can't get array-like
 behavior.

This is a good point. Allen made it too, IIRC.


 Arrays are just an abstraction in ECMAScript, which all the
 Array.prototype methods that are intentionally generic prove. If it
 quacks like an Array and swims like an Array, we allow ourselves to
 treat it like an Array.

See above for a meow from the 
array-with-inherited-indexed-properties-not-below-length duck. But that's an 
edge case, I agree.


 I.e., I think the most easily comprehensible behavior is to make array
 destructuring treat the RHS as an Array.
 It matches the common use-case (actual arrays), it is consistent (does
 the same whether you use ... or not), and is easily explainable.

The destructuring becomes a bit more complicated, with a temporary for 
rhs.length and a one-time up-front get of that property, and lhs positional 
index tests against that length temporary. Still bugs me, probably as an 
implementor but also just in terms of more complicated desugaring.
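
Roughly, that [yes, yes] desugaring would look like this (a sketch only, not
proposed spec text; the underscored temporary is hypothetical):

  var rhs = {0: 'a', 1: 'b', 2: 'c', 3: 'd', length: 3};

  // [x, y, ...r] = rhs  under [yes, yes]:
  var _len = rhs.length >>> 0;             // one up-front read, converted to Uint32
  var x = 0 < _len ? rhs[0] : undefined;   // positional index tested against _len
  var y = 1 < _len ? rhs[1] : undefined;
  var r = [];
  for (var i = 2; i < _len; i++) r[i - 2] = rhs[i];
  // here: x === 'a', y === 'b', r is ['c']; index 3 is ignored because length is 3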

/be


Re: An array destructing specification choice

2011-11-07 Thread Brendan Eich
On Nov 7, 2011, at 2:18 AM, Andreas Rossberg wrote:

 On 5 November 2011 17:44, Brendan Eich bren...@mozilla.com wrote:
 Destructuring is irrefutable in that it desugars to assignments from 
 properties of the RHS. It is not typed; it is not refutable
 
 I don't think that's true, at least not in the usual sense of
 irrefutable pattern. Because you can write
 
  let {x} = 666
 
 which will be refuted, by raising a TypeError.

Nope. You get undefined. That's why it's irrefutable -- you can't build 
refutable matching on this (you'd need an OOB value other than undefined, or 
exceptions).

js> let {x} = 666
js> x
js> 


 Of course, the real question is, what does this do:
 
  let {} = 666

No-op. We worked this out for ES4; I had originally made it an early error, but 
Lars Hansen argued for the 0-identifier basis case:

js> let {} = 666
js> 

This can simplify code generators slightly. It's not a big deal, but I agree 
with Lars: there should be no error case here.

/be


Re: An array destructing specification choice

2011-11-07 Thread Brendan Eich
On Nov 7, 2011, at 3:04 AM, Andreas Rossberg wrote:

 On 5 November 2011 19:55, Brendan Eich bren...@mozilla.com wrote:
 On Nov 5, 2011, at 9:38 AM, Allen Wirfs-Brock wrote:
 
  In a similar vein, what is the value of r in:
 
 let [z,y,...r] = {0:0, 1:1, 2:2, length: 3, 3:3,4:4};
 
 should it be [2] or [2,3,4]  (and if the latter how is that determined)?
 
 The inspiration for ... in the past came from (among other sources) 
 Successor ML:
 
 http://successor-ml.org/index.php?title=Functional_record_extension_and_row_capture
 
 Since I actually wrote half of that, I feel obliged to say that it
 does not answer the questions raised here. ML is a typed language, and
 contrary to popular belief, many language design problems are much
 easier to solve in a typed setting.

Absolutely. Remember, we sought inspiration there back in ES4 days, with 
optional types of some sort hovering (and eventually flying away, presumably to 
Denmark ;-).


 However, there is some inspiration in the way SML treats tuples as
 special cases of records, very much like arrays are a special case of
 objects in JS. In particular, all of SML's pattern matching rules for
 tuples follow just from the way they desugar into records with numeric
 labels.

Yes, this was our thinking for destructuring, which first appeared in Opera, got 
some ES4 wiki-level spec drafting, and fed into the SpiderMonkey and Rhino 
implementations.


 One possible semantics could be treating
 
  let [x, y, z, ...r] = e
 
 as equivalent to
 
  let {0: x, 1: y, 2: z, ..._r} = e
  let r = [].slice.call(_r, 3)
 
 where I assume the canonical matching semantics for object rest
 patterns that would make _r an ordinary object (not an array)
 accumulating all properties of e not explicitly matched (even if e
 itself is an array, in which case _r includes a copy of e's length
 property). Of course, engines would optimize properly.

Right, but why the 3 passed to slice.call if _r captured all enumerable 
properties except those with ids 0, 1, and 2 (stringified, of course)?

Anyway, you've hit what I was advocating over the weekend as the answer to the 
pair of questions I posed: [no, no]. Lasse makes a good case for [yes, yes]. I 
still think we should argue about row capture in object patterns a bit before 
concluding. What do you think?


 (But yes, row capture for objects introduces a form of object cloning,
 as Allen points out.)

Shallow, though. No closure cloning, e.g. Clone as curse-word shouldn't shoot 
this down without specific argument.

/be



Re: An array destructing specification choice

2011-11-07 Thread Brendan Eich
On Nov 7, 2011, at 10:03 AM, Allen Wirfs-Brock wrote:

 True, in and instanceof don't follow the rules.  I note that they were 
 added in ES3 and I have to wonder if they aren't another case of features 
 being added without sufficient thought being given to maintaining consistency 
 of behavior throughout the language. I don't know, I wasn't there.

I was not there for those, either. I talked with the Netscape folks who were, 
though. The 'in' mismatch is even more vexing because at the time IE was forcing 
all other implementations to treat for (i in null); and for (i in undefined); 
loops as zero-iteration non-errors -- ultimately to great success in ES5.

'instanceof' is broken in a number of ways.

Let's not get into the ES3-era Object.prototype extensions, especially 
propertyIsEnumerable (which does *not* climb the prototype chain, whereas 
for/in and in do) or isPrototypeOf. Ok, I named 'em. Shutting up now.

Really I think this is more committee selection bias shift and a failure to 
review the whole to check various kinds of consistency. We need to do better, 
not saying we will or throwing stones backward in time here.


 The same can be said for a few cases in the Object functions that were added 
 for ES5.  If I had the same depth of understanding of the internals of the 
 language as I do now, I probably would have objected to those variances.

Yup. Evolution is like that.

/be


Re: [Proxies] Refactoring prototype climbing in the spec

2011-11-07 Thread David Bruant
Hi Tom,

Thanks for all this work!

On 07/11/2011 16:54, Tom Van Cutsem wrote:
 Hi,

 I wrote up an initial (but fairly complete) draft of a proposed
 refactoring of the ES5 [[Get]], [[Put]] and [[HasProperty]] algorithms
 to change the way in which these algorithms climb the prototype
 chain: http://wiki.ecmascript.org/doku.php?id=strawman:refactoring_put

 This is mainly beneficial for proxies, as the prototype walking
 strategy becomes observable to proxies-used-as-prototypes.

 IMHO, the refactored algorithms interoperate with proxies in a much
 saner way, finally restoring the intuitive semantics of the get and
 set traps that MarkM and I had in mind from the beginning, but which
 Sean Eagan pointed out were flawed given the ES5 spec algorithms.
I am a big fan of this refactoring.
I like the idea that the algorithm transfers full control to the proxy.
Also, accepting this strawman would solve my inherited event properties
problem because I would have access to the receiver from the get trap
and would be able to do the binding I want.

About [[SetP]]:
* It seems to me that we've traded a duplicated [[GetProperty]] call for
a duplicated [[GetOwnProperty]] call (steps 1 and 5.a) on the receiver.
This could be avoided by storing the property descriptor at the
beginning (when O.[[SetP]](Receiver, P, V) is called and O ===
Receiver). Step 5.a could be removed by this change.
* If step 5.a is removed, I think that step 5.b.i is useless, because we
would have returned from the set when O was the receiver (by step 2.a of
[[SetP]]).
* If step 5.a is removed, then step 5.c is useless, because if the desc
had a [[set]], then we would have already returned from one of the
substeps of step 3 when O was the receiver.

Why not redefine [[Get]] as what you have defined as [[GetP]], and
do the equivalent for [[Set]] and [[SetP]]?
The current 8.12.3 [[Get]] (P) would become [[Get]] (Receiver, P). It would
be called with O as the initial value for Receiver pretty much everywhere in
the spec, except within [[Get]]'s recursive calls.
The same goes for [[Set]].
It would avoid replacing 2 Object.* methods with 2 others.

With this refactoring, in the direct proxy strawman, I don't think we
need the proxy argument for the get and set traps anymore. It was there
as the receiver, but since we now have the receiver itself, the get and set
traps can just take the receiver and the target as arguments.

David


New ES6 draft posted to wiki

2011-11-07 Thread Allen Wirfs-Brock
The November 7, 2011 (also known as Rev 4) draft is available at 
http://wiki.ecmascript.org/doku.php?id=harmony:specification_drafts 

Let me know if you really need the docx-format copy and I'll send it to you.  I 
haven't found a way to get DokuWiki to allow me to upload docx files.  It seems 
to think they are an evil phishing zip file.

Allen