Re: July 25, 2012 - TC39 Meeting Notes

2012-08-14 Thread Andreas Rossberg
On 29 July 2012 03:58, Brendan Eich bren...@mozilla.org wrote:
 Allen Wirfs-Brock wrote:
 I really think in a language where we have both [[Put]] and
 [[DefineOwnProperty]] semantics that we really need both = and :=

 I can buy that, and I'm glad you mention := as it is not just an assignment
 operator (e.g. in Pascal or Ada), it's also Go's declare-and-init operator.
 It has the right characters, fuzzy meaning from other languages, and the
 critical = char in particular.

There is a far longer tradition and a significantly larger body of
languages that use = for definition and := for assignment (including
all languages in the Algol & Pascal tradition). So going with an
inverted meaning in JS sounds like an awful idea to me (as does using
Go for inspiration about anything related to declaration syntax ;) ).

/Andreas
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-14 Thread Allen Wirfs-Brock

On Aug 14, 2012, at 4:20 AM, Andreas Rossberg wrote:

 On 29 July 2012 03:58, Brendan Eich bren...@mozilla.org wrote:
 Allen Wirfs-Brock wrote:
 I really think in a language where we have both [[Put]] and
 [[DefineOwnProperty]] semantics that we really need both = and :=
 
 I can buy that, and I'm glad you mention := as it is not just an assignment
 operator (e.g. in Pascal or Ada), it's also Go's declare-and-init operator.
 It has the right characters, fuzzy meaning from other languages, and the
 critical = char in particular.
 
 There is a far longer tradition and a significantly larger body of
 languages that use = for definition and := for assignment (including
 all languages in the Algol & Pascal tradition). So going with an
 inverted meaning in JS sounds like an awful idea to me (as does using
 Go for inspiration about anything related to declaration syntax ;) ).

About as awful as using [ ] as the indexing operator when every FORTRAN 
programmer knows that ( ) is how  you do subscripting.  Not to mention what 
Smalltalk programmers think [ ] means.

There is value in using familiar looking symbols but I think it is unrealistic 
to expect common semantics among different languages.

Allen





 
 /Andreas
 

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-14 Thread Brendan Eich

Allen Wirfs-Brock wrote:

On Aug 14, 2012, at 4:20 AM, Andreas Rossberg wrote:


  On 29 July 2012 03:58, Brendan Eich bren...@mozilla.org wrote:

  Allen Wirfs-Brock wrote:

  I really think in a language where we have both [[Put]] and
  [[DefineOwnProperty]] semantics that we really need both = and :=
  
  I can buy that, and I'm glad you mention := as it is not just an assignment

  operator (e.g. in Pascal or Ada), it's also Go's declare-and-init operator.
  It has the right characters, fuzzy meaning from other languages, and the
  critical = char in particular.
  
  There is a far longer tradition and a significantly larger body of

  languages that use = for definition and := for assignment (including
  all languages in the Algol & Pascal tradition). So going with an
  inverted meaning in JS sounds like an awful idea to me (as does using
  Go for inspiration about anything related to declaration syntax ;) ).


About as awful as using [ ] as the indexing operator when every FORTRAN 
programmer knows that ( ) is how  you do subscripting.  Not to mention what 
Smalltalk programmers think [ ] means.

There is value in using familiar looking symbols but I think it is unrealistic 
to expect common semantics among different languages.


After more soak-time on this, I'm on Andreas's side.

Yes, symbols will be used differently by different languages. No, () for 
indexing is not expected in modern languages -- Fortran like Disco and 
the American Drive-In may never die, but it is rare to find in the wild 
or taught in universities.


Doug's confusion was not unique. We may want syntax for redefinition, 
but assignment is the dominant trope and it will still be, even with := 
or <- or whatever the syntax might be. Perhaps syntax is not needed so 
much as Object.define and good docs for when to use it.


/be
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-02 Thread Axel Rauschmayer
What’s the best material for reading up on the “override mistake”?
This? http://wiki.ecmascript.org/doku.php?id=strawman:fixing_override_mistake


On Aug 1, 2012, at 21:56 , Mark S. Miller erig...@google.com wrote:

 On Tue, Jul 31, 2012 at 9:05 PM, Brendan Eich bren...@mozilla.org wrote:
 This was debated at last week's TC39 meeting. Between the desire to preserve
 this symmetry (not paramount, there are many dimensions and symmetries to
 consider) and the V8 bug being fixed (and the JSC bug on which the V8 bug
 was based already being fixed in iOS6), I believe we kept consensus to
 follow the spec.
 
 For the record, I continue to think this is a bad idea, and that we
 should lose the symmetry for gains elsewhere. So I'd say we failed to
 gain consensus to change the spec. Since consensus is needed to change
 the spec, the spec is likely to remain unchanged in this regard.
 
 
 -- 
Cheers,
--MarkM
 ___
 es-discuss mailing list
 es-discuss@mozilla.org
 https://mail.mozilla.org/listinfo/es-discuss
 

-- 
Dr. Axel Rauschmayer
a...@rauschma.de

home: rauschma.de
twitter: twitter.com/rauschma
blog: 2ality.com

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-02 Thread Mark S. Miller
Yup. It's not very much, but since it seems hopeless it's hard to find
time to write the rest.

On Thu, Aug 2, 2012 at 4:02 PM, Axel Rauschmayer a...@rauschma.de wrote:
 What’s the best material for reading up on the “override mistake”?
 This?
 http://wiki.ecmascript.org/doku.php?id=strawman:fixing_override_mistake


 On Aug 1, 2012, at 21:56 , Mark S. Miller erig...@google.com wrote:

 On Tue, Jul 31, 2012 at 9:05 PM, Brendan Eich bren...@mozilla.org wrote:

 This was debated at last week's TC39 meeting. Between the desire to preserve
 this symmetry (not paramount, there are many dimensions and symmetries to
 consider) and the V8 bug being fixed (and the JSC bug on which the V8 bug
 was based already being fixed in iOS6), I believe we kept consensus to
 follow the spec.


 For the record, I continue to think this is a bad idea, and that we
 should lose the symmetry for gains elsewhere. So I'd say we failed to
 gain consensus to change the spec. Since consensus is needed to change
 the spec, the spec is likely to remain unchanged in this regard.


 --
Cheers,
--MarkM
 ___
 es-discuss mailing list
 es-discuss@mozilla.org
 https://mail.mozilla.org/listinfo/es-discuss


 --
 Dr. Axel Rauschmayer
 a...@rauschma.de

 home: rauschma.de
 twitter: twitter.com/rauschma
 blog: 2ality.com




-- 
Cheers,
--MarkM
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-02 Thread Axel Rauschmayer
I’m possibly repeating old arguments, but if the “mistake” was fixed in ES6, 
you could still get the ES5.1 behavior by introducing a setter that throws an 
exception, right?
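For illustration, a minimal sketch of that workaround (the property names 'version' and 'edition' are made up; this assumes a hypothetical fixed-override world where a plain non-writable data property would no longer block shadowing):

var proto = {};

// An accessor with no setter: assignment on objects inheriting from proto
// still fails (TypeError in strict mode, silent no-op in sloppy mode).
Object.defineProperty(proto, "version", {
    get: function () { return 5.1; }
});

// Or an explicit throwing setter, if a custom error is wanted; this throws
// in sloppy code too, because the inherited setter actually runs.
Object.defineProperty(proto, "edition", {
    get: function () { return 5.1; },
    set: function () { throw new TypeError("edition is not assignable"); }
});

var obj = Object.create(proto);
// obj.version = 6;  // TypeError (strict) / silently ignored (sloppy)
// obj.edition = 6;  // TypeError from the setter in either mode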

On Aug 3, 2012, at 1:04 , Mark S. Miller erig...@google.com wrote:

 Yup. It's not very much, but since it seems hopeless it's hard to find
 time to write the rest.
 
 On Thu, Aug 2, 2012 at 4:02 PM, Axel Rauschmayer a...@rauschma.de wrote:
 What’s the best material for reading up on the “override mistake”?
 This?
 http://wiki.ecmascript.org/doku.php?id=strawman:fixing_override_mistake

-- 
Dr. Axel Rauschmayer
a...@rauschma.de

home: rauschma.de
twitter: twitter.com/rauschma
blog: 2ality.com

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-02 Thread Mark S. Miller
yes. Or simply an accessor property without a setter.

On Thu, Aug 2, 2012 at 4:08 PM, Axel Rauschmayer a...@rauschma.de wrote:
 I’m possibly repeating old arguments, but if the “mistake” was fixed in ES6,
 you could still get the ES5.1 behavior by introducing a setter that throws
 an exception, right?

 On Aug 3, 2012, at 1:04 , Mark S. Miller erig...@google.com wrote:

 Yup. It's not very much, but since it seems hopeless it's hard to find
 time to write the rest.

 On Thu, Aug 2, 2012 at 4:02 PM, Axel Rauschmayer a...@rauschma.de wrote:

 What’s the best material for reading up on the “override mistake”?
 This?
 http://wiki.ecmascript.org/doku.php?id=strawman:fixing_override_mistake


 --
 Dr. Axel Rauschmayer
 a...@rauschma.de

 home: rauschma.de
 twitter: twitter.com/rauschma
 blog: 2ality.com




-- 
Cheers,
--MarkM
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-01 Thread David Bruant

On 01/08/2012 00:05, Brendan Eich wrote:

David Bruant wrote:
From a practical point of view, if 2 implementations differ on one 
aspect of the language, it means that there is no content relying on 
either of the 2 implementations for that aspect of the language, 
whether they follow the spec or even both diverge differently from it.


It's not that simple on the web. For instance, longstanding IE vs. 
Netscape/Mozilla forking, e.g.


  if (document.all) { ... } else { ... }

can mean some divergence among the else browsers is ok because not 
relied on there, but not ok in the then case.

What an intricate case.
By the way, I recall something I learned from @mathias. In Chrome:

console.log(document.all); // shows an object in the console
console.log(typeof document.all) // undefined
'all' in document // true
console.log(!!document.all) // false

Such a thing cannot be represented in pure ECMAScript, not even with 
proxies. I don't think there is anything that can be done in ECMAScript 
to fix this, but it's worth sharing this information.


You're probably right, but we are not making data and accessors 
asymmetric in the sense that a non-writable data property and a 
get-only accessor on a prototype object both throw (strict) or 
silently fail to update the LHS (non-strict) on assignment that would 
otherwise create a shadowing property in a delegating object.


This was debated at last week's TC39 meeting. Between the desire to 
preserve this symmetry (not paramount, there are many dimensions and 
symmetries to consider) and the V8 bug being fixed (and the JSC bug on 
which the V8 bug was based already being fixed in iOS6), I believe we 
kept consensus to follow the spec.
That's fine. I was only noting that the door was open. There is no 
reason to be forced to take it. Interoperability is however a good 
reason to make a choice whatever it is.


David
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-01 Thread Brendan Eich

David Bruant wrote:

By the way, I recall something I learned from @mathias. In Chrome:

console.log(document.all); // shows an object in the console
console.log(typeof document.all) // undefined
'all' in document // true
console.log(!!document.all) // false

Such a thing cannot be represented in pure ECMAScript, not even with 
proxies. I don't think there is anything that can be done in 
ECMAScript to fix this, but it's worth sharing this information.


This originated in SpiderMonkey for Firefox 1.0, see

https://bugzilla.mozilla.org/show_bug.cgi?id=246964

There, I used a cheap heuristic bytecode analysis to distinguish 
undetected document.all uses, which some content featured (the authors 
assumed IE only; IE touched 95% market share in 2002), from properly 
object-detected uses. The latter must be falsy but the former could be 
emulated for greater de-facto web compatibility.


Later, WebKit solved the same problem with a masqueradesAsUndefined flag 
set on certain objects, rather than code analysis. This is similar to 
how value objects 
(http://wiki.ecmascript.org/doku.php?id=strawman:value_objects, 
https://bugzilla.mozilla.org/show_bug.cgi?id=749786) can be falsy.


But notice how value objects are immutable and so compare === by 
shallow-enough value. That is not how document.all works -- it's a 
mutable magic/live collection.


We might end up standardizing something for value proxies or value 
objects that allows JS to self-host this undetected document.all 
emulation hack. No promises, and no rush.


/be

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-01 Thread Mark S. Miller
On Tue, Jul 31, 2012 at 9:05 PM, Brendan Eich bren...@mozilla.org wrote:
 This was debated at last week's TC39 meeting. Between the desire to preserve
 this symmetry (not paramount, there are many dimensions and symmetries to
 consider) and the V8 bug being fixed (and the JSC bug on which the V8 bug
 was based already being fixed in iOS6), I believe we kept consensus to
 follow the spec.

For the record, I continue to think this is a bad idea, and that we
should lose the symmetry for gains elsewhere. So I'd say we failed to
gain consensus to change the spec. Since consensus is needed to change
the spec, the spec is likely to remain unchanged in this regard.


-- 
Cheers,
--MarkM
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-01 Thread Brendan Eich

Mark S. Miller wrote:

On Tue, Jul 31, 2012 at 9:05 PM, Brendan Eich bren...@mozilla.org wrote:

This was debated at last week's TC39 meeting. Between the desire to preserve
this symmetry (not paramount, there are many dimensions and symmetries to
consider) and the V8 bug being fixed (and the JSC bug on which the V8 bug
was based already being fixed in iOS6), I believe we kept consensus to
follow the spec.


For the record, I continue to think this is a bad idea, and that we
should lose the symmetry for gains elsewhere. So I'd say we failed to
gain consensus to change the spec. Since consensus is needed to change
the spec, the spec is likely to remain unchanged in this regard.


Fair enough -- sorry I didn't represent this accurately.

But this reminds me to ask: what do you think of Allen's := proposal as 
the better mustache? I realize it doesn't help the Caja vs. legacy problem.


/be
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-08-01 Thread Mark S. Miller
For non-legacy code, given classes and triangle, I don't see the
override mistake as much of a pain point. For co-existence of the
override mistake with legacy code, the only reasonable choice we've
come up with is
http://code.google.com/p/es-lab/source/browse/trunk/src/ses/repairES5.js#347,
which, as you can see, is painful, slow, and unreliable. But I have to
admit that it seems to work well enough in practice.
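Roughly, the repair there replaces a data property on a (soon to be frozen) prototype with an accessor whose setter emulates shadowing on the receiver. A minimal sketch of that idea (not the actual repairES5.js code):

function allowOverride(proto, name) {
    var value = proto[name];
    Object.defineProperty(proto, name, {
        get: function () { return value; },
        set: function (newValue) {
            if (this === proto) {
                throw new TypeError("cannot assign to " + name + " on the prototype itself");
            }
            // Emulate what assignment would have done without the override mistake:
            // create an own, writable property on the receiver.
            Object.defineProperty(this, name, {
                value: newValue, writable: true, enumerable: true, configurable: true
            });
        }
    });
}

allowOverride(Array.prototype, "push");
Object.freeze(Array.prototype);

var a = [];
a.push = function (elem) {};             // now creates an own property on a instead of failing
console.log(a.hasOwnProperty("push"));   // true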

On Wed, Aug 1, 2012 at 12:58 PM, Brendan Eich bren...@mozilla.org wrote:
 Mark S. Miller wrote:

 On Tue, Jul 31, 2012 at 9:05 PM, Brendan Eich bren...@mozilla.org wrote:

 This was debated at last week's TC39 meeting. Between the desire to
 preserve
 this symmetry (not paramount, there are many dimensions and symmetries to
 consider) and the V8 bug being fixed (and the JSC bug on which the V8 bug
 was based already being fixed in iOS6), I believe we kept consensus to
 follow the spec.


 For the record, I continue to think this is a bad idea, and that we
 should lose the symmetry for gains elsewhere. So I'd say we failed to
 gain consensus to change the spec. Since consensus is needed to change
 the spec, the spec is likely to remain unchanged in this regard.


 Fair enough -- sorry I didn't represent this accurately.

 But this reminds me to ask: what do you think of Allen's := proposal as the
 better mustache? I realize it doesn't help the Caja vs. legacy problem.

 /be



-- 
Cheers,
--MarkM
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-31 Thread Allen Wirfs-Brock
The following was WRT [[Put]]/[[CanPut]] semantic issues:  

On Jul 28, 2012, at 6:02 AM, David Bruant wrote:

 On 28/07/2012 14:37, Herby Vojčík wrote:
 ...
 :-/ But that is how it is, no?
 That's what the spec says, but V8 has implemented something else (and I
 haven't seen an intention to change this behavior), so what the spec
 says doesn't really matter.
 
 David

I have to disagree with David's sentiments here. Situations like this are 
exactly why we have standardized specifications. Different implementors can 
easily have differing interpretations about the edge case semantics of loosely 
described features. An important role of standards is to align implementations 
on a common semantics. Sure, an implementation can refuse to go along with the 
specification but that is quite rare, at least for ECMAScript where all major 
implementations seem to recognize the importance of interoperability. In 
particular, I haven't seen any indication that V8, as a matter of policy, is 
refusing to ever correct these deviations.

It's true that what the spec. says makes no difference to the browser bits that 
have already been shipped.  It does make a difference over the long term.  
Single implementation deviations from the specification usually get fixed 
eventually. Conformance to the specs. is a motivator for implementors. 

We really shouldn't foster the meme that "specs don't really matter". They 
matter a lot.

Allen
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-31 Thread Brendan Eich

David Bruant wrote:

That's what the spec says, but V8 has implemented something else (and I
haven't seen an intention to change this behavior), so what the spec
says doesn't really matter.


I missed this until Allen's reply called it out. It is both false 
(Google people at the TC39 meeting last week said it's a V8 bug that is 
going to be fixed), and a stinky statement of anti-realpolitik. In the 
current market, if we don't hew to a consensus standard, anything goes.


Not that everyone would make breaking changes, or any changes, just that 
the presence of a bug or long-standing deviation (in this case 
copied from JavaScriptCore, which has since fixed the deviation!) does 
*not* mean that what the spec says doesn't really matter.


Or were you snarking at V8?

/be
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-31 Thread Erik Arvidsson
On Sat, Jul 28, 2012 at 6:02 AM, David Bruant bruan...@gmail.com wrote:
 That's what the spec says, but V8 has implemented something else (and I
 haven't seen an intention to change this behavior), so what the spec
 says doesn't really matter.

We have a fix for V8 (--es5_readonly) but the Chromium bindings to the
DOM still have bugs related to this flag. I plan to have this fixed in
the coming weeks.

-- 
erik
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-31 Thread David Bruant

I think my message has been taken the wrong way, so I should clarify it:
From a practical point of view, if 2 implementations differ on one 
aspect of the language, it means that there is no content relying on 
either of the 2 implementations for that aspect of the language, whether 
they follow the spec or even both diverge differently from it.
Not having content relying on this aspect also opens the door to 
changing the behavior if it's considered as a mistake. This applies to 
the override mistake, but every single test262 test failure on any 
major implementation could also be seen as an instance of such an open 
door to change the spec.
"What the spec says doesn't really matter" seems very negative taken out 
of context, but within the context I said it, it meant something 
positive along the lines of "there is room to improve the specification 
on this part if necessary".

That really is all I meant, no more, no less.

Also... hmm... I wouldn't be spending that much time on standards 
mailing-lists, this one included, if I didn't believe in standardization ;-)



Detailed answer below.

On 31/07/2012 14:07, Allen Wirfs-Brock wrote:

The following was WRT [[Put]]/[[CanPut]] semantic issues:

On Jul 28, 2012, at 6:02 AM, David Bruant wrote:


On 28/07/2012 14:37, Herby Vojčík wrote:

...
:-/ But that is how it is, no?

That's what the spec says, but V8 has implemented something else (and I
haven't seen an intention to change this behavior), so what the spec
says doesn't really matter.

David

I have to disagree with David's sentiments here. Situations like this are 
exactly why we have standardized specifications.

I agree.
But I'd like to add that situations like this also show the limitations 
of standardized specifications (more on that at the end)



Different implementors can easily have differing interpretations about the edge 
case semantics of loosely described features. An important role of standards is 
to align implementations on a common semantics. Sure, an implementation can 
refuse to go along with the specification but that is quite rare, at least for 
ECMAScript where all major implementations seem to recognize the importance of 
interoperability.
I do the opposite analysis: major implementations recognize the 
importance of interoperability due to market constraints, thus the need 
for a standard.
Although almost no one talks about it these days, I think the most 
important part of HTML5 was specifying what's already in some browsers, 
making clear for the other browsers what to implement to be interoperable.



In particular, I haven't seen any indication that V8, as a matter of policy, is 
refusing to ever correct these deviations.

It's true that what the spec. says makes no difference to the browser bits that 
have already been shipped.  It does make a difference over the long term.  
Single implementation deviations from the specification usually get fixed 
eventually. Conformance to the specs. is a motivator for implementors.

We really shouldn't foster the meme that  specs don't really matter.  they 
matter a lot.
I hope I have clarified that I don't buy into the meme that specs don't 
matter. I was only reacting to the fact that two major implementations 
differ on one aspect of the spec, making in practice what the spec says 
on that aspect useless.



Brendan Eich wrote:
I missed this until Allen's reply called it out. It is both false 
(Google people at the TC39 meeting last week said it's a V8 bug that 
is going to be fixed)
It's unfortunate this information wasn't in the meeting notes, but I'm 
glad to hear it :-)


and a stinky statement of anti-realpolitik. In the current market, if 
we don't hew to a consensus standard, anything goes.


Not that everyone would make breaking changes, or any changes, just 
that from the presence of a bug or long-standing deviation (in this 
case copied from JavaScriptCore, which has since fixed the deviation!) 
does *not* mean that what the spec says doesn't really matter.
I guess I should have added here at the end of my sentence to clarify 
that I didn't mean that the whole spec doesn't matter, but only the part 
about [[CanPut]]/[[Put]] that's not interoperably implemented.



Or were you snarking at V8?

I was not.


More on the limitations of standardization I talked about above.
As I said, I understand the importance of a standard and I don't buy into 
the idea that they are useless. I also don't buy into the idea that standards 
should be seen as written-in-stone documents. We all know that specs 
sometimes have mistakes in them and when it's necessary and possible, 
they are fixed. It was discovered that ES5 had such a mistake [1] and 
the standard has been consequently fixed. This change, together with 
implementations following it, means that what was said in the spec about 
Object.prototype.toString before the fix did not matter (only the part 
that was controversial). The fact that it did not matter was actually a 
pre-requisite to being 

Re: July 25, 2012 - TC39 Meeting Notes

2012-07-31 Thread Brendan Eich

David Bruant wrote:
From a practical point of view, if 2 implementations differ on one 
aspect of the language, it means that there is no content relying on 
either of the 2 implementations for that aspect of the language, 
whether they follow the spec or even both diverge differently from it.


It's not that simple on the web. For instance, longstanding IE vs. 
Netscape/Mozilla forking, e.g.


  if (document.all) { ... } else { ... }

can mean some divergence among the else browsers is ok because not 
relied on there, but not ok in the then case.


You're probably right, but we are not making data and accessors 
asymmetric in the sense that a non-writable data property and a get-only 
accessor on a prototype object both throw (strict) or silently fail to 
update the LHS (non-strict) on assignment that would otherwise create a 
shadowing property in a delegating object.
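A small sketch of that symmetry, for illustration (ES5 semantics, strict mode):

"use strict";

function attempt(obj) {
    try { obj.m = 2; return "assigned"; }
    catch (e) { return e.name; }   // TypeError in strict mode; sloppy mode would fail silently instead
}

// Case 1: non-writable data property on the prototype
var protoData = Object.defineProperty({}, "m", { value: 1, writable: false });
var a = Object.create(protoData);

// Case 2: get-only accessor on the prototype
var protoAccessor = Object.defineProperty({}, "m", { get: function () { return 1; } });
var b = Object.create(protoAccessor);

console.log(attempt(a), a.hasOwnProperty("m"));  // "TypeError" false -- no shadowing own property was created
console.log(attempt(b), b.hasOwnProperty("m"));  // "TypeError" false -- same outcome, hence the symmetry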


This was debated at last week's TC39 meeting. Between the desire to 
preserve this symmetry (not paramount, there are many dimensions and 
symmetries to consider) and the V8 bug being fixed (and the JSC bug on 
which the V8 bug was based already being fixed in iOS6), I believe we 
kept consensus to follow the spec.


/be
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-31 Thread Sam Tobin-Hochstadt
On Wed, Aug 1, 2012 at 12:05 AM, Brendan Eich bren...@mozilla.org wrote:
 (and the JSC bug on which the V8 bug was based already being fixed in iOS6)

Just to nitpick for those following along at home, the bug is fixed in
the just-released *Safari* 6, and @ohunt declined to comment on future
products or releases :)

-- 
sam th
sa...@ccs.neu.edu
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread Allen Wirfs-Brock

On Jul 28, 2012, at 6:58 PM, Brendan Eich wrote:

 Allen Wirfs-Brock wrote:
 I really think in a language where we have both [[Put]] and 
 [[DefineOwnProperty]] semantics that we really need both = and :=
 
 I can buy that, and I'm glad you mention := as it is not just an assignment 
 operator (e.g. in Pascal or Ada), it's also Go's declare-and-init operator. 
 It has the right characters, fuzzy meaning from other languages, and the 
 critical = char in particular. I could see adding it as a winning and better 
 user-interface to Object.defineProperty and even Object.extend.
 
 However, Object.extend is usable in old and new browsers, with polyfill. 
 Object.extend is the cowpath trod by many happy cows. Requiring a transpiler 
 is harsh. Should we entertain both := and Object.extend, or perhaps the 
 better name to avoid colliding with PrototypeJS, Object.define or 
 Object.update?

I think we should view := as new syntax that is primarily intended to be used 
in combination with other new syntax such as concise methods, class 
definitions, super, etc. that also require transpiling for use in older 
versions.   It is neither more nor less harsh than any other new syntax. When we 
incorporate new syntax we are primarily making an investment for future ES 
programmers.  Transition issues need to be considered but I think that for ES, 
the future is still much longer and larger than the past.

The problem with Object.extend is that it isn't a single cow path.  There are 
multiple paths leading in the same general direction but taking different 
routes. This was the case in 2008 when we considered adding Object.extend for 
ES5 and it is even more so now.  We could add a completely new function such as 
Object.update, but I wonder if that is really needed.  Frameworks seem to be 
doing a fine job providing their own variants of Object.extend-like functions 
that are fine tuned to match their own abstraction models and other 
requirements.  A polyfill with semantics that are different from someone's 
favorite framework might just cause confusion, even if it uses a different 
name.  Are things still going to work if I use Object.update instead of 
PrototypeJS's Object.extend in a PrototypeJS environment?  Maybe not?  Same 
for other frameworks and other extend-like functions.  Rather than sowing 
confusion in the current ES3/5 framework world with a new polyfill, it might be 
better to simply leave things be WRT a standardized extend-like function.  := would 
be a new ES6 syntactic form that works in combination with other new 
ES6 syntactic forms.  Legacy code and frameworks with their own extend-like 
functions would all continue to work in ES6.  New ES6 code probably 
doesn't need a procedural form of := (or if they do they could easily define 
it:  Object.update = (obj1, obj2) => obj1 := obj2; ).

Cowpaths are important for telling us where the cows need to go but they are 
constrained by the current terrain.  Introducing a syntactic operator such as 
:= is like building an elevated freeway that goes straight to the destination 
above the current cowpaths. It allows the old cows to continue to follow their 
established paths for as long as they need to, but doesn't constrain future 
high-speed travelers to following those old paths.

 
 Finally, this discussion caused me to realize that I messed-up  on an 
 important detail when I prepared and presented the class semantics deck 
 (http://t.co/PwuF12Y0) at the TC39 meeting.
 
 In the deck, I incorrectly stated that I was proposing that the attributes 
 associated with a property created via a concise method definition (in a 
 class or object literal definition) should have the attributes {writable: 
 true, configurable: false}. I had a hard time defending that choice at the 
 meeting.
 ...
 On balance, I like := as a complement to =, but I'm leery of new-version-only 
 thinking that leaves out Object.extend or better-named use-cases. And I am 
 skeptical that any of this means non-writable configurable is a sane 
 attribute combo for methods.

I made my case above. I think this is a situation where new-version-only 
feature design is fine (but getting there requires thinking about old 
versions).  The current framework writers have things under-control for 
current/old versions.  Sure it would have been great if there had been a 
standard extend-like function prior to the creation of modern frameworks, but 
there wasn't.  Rather than throwing ripples through the current frameworks it 
may be better for us to focus on how new things will be done with the new 
version.

Allen





___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread Allen Wirfs-Brock

On Jul 29, 2012, at 6:05 AM, Herby Vojčík wrote:
 
 
 Brendan Eich wrote:
 ...
 
 However TC39 favored configurable: true as well as writable: true, to
 match JS expectations, in particular that one could reconfigure a data
 property (which is what method definition syntax in a class body
 creates) with a workalike accessor.
 
 I don't understand here.
 1. Method syntax does not create data property (well, technically, yes, but I 
 already think in terms of configurable: true, writable: false which was 
 present some time already in wiki / es-discuss).
 2. To replace data property with workalike accessor, you use 
 Object.defineProperty, so why would you need writable: true? It is not needed 
 there at all.

Yes, this is close to what I was thinking.  While concise methods are 
implemented as data properties, they should not be thought of as part of the 
mutable state contract of an object. Conceptually, they are not the same thing 
as a closure-valued instance variable.  I expect that future ES6 style guides 
will say something like:

Use concise method notation to define behavior properties of an object whose 
modification is not part of the object's contract. Use : data property notation 
to define state properties that are expected to be modified.  Always use : data 
properties in cases where the dynamic modification of a function-valued 
property is expected and part of the object's contract. For example:

class Foo {
    report () {this.callback()}   // a prototype method that is a fixed part of the Foo interface
    constructor (arg) {
        this := {
            callback: () => undefined,            // default per-instance callback; clients are expected to modify
            doIt () {doSomethingWith(this, arg)}  // a per-instance method that captures some constructor state; clients are not expected to modify
        }
    }
}

let f = new Foo(thing);
f.callback = () => console.log('called back');   // this sort of assignment is expected
f.doIt = function () {...};    // this isn't expected; it is patching the class definition. Avoid this.
f := {doIt () {...} };         // instead, this is how you should patch class definitions

The concise method attribute values I suggested were intended as a means of 
making this guideline a bit more than just a convention.

Allen
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread Brendan Eich

Allen Wirfs-Brock wrote:

On Jul 28, 2012, at 6:58 PM, Brendan Eich wrote:

Allen Wirfs-Brock wrote:

I really think in a language where we have both [[Put]] and 
[[DefineOwnProperty]] semantics that we really need both = and :=

I can buy that, and I'm glad you mention := as it is not just an assignment 
operator (e.g. in Pascal or Ada), it's also Go's declare-and-init operator. It 
has the right characters, fuzzy meaning from other languages, and the critical 
= char in particular. I could see adding it as a winning and better 
user-interface to Object.defineProperty and even Object.extend.

However, Object.extend is usable in old and new browsers, with polyfill. 
Object.extend is the cowpath trod by many happy cows. Requiring a transpiler is 
harsh. Should we entertain both := and Object.extend, or perhaps the better 
name to avoid colliding with PrototypeJS, Object.define or Object.update?


I think we should view := as new syntax that is primarily intended to be used 
in combination with other new syntax such as concise methods, class 
definitions, super, etc. that also require transpiling for use in older 
versions.   It is neither more nor less harsh than any other new syntax. When we 
incorporate new syntax we are primarily making an investment for future ES 
programmers.  Transition issues need to be considered but I think that for ES, 
the future is still much longer and larger than the past.


Yes, I agree with that (as stated; it doesn't help with balancing 
polyfillability or making the right call on configurable+-writable).



The problem with Object.extend is that it isn't a single cow path.  There are 
multiple paths leading in the same general direction but taking different 
routes. This was the case in 2008 when we considered adding Object.extend for 
ES5 and it is even more so now.  We could add a completely new function such as 
Object.update, but I wonder if that is really needed.


The JSFixed project had Object.extend among its curated/moderated 
outcomes and I think it's a reasonable request. We rolled up 
Function.prototype.bind into ES5 in spite of several differences among 
the leading implementations (Dojo hitch, Prototype bind, etc.) and we 
changed the ES5 draft as we went.



   Frameworks seem to be doing a fine job providing their own variants of 
Object.extend-like functions that are fine tuned to match their own abstraction 
models and other requirements.


This is not a sufficient argument on its face, since we got bind into 
ES5 in spite of variation.


Anyway, given the issue I raised about lack of writability being hard to 
test, or really: unlikely to be tested in practice, I don't think := (a 
good idea) motivates configurable+non-writable.


/be

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread David Bruant

On 28/07/2012 21:04, Allen Wirfs-Brock wrote:

(...)
We introduce a new operator that looks like :=

This is the define properties operator.  Both the LHS and RHS must 
be objects (or ToObject convertible).  Its semantics is to 
[[DefineOwnProperty]] on the LHS obj a property corresponding to each 
RHS own property.  It does this with all reflectable own properties. It 
includes non-enumerable properties and unique named properties but not 
non-reflectable /private/ name properties.  It rebinds methods with 
super bindings to the RHS to new methods that are super bound to the LHS.


The above example would then be written as:

a := {
  push(elem) {
...
  }
};
rather than, perhaps incorrectly as:

a.push = function (elem) {
...
};

or, correctly but very inconveniently as:

Object.defineProperty(a, "push", {writable: true, configurable: true, 
enumberable: true,

data:function (elem) {
I see the typo here ('data' instead of 'value') as one of the most 
brilliant and unexpected examples of this inconvenience :-)
And I'm not even talking about 'enumberable' which I also trip over 
almost all the time to the point of making this syntax (en)um-bearable!



...
}
}
);

(...)

I really think in a language where we have both [[Put]] and 
[[DefineOwnProperty]] semantics that we really need both = and :=
That's an interesting view on things. To me, it would make acceptable 
the idea of = being unreliable locally without prior knowledge (which, 
as noted, is kind of already the case because of inherited setters), while := 
(which is more ':={}' actually, aka big-lips-guy) enables reliable local 
review without prior knowledge, proxy pathological cases aside.
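For illustration, a small sketch of how an inherited setter makes = unreliable under local review, while a define-style operation (the semantics described for :=) is unaffected; the property name used is arbitrary:

var proto = {};
Object.defineProperty(proto, "name", {
    set: function (v) { console.log("inherited setter intercepted:", v); }   // never creates an own property
});
var obj = Object.create(proto);

obj.name = "x";                            // [[Put]]: runs the inherited setter
console.log(obj.hasOwnProperty("name"));   // false -- the assignment did not do what it looks like it does

Object.defineProperty(obj, "name", {       // [[DefineOwnProperty]]: ignores the inherited setter,
    value: "x", writable: true,            // which is the behavior described for :=
    enumerable: true, configurable: true
});
console.log(obj.name);                     // "x"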


David
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread Allen Wirfs-Brock

On Jul 30, 2012, at 1:08 PM, Brendan Eich wrote:

 Allen Wirfs-Brock wrote:
 ...
 
 The problem with Object.extend is that it isn't a single cow path.  There 
 are multiple path leading in the same general direction but taking different 
 routes. This was the case in 2008 when we considered adding Object.extend 
 for ES5 and it is even more so now.  We could add a completely new function 
 such as Object.update, but I wonder if that is really needed.
 
 The JSFixed project had Object.extend among its curated/moderated outcomes 
 and I think it's a reasonable request. We rolled up Function.prototype.bind 
 into ES5 in spite of several differences among the leading implementations 
 (Dojo hitch, Prototype bind, etc.) and we changed the ES5 draft as we went.

Adding Object.extend is not totally comparable to adding 
Function.prototype.bind in ES5.  The big difference is that the core semantics 
of the various framework-provided bind functions and the ES5 bind were 
essentially identical.  The differences only involved very obscure edge cases. 
This enables the ES5 bind to replace the frameworks' binds (and vice versa) 
with minimal disruption.  There isn't anything like a universally accepted 
semantics for Object.extend.  In particular, the semantics proposed [1] by 
JSFixed is different from the semantics defined for that same name by 
prototype.js [2].  Neither of them correctly deals with accessor properties.

I think the JSFixed proposal [1] (and its associated issue discussion [3]) is a 
strong indication that there is significant perceived utility in a feature that 
enables bulk replication of properties from one object to another.  However, 
there is almost no discussion in [3] of the compatibility impact of 
standardizing a function named Object.extend.  Given that and the semantic 
issues (handling of accessors, etc.) I don't think the exact JSFixed 
proposal is particularly reasonable.  More strongly, I think that adding a 
standard function named Object.extend is likely to be disruptive.  I don't 
really object to a polyfillable function that has the same semantics as that 
proposed for := as long as it does not have a name that conflicts with widely used 
legacy code.  I do, however, question the necessity of such a function in 
light of the current adequate support provided by frameworks. 
 
   Frameworks seem to be dong a fine job providing their own variants of 
 Object.extend-like functions that are fine tuned to match their own 
 abstraction models and other requirements.
 
 This is not a sufficient argument on its face, since we got bind into ES5 in 
 spite of variation.

I really think the situation is different this time. The commonly used 
semantics of ES5's bind did not differ significantly from any other widely used 
implementation of a Function.prototype.bind method, so replacing one with the 
other wasn't disruptive. Object.extend and similar but differently named or 
located framework functions are not nearly as well aligned in their core 
semantics.

 
 Anyway, given the issue I raised about lack of writability being hard to 
 test, or really: unlikely to be tested in practice, I don't think := (a good 
 idea) motivates configurable+non-writable.

Yes, a separate issue. I still think configurable+non-writable is defensible 
(actually better) but it's a pretty minor issue that I won't lose sleep over.

 
 /be
 


[1] 
https://docs.google.com/document/d/1JPErnYlBPG26chTuVSnJ_jqW4YkiQhvWn-FxwwsmkEo/edit
 
[2] 
https://github.com/sstephenson/prototype/blob/master/src/prototype/lang/object.js#L72
 
[3] https://github.com/JSFixed/JSFixed/issues/16
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread Aymeric Vitte


On 28/07/2012 01:55, Rick Waldron wrote:


Explanation of specification history and roots in newer DOM mutation 
mechanism.


AWB: Is this sufficient for implementing DOM mutation event mechanisms?

RWS: Yes, those could be built on top of Object.observe


Probably I must be misreading the proposal (again), but if you take a js 
DOM project where almost all attributes are handled via getters/setters, 
how can we observe something?


--
jCore
Email :  avi...@jcore.fr
Web :www.jcore.fr
Webble : www.webble.it
Extract Widget Mobile : www.extractwidget.com
BlimpMe! : www.blimpme.com

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread Brendan Eich

Allen Wirfs-Brock wrote:
The commonly used semantics of ES5's bind did not differ significantly 
any other widely used implementation of a Function.prototype.bind 
method. so replacing one with the other wasn't disruptive.

Could be, but there were differences:

https://mail.mozilla.org/pipermail/es-discuss/2012-January/019382.html

I think you're on thin ice arguing this was so much less signfiicant 
than Object.extend (or let's say Object.update).


Object.extends and similar but differently named or  located framework 
functions are not nearly as well aligned in their core semantics.


First, "differently named" applied to bind precursors, e.g. Dojo's hitch.

Second, here's a post from jresig years ago:

https://mail.mozilla.org/pipermail/es-discuss/2008-July/006709.html

This is out of date, but note how for-in is used in all cases. There's a 
lot of common ground here, and some uncommon bits that look not a whole 
lot bigger or different-in-kind from the bind/hitch/etc. ones we 
overcame in ES5.


/be
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread Allen Wirfs-Brock

On Jul 30, 2012, at 2:56 PM, Brendan Eich wrote:

 Allen Wirfs-Brock wrote:
 The commonly used semantics of ES5's bind did not differ significantly any 
 other widely used implementation of a Function.prototype.bind method. so 
 replacing one with the other wasn't disruptive.
 Could be, but there were differences:
 
 https://mail.mozilla.org/pipermail/es-discuss/2012-January/019382.html
 
 I think you're on thin ice arguing this was so much less signfiicant than 
 Object.extend (or let's say Object.update).

Perhaps.  I think the most important compatibility situation is when the old 
and new names are the same.  For example, Function.prototype.bind or 
Object.extend.  I understood that Function.prototype.bind was more common 
before ES5 than it may have really been.  However, in my reading of the 
MooTools docs (http://mootools.net/docs/core/Types/Function#Function:bind ) it 
sounds very similar to ES5 bind

 
 Object.extends and similar but differently named or  located framework 
 functions are not nearly as well aligned in their core semantics.
 
 First, differently named applied to bind precursors, e.g. Dojo's hitch.
 
 Second, here's a post from jresig years ago:
 
 https://mail.mozilla.org/pipermail/es-discuss/2008-July/006709.html
 
 This is out of date, but note how for-in is used in all cases. There's a lot 
 of common ground here, and some uncommon bits that look not a whole lot 
 bigger or different-in-kind from the bind/hitch/etc. ones we overcame in ES5.

The JSFixed proposal uses getOwnPropertyNames rather than for-in and I can't 
imagine that we would adopt a semantics that copied inherited properties 
similarly to a for-in based implementation.  Similarly, I can't imagine that we 
wouldn't correctly handle accessors.  If a new method has to be polyfillable 
back to ES3 then its semantics needs to be more limited.  A much better job can 
be done if you only have to polyfill for ES5.  But that doesn't really provide 
anything new. If it is important, why isn't somebody in the community 
evangelizing a sound de facto standard ES5-level extend-like function that all 
frameworks could adopt? TC39 isn't necessary for such a thing to be widely 
adopted.
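For concreteness, a sketch of the two polyfill levels being contrasted (the names extend3 and extend5 are made up for illustration):

// ES3-level copy: for-in is all that's available, so accessors on the source are
// flattened to whatever their getters return, and enumerable inherited properties
// are copied as well.
function extend3(target, source) {
    for (var key in source) {
        target[key] = source[key];
    }
    return target;
}

// ES5-level copy: own properties only, with descriptors preserved, so
// getters/setters and non-enumerable properties survive the copy.
function extend5(target, source) {
    Object.getOwnPropertyNames(source).forEach(function (key) {
        Object.defineProperty(target, key, Object.getOwnPropertyDescriptor(source, key));
    });
    return target;
}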

Allen


___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread Brendan Eich

Allen Wirfs-Brock wrote:

On Jul 30, 2012, at 2:56 PM, Brendan Eich wrote:

Allen Wirfs-Brock wrote:
The commonly used semantics of ES5's bind did not differ 
significantly any other widely used implementation of a 
Function.prototype.bind method. so replacing one with the other 
wasn't disruptive.

Could be, but there were differences:

https://mail.mozilla.org/pipermail/es-discuss/2012-January/019382.html

I think you're on thin ice arguing this was so much less signfiicant 
than Object.extend (or let's say Object.update).


Perhaps.  I think the most important compatibility situation is when 
the old and new names are the same.  For example, 
Function.prototype.bind or Object.extend.  I understood that 
Function.prototype.bind was more common before ES5 than it may have 
really been.  However, in my reading of the MooTools docs 
(http://mootools.net/docs/core/Types/Function#Function:bind ) it 
sounds very similar to ES5 bind


Read all the docs, not just MooTools. Dojo's hitch takes a string or a 
function (if string, it looks for that method in scope). PrototypeJS 
didn't forward new attempts to the target. Etc.


Object.extends and similar but differently named or  located 
framework functions are not nearly as well aligned in their core 
semantics.


First, differently named applied to bind precursors, e.g. Dojo's hitch.

Second, here's a post from jresig years ago:

https://mail.mozilla.org/pipermail/es-discuss/2008-July/006709.html

This is out of date, but note how for-in is used in all cases. 
There's a lot of common ground here, and some uncommon bits that look 
not a whole lot bigger or different-in-kind from the bind/hitch/etc. 
ones we overcame in ES5.


The JSFixed proposal uses getOwnPropertyNames


We were talking about precedent in libraries, not JSFixed, but ok.

rather than for-in and I can't imagine that we would adopt a semantics 
that copied inherited properties similarly to a for-in based 
implementation.


It may not matter. The rule has been "Object.prototype is verboten" and 
the pattern generally uses an object (literal, even), not an array whose 
prototype has been extended by assignment, as the source. So no 
proto-pollution occurs in practice.


So I suspect we would be fine spec'ing Object.getOwnPropertyNames. That 
is on the level of the changes made from progenitor bind-like functions, 
in reality (due to the best-practices mentioned above).



 Similarly, I can't imagine that we wouldn't correctly handle accessors.


Right, but the precedents predate ES5 so this is no surprise. It's sort 
of like Prototype's not forwarding new, arguably worse but hard to say.


 If a new method has to be polyfillable back to ES3 then its semantics 
needs to be more limited.  A much better job can be done if you only 
have to polyfill for ES5.  But that doesn't really provide anything new.


Now you're picking a fight. The point is *not* to provide something new 
if the use-case would be met by an API that can be polyfilled -- as many 
use cases can, since the call sites pass *object literals*.


What the API provides is the ability to do without a transpiler. That's 
a big deal.


If it is important, why isn't somebody in the community evangelizing a 
sound de facto standard ES5-level extend-like function that all 
frameworks could adopt?


We did not insist on such a condition when we put bind into ES5. But now 
you are definitely rehashing something I thought we were past: we do not 
require one winner in detail to adopt something. If we did, nothing much 
would get adopted.



TC39 isn't necessary for such a thing to be widely adopted.


That applies to bind-like functions too and it's irrelevant.

/be
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-30 Thread Rafael Weinstein
On Mon, Jul 30, 2012 at 2:56 PM, Aymeric Vitte vitteayme...@gmail.com wrote:

 On 28/07/2012 01:55, Rick Waldron wrote:


 Explanation of specification history and roots in newer DOM mutation
 mechanism.

 AWB: Is this sufficient for implementing DOM mutation event mechanisms?

 RWS: Yes, those could be built on top of Object.observe


 Probably I must be misreading the proposal (again), but if you take a js DOM
 project where almost all attributes are handled via getters/setters, how can
 we observe something ?

The point wouldn't be to observe DOM changes directly via
Object.observe() by user script. The DOM Mutation API is different in
several ways from Object.observe() (different API surface area,
different vocabulary of changes, etc...)

One approach would be to have the DOM storage internally be graphs of
simple data. The implementation can observe changes and then compute a
transform of the data changes it receives into the necessary DOM
mutations, which it then broadcasts.

I don't have an opinion on whether it would be a good idea to take
this approach (my guess is that standard trade-offs of complexity &
memory vs. speed would apply). Allen's question was whether it would be
possible.
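For illustration, a sketch of that approach, assuming the Object.observe API as proposed in the strawman at the time (the helper functions are made up, and the change-record fields are only indicative):

// Hypothetical helpers, for illustration only.
function translateToDomMutation(change) {
    return { type: "attributes", attributeName: change.name, oldValue: change.oldValue };
}
function broadcastDomMutation(record) {
    console.log("DOM mutation record:", record);
}

// The implementation keeps the element's state as plain data and observes it.
var elementData = { className: "" };

Object.observe(elementData, function (changeRecords) {
    changeRecords.forEach(function (change) {
        broadcastDomMutation(translateToDomMutation(change));
    });
});

// The public getters/setters simply write through to the plain data.
var element = {
    get className() { return elementData.className; },
    set className(v) { elementData.className = v; }   // change records are delivered asynchronously
};

element.className = "highlight";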


 --
 jCore
 Email :  avi...@jcore.fr
 Web :www.jcore.fr
 Webble : www.webble.it
 Extract Widget Mobile : www.extractwidget.com
 BlimpMe! : www.blimpme.com


 ___
 es-discuss mailing list
 es-discuss@mozilla.org
 https://mail.mozilla.org/listinfo/es-discuss
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread David Bruant
Hi,

First and foremost, thanks for the notes :-)

On 28/07/2012 01:55, Rick Waldron wrote:
 Fix override mistake

 # The can put check
 (...)

 Property in a prototype object that is read-only cannot be shadowed.

 Just the same as get-only accessor.
I'd like to add a use case here. Every once in a while, I write
something like:

var a = [];
a.push = function(elem){
    if(condition(elem)){
        // do something like change the elem value then do an actual push
        // or throw an error
        // or just ignore this value to avoid duplicates, for instance
    }
    else{
        Array.prototype.push.call(this, elem)
    }
};

// use a.push (there is an implicit contract on only using .push to
add elements)

There is such a snippet in a Node.js server in production right now, so
that's really not hypothetical code. If I ever consider moving to SES,
then, before the above snippet is run, Array.prototype gets frozen and
the a.push assignment will fail (at runtime!).
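For illustration, a minimal sketch of how that failure shows up once Array.prototype is frozen (strict mode shown; sloppy mode fails silently instead):

"use strict";
Object.freeze(Array.prototype);

var a = [];
try {
    a.push = function (elem) {};            // [[CanPut]] fails: push is non-writable on the frozen prototype
} catch (e) {
    console.log(e instanceof TypeError);    // true
}
console.log(a.hasOwnProperty("push"));      // false -- no shadowing property was created

// Object.defineProperty is not subject to the [[CanPut]] check:
Object.defineProperty(a, "push", {
    value: function (elem) {},
    writable: true, enumerable: true, configurable: true
});
console.log(a.hasOwnProperty("push"));      // true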


Several things here:
* I could change a.__proto__, but it's a bit weird since the condition
in the custom push is often very specific to this exact array, so
changing the [[prototype]] feels like too much, just for one instance
(though that would work fine)
* I could use Object.defineProperty, but the above code is definitely
more readable and intuitive.
* An implicit contract is not the best idea ever, but that works when
the array. In an ES6 world the array would certainly be a proxy and
whatever invariant could be preserved even for numerical property value
assignments. But we're not there yet, so that's not an option

As far as I'm concerned, the biggest issue with this use case is that I
have written code which reads well (I'm open to debate on that if some
disagree) and that what is read may not be what will occur.
Also, if one day, one Node.js module I use decides it's better to freeze
Array.prototype, it will be a very painful bug to track down when I
update. It would be much easier to track down if I was monkey-patching
Array.prototype.push, but I'm not.

As a final note, I don't know how often people do what I've described.
I'll adapt my code if what is decided is to keep the [[canPut]] error,
but I don't know how many people this kind of problem would affect.

David
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Herby Vojčík

David Bruant wrote:

Hi,

First and foremost, thanks for the notes :-)

On 28/07/2012 01:55, Rick Waldron wrote:

Fix override mistake

# The can put check
(...)

Property in a prototype object that is read-only cannot be shadowed.

Just the same as get-only accessor.

I'd like to add a use case here. Every once in a while, I write
something like:

 var a = [];
 a.push = function(elem){
 if(condition(elem)){
 // do something like change the elem value then do an actual
push
 // or throw an error
 // or just ignore this value to avoid duplicates, for instance
 }
 else{
 Array.prototype.push.call(this, elem)
 }
 };

 // use a.push (there is an implicit contract on only using .push to
add elements)

There is such a snippet in a Node.js server in production right now, so
that's really not hypothetical code. If I ever consider to move to SES,
then, before the above snippet is run, Array.prototype gets frozen and
the a.push assignment will fail (at runtime!).


Several things here:
* I could change a.__proto__, but it's a bit weird since the condition
in the custom push is often very specific to this exact array, so
changing the [[prototype]] feels like too much, just for one instance
(though that would work fine)
* I could use Object.defineProperty, but the above code is definitely
more readable and intuitive.


Well, yes. But from the philosophical PoV, imho, you should do 
Object.defineProperty here, because that is what you do (your intent is 
not to put a value into a's push property).


Though not very constructive, I'd say this is the case where

a.{
  push(elem) {
...
  }
};

is definitely missing.


* An implicit contract is not the best idea ever, but that works when
the array. In an ES6 world the array would certainly be a proxy and
whatever invariant could be preserved even for numerical property value
assignments. But we're not there yet, so that's not an option

As far as I'm concerned, the biggest issue with this use case is that I
have written code which reads well (I'm open to debate on that if some
disagree) and that what is read may not be what will occur.
Also, if one day, one Node.js module I use decides it's better to freeze
Array.prototype, it will be a very painful bug to track down when I
update. It would be much easier to track down if I was monkey-patching
Array.prototype.push, but I'm not.

As a final note, I don't know how often people do what I've described.
I'll adapt my code if what is decided is to keep the [[CanPut]] error,
but I don't know how many people this kind of problem would affect.

David

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread David Bruant
On 28/07/2012 13:43, Herby Vojčík wrote:
 David Bruant wrote:
var a = [];
a.push = function(elem){
    if(condition(elem)){
        // do something like change the elem value then do an actual push
        // or throw an error
        // or just ignore this value to avoid duplicates, for instance
    }
    else{
        Array.prototype.push.call(this, elem)
    }
};

// use a.push (there is an implicit contract on only using .push to add elements)

 (...)

 * I could use Object.defineProperty, but the above code is definitely
 more readable and intuitive.

 Well, yes. But from the philosophical PoV, imho, you should do
 Object.defineProperty here, because that is what you do (your intent
 is not to put a value to a's push property).
My intent is: I want a custom 'push' property for this particular
array, because I'm filling the array afterwards using .push calls. I
don't know what I should be doing from a philosophical point of view,
but the code written above describes my intention pretty well. If I saw
a call to Object.defineProperty instead, my first reaction would
certainly be "but why isn't a regular assignment used here?". A comment
could be added to explain the [[CanPut]] issue, but that's what I would
call a boilerplate comment.

So far, to the general question "why is Object.defineProperty used
instead of a regular assignment here?", the only answer I find
acceptable is "defining custom configurable/writable/enumerable",
because these are things local to the code that have no syntax for them.
In most cases, getter/setters can be defined in object literals.
Adding "the prototype may be frozen, thus preventing shadowing" to the
acceptable answers makes local code review harder.
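
For example (a small sketch, names made up), an accessor pair in a plain
object literal:

var queue = {
    _items: [],
    get length() { return this._items.length; },
    set length(v) { throw new Error("length is read-only"); }
};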


 Though not very constructive, I'd say this is the case where

 a.{
   push(elem) {
 ...
   }
 };

 is definitely missing.
I remembered that .{ semantics was a [[Put]] semantic, so it wouldn't
solve the problem. Did I remember something wrong?

Arguably, I could use a different name than push. But it doesn't
change the problem:
If I add an 'x' property to my array and later in the history of ES, an
Array.prototype.x property is added, my code will break by virtue of
engines updating... hmm... That's a worse situation than I initially
thought.

David
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Herby Vojčík



David Bruant wrote:

On 28/07/2012 13:43, Herby Vojčík wrote:

David Bruant wrote:

var a = [];
a.push = function(elem){
    if(condition(elem)){
        // do something like change the elem value then do an actual push
        // or throw an error
        // or just ignore this value to avoid duplicates, for instance
    }
    else{
        Array.prototype.push.call(this, elem)
    }
};

// use a.push (there is an implicit contract on only using .push to add elements)

(...)

* I could use Object.defineProperty, but the above code is definitely
more readable and intuitive.

Well, yes. But from the philosophical PoV, imho, you should do
Object.defineProperty here, because that is what you do (your intent
is not to put a value to a's push property).

My intent is: I want a custom 'push' property for this particular
array, because I'm filling the array afterwards using .push calls. I
don't know what I should be doing from a philosophical point of view,
but the code written above describes my intention pretty well. If I saw


To be precise, [[Put]] and [[DefineProperty]] are different intents. 
Developers may not like it, because they are used to [[Put]], but it is 
probably needed to distinguish them.


[[Put]] is a high-level contract (a, update your 'push' facet with value), 
[[DefineProperty]] is a low-level contract (a, add/update your slot named 
'push' with value).


I am inclined to see [[Put]] used to shadow methods as an abuse of a 
high-level interface to do low-level patching.


But of course, unless there is nice sugar, everyone uses [[Put]] since 
it's easier to write (and read).



a call to Object.defineProperty instead, my first reaction would
certainly be "but why isn't a regular assignment used here?". A comment
could be added to explain the [[CanPut]] issue, but that's what I would
call a boilerplate comment.

So far, to the general question "why is Object.defineProperty used
instead of a regular assignment here?", the only answer I find
acceptable is "defining custom configurable/writable/enumerable",
because these are things local to the code that have no syntax for them.
In most cases, getter/setters can be defined in object literals.
Adding "the prototype may be frozen, thus preventing shadowing" to the
acceptable answers makes local code review harder.


:-/ But that is how it is, no?


Though not very constructive, I'd say this is the case where

a.{
   push(elem) {
 ...
   }
};

is definitely missing.

I remembered that .{ semantics was a [[Put]] semantic, so it wouldn't
solve the problem. Did I remember something wrong?


Of course. Mustache has the same semantics as extended literal, so it 
was [[DefineProperty]] with appropriate enum/conf/writ (and setting home 
context for methods, so in fact it did defineMethod).



Arguably, I could use a different name than push. But it doesn't
change the problem:
If I add an 'x' property to my array and later in the history of ES, an
Array.prototype.x property is added, my code will break by virtue of
engines updating... hmm... That's a worse situation than I initially
thought.

David


Herby
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread David Bruant
On 28/07/2012 14:37, Herby Vojčík wrote:
 David Bruant wrote:
 On 28/07/2012 13:43, Herby Vojčík wrote:
 David Bruant wrote:
var a = [];
a.push = function(elem){
    if(condition(elem)){
        // do something like change the elem value then do an actual push
        // or throw an error
        // or just ignore this value to avoid duplicates, for instance
    }
    else{
        Array.prototype.push.call(this, elem)
    }
};

// use a.push (there is an implicit contract on only using .push to add elements)

 (...)

 * I could use Object.defineProperty, but the above code is definitely
 more readable and intuitive.
 Well, yes. But from the philosophical PoV, imho, you should do
 Object.defineProperty here, because that is what you do (your intent
 is not to put a value to a's push property).
 My intent is: I want a custom 'push' property for this particular
 array, because I'm filling the array afterwards using .push calls. I
 don't know what I should be doing from a philosophical point of view,
 but the code written above describes my intention pretty well. If I saw

 To be precise, [[Put]] and [[DefineProperty]] are different intents.
I don't understand what you're getting at. Let's try to agree on some
definitions, first:
1) There is my intention which I described above
2) there is the JS VM (set of primitive operations, like [[Put]] and
[[DefineProperty]])
3) there is syntax, which is expected to be in between, allowing
intentions (high-level descriptions) to be translated into the language's
primitive operations.

My definition of "intention" is a fairly high-level description. As I
said, what I need is to "fill my array with push calls". How this
method ended up here is not part of my intent; that's why I could
implement my intention with a new a.__proto__.
Then, there is the syntax. a.push = function(elem){...} expresses my
intent very well: it references the only object for which I want a custom
push, it shows the 'push' name, and the assignment with = is part of
the programming culture.

So, the way I see it, [[Put]] and [[DefineProperty]] are not intentions.
They are operations through which I may be able to implement my use
case. As it turns out, both map to one syntactic form. It didn't have to
be that way.

 Developers may not like it, because they are used to [[Put]], but it is
 probably needed to distinguish them.

 [[Put]] is a high-level contract (a, update your 'push' facet with
 value), [[DefineProperty]] is a low-level contract (a, add/update your
 slot named 'push' with value).

 I am inclined to see [[Put]] used to shadow methods as an abuse of a
 high-level interface to do low-level patching.
Currently, [[Put]] does shadow prototype methods and the sky hasn't
fallen. The question in debate is whether [[Put]] should shadow when the
prototype is frozen.

 a call to Object.defineProperty instead, my first reaction would
 certainly be "but why isn't a regular assignment used here?". A comment
 could be added to explain the [[CanPut]] issue, but that's what I would
 call a boilerplate comment.

 So far, to the general question "why is Object.defineProperty used
 instead of a regular assignment here?", the only answer I find
 acceptable is "defining custom configurable/writable/enumerable",
 because these are things local to the code that have no syntax for them.
 In most cases, getter/setters can be defined in object literals.
 Adding "the prototype may be frozen, thus preventing shadowing" to the
 acceptable answers makes local code review harder.

 :-/ But that is how it is, no?
That's what the spec says, but V8 has implemented something else (and I
haven't seen an intention to change this behavior), so what the spec
says doesn't really matter.

David
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Herby Vojčík



David Bruant wrote:

On 28/07/2012 14:37, Herby Vojčík wrote:

David Bruant wrote:

On 28/07/2012 13:43, Herby Vojčík wrote:

David Bruant wrote:

var a = [];
a.push = function(elem){
    if(condition(elem)){
        // do something like change the elem value then do an actual push
        // or throw an error
        // or just ignore this value to avoid duplicates, for instance
    }
    else{
        Array.prototype.push.call(this, elem)
    }
};

// use a.push (there is an implicit contract on only using .push to add elements)

(...)

* I could use Object.defineProperty, but the above code is definitely
more readable and intuitive.

Well, yes. But from the philosophical PoV, imho, you should do
Object.defineProperty here, because that is what you do (your intent
is not to put a value to a's push property).

My intent is: I want a custom 'push' property for this particular
array, because I'm filling the array afterwards using .push calls. I
don't know what I should be doing from a philosophical point of view,
but the code written above describes my intention pretty well. If I saw

To be precise, [[Put]] and [[DefineProperty]] are different intents.

I don't understand what you're getting at. Let's try to agree on some
definitions, first:
1) There is my intention which I described above
2) there is the JS VM (set of primitive operations, like [[Put]] and
[[DefineProperty]])
3) there is syntax, which is expected to be in between, allowing
intentions (high-level descriptions) to be translated into the language's
primitive operations.


I am getting at the philosophical difference between assignment and 
define, and I am aiming at "[[Put]] is used wrongly to define methods"; IOW, 
[[Put]] should be used to change state (preferably only for that).


IOW, a.foo = 42; is asking an object to change its state. I would 
underline _asking_ here; it's as if ".foo =" were part of the API 
of an object.


Your case of a.push=... is not this kind of API.

(and I know of course there is no state/behaviour distinction, nor am I 
calling for one, and I also know [[Put]] expands an object with new slots 
if they are not present, which can be used to attack my sort-of API 
argument; but I still hold to it*)


Whatever; if you still don't understand, it doesn't matter. If I wasn't able 
to get the message through by now, I probably won't be able to do it 
with more tries anyway.



David


Herby

* This brings me to the idea of weak prevent-extension which maybe 
could be useful: disallowing [[Put]] on nonexistent slots but allowing 
[[DefineProperty]]. This could be especially useful with objects (that 
is, results of [[Construct]]), so their shape is weakly fixed - it is 
fixed with respect to assignment, but open to low-level tweaking when 
extending with some external mixin-like behaviour etc.
But I can see that this would probably lead to just using 
[[DefineProperty]] everywhere, just in case. Which is not good.
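
A rough sketch of how such a weak prevent-extension could be emulated with an
ES6 direct proxy (an illustration only, not a proposal):

function weakPreventExtensions(target) {
    return new Proxy(target, {
        set: function(t, name, value, receiver) {
            if (!(name in t)) {
                throw new TypeError("no such property to assign: " + name);
            }
            t[name] = value; // assignment to existing slots is still allowed
            return true;
        }
        // [[DefineOwnProperty]] is not trapped, so Object.defineProperty
        // can still add new slots for low-level tweaking.
    });
}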

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Brandon Benvie
It seems like you're indicating that changing a property to a value,
presumably a primitive, is somehow different from setting it to a function.
Regardless of anything else, that's not true even in the way you mean it
because a function can have a thunk that contains state and accomplishes
the same thing as setting primitive data type. It just can almost be used
for other non-data things too like methods. There's no way to differentiate
from a naive standpoint though.
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Brandon Benvie
Er also, not almost
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Brendan Eich

Brandon Benvie wrote:
It seems like you're indicating that changing a property to a value, 
presumably a primitive, is somehow different from setting it to a 
function.


I read Herby as arguing that overriding a prototype property is 
low-level, so one must use low-level Object.defineProperty. Due to setters, 
assignment is in contrast high-level: if you assign to try to create 
an own property, but there's a prototype setter with the same name, the 
setter will be invoked.
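
A quick illustration (made-up names) of the difference:

var proto = {
    set push(fn) { console.log("prototype setter ran; no own property created"); }
};
var obj = Object.create(proto);

obj.push = function(){};                  // [[Put]]: invokes the inherited setter
console.log(obj.hasOwnProperty("push"));  // false

Object.defineProperty(obj, "push", {      // [[DefineOwnProperty]]: ignores the setter
    value: function(){}, writable: true, configurable: true
});
console.log(obj.hasOwnProperty("push"));  // true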


I agree with Herby.

The problem for Mark Miller based on SES, not necessarily for David (who 
could change his code), is that extant JS libraries predate ES5 and use 
assignment. Perhaps they predate setters in the wild (first in 
SpiderMonkey over 12 years ago), or the author didn't think of setters.


Mark's SES workaround actually relies on setters not being overridden by 
assignment: before freezing common prototype objects that might 
(according to a clever heuristic) pose a problem, SES enumerates 
properties on them and replaces each with an accessor whose setter 
intentionally shadows (since setters receive the target |this|).
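
Roughly (a simplified sketch of that idea, not the actual SES code):

function makeOverridableByAssignment(proto, name) {
    var value = proto[name];
    Object.defineProperty(proto, name, {
        get: function() { return value; },
        set: function(newValue) {
            // |this| is the object that was assigned to, not the (soon to be
            // frozen) prototype, so shadow it with an own data property there.
            Object.defineProperty(this, name, {
                value: newValue, writable: true,
                enumerable: true, configurable: true
            });
        },
        enumerable: false,
        configurable: false
    });
}
// ...applied to selected properties before Object.freeze(proto).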


Anyway, I don't think function-valued vs. non-function-valued is the 
issue. It's override-of-data-property (vs. non-override-of-accessor).


/be

Regardless of anything else, that's not true even in the way you mean 
it because a function can have a thunk that contains state and 
accomplishes the same thing as setting primitive data type. It just 
can almost be used for other non-data things too like methods. There's 
no way to differentiate from a naive standpoint though.

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Herby Vojčík



Brandon Benvie wrote:

It seems like you're indicating that changing a property to a value,
presumably a primitive, is somehow different from setting it to a


I never mentioned a primitive; please don't put words into my mouth.



function. Regardless of anything else, that's not true even in the way


It does not depend on what the value is at all. A function is as good as 
a number or a plain object or an array or whatever.


The distinction is whether the property is used to store (published) 
state (from the API PoV) (and that state can be anything), or whether it 
is more part of the infrastructure of an object. That is, what is the 
primary API of the property name:


1. To hold a (settable) state (so it is primarily read by a.foo and used 
afterwards in various ways)? Then it should be set by assignment.


2. To use otherwise (most often |a.foo(args)|; another such use is maybe 
|if (a.isAnimal)| defined in the prototype)? Then it should be set by 
defineProperty; it is not meant to have an "I am something you should be 
setting by =" API.


Most often 1. is enumerable and 2. is non-enumerable. It is more or 
less the same philosophical distinction: between public API and 
private API.
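
In code (an illustration, names made up):

var dog = {};

// 1. published, settable state: assignment, enumerable
dog.name = "Rex";

// 2. infrastructure: defined rather than assigned, non-enumerable
Object.defineProperty(dog, "isAnimal", {
    value: true, writable: false, enumerable: false, configurable: true
});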



you mean it because a function can have a thunk that contains state and
accomplishes the same thing as setting primitive data type. It just can
almost be used for other non-data things too like methods. There's no
way to differentiate from a naive standpoint though.


Herby
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Brandon Benvie
Sorry, I had just completely misread what you were saying. My fault!
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Allen Wirfs-Brock

On Jul 28, 2012, at 5:37 AM, Herby Vojčík wrote:
 ...
 
 To be precise, [[Put]] and [[DefineProperty]] are different intents. 
 Developers may not like it, because they are used to [[Put]], but it is probably 
 needed to distinguish them.
 
 [[Put]] is a high-level contract (a, update your 'push' facet with value), 
 [[DefineProperty]] is a low-level contract (a, add/update your slot named 
 'push' with value).
 
 I am inclined to see [[Put]] used to shadow methods as an abuse of a high-level 
 interface to do low-level patching.
 
 But of course, unless there is nice sugar, everyone uses [[Put]] since it's 
 easier to write (and read).
 

I think there is a very important point here that I hope we don't lose in the 
weeds of this discussion.  The distinction between assignment and definition 
(ie, between [[Put]] and [[DefineOwnProperty]]) was not very important when 
all ES had was data properties and there was no way for ES code to manipulate 
property attributes. In those pre-ES5 days, [[DefineOwnProperty]] didn't even 
exist and the installation of object literal properties was specified using 
[[Put]] semantics.  In those days, it was fine to think of property definition 
as simply an assignment (ie, the = operator or [[Put]]) to an unused property name.

However, as soon as we have things like accessor properties, programmatically 
configurable attributes, methods with super bindings, real inheritance 
hierarchies, classes, etc., the distinction between assignment and definition 
becomes much more important.  Continuing to conflate them is going to lead to 
increasing confusion.  The override mistake issue is just the first and 
simplest of the sort of issues that result.  In a post-ES5 world, programmers 
really need to learn and use the distinction between property assignment and 
property definition.  To ensure this, we need to provide language features that 
guide them towards this understanding and proper usage.

Herby correctly identifies where we stand right now.  ES developers need and 
want something that approaches the convenience of = for dynamically defining 
properties. As long as we only have a procedural API (Object.defineProperty) 
for dynamic property definition, most won't learn the distinction and even those 
that do will frequently ignore it for the sake of convenience.  ES6 needs a 
concise and friendly way to dynamically define properties.  The syntax needs to 
approach the convenience of = but it needs to bring emphasis to the distinction 
between assignment and definition.  Without it, ES5+ES6 will 
collectively result in a more confusing and error-prone language.

More below...
 a call to Object.defineProperty instead, my first reaction would
 certainly be "but why isn't a regular assignment used here?". A comment
 could be added to explain the [[CanPut]] issue, but that's what I would
 call a boilerplate comment.
 
 So far, to the general question "why is Object.defineProperty used
 instead of a regular assignment here?", the only answer I find
 acceptable is "defining custom configurable/writable/enumerable",
 because these are things local to the code that have no syntax for them.
 In most cases, getter/setters can be defined in object literals.
 Adding "the prototype may be frozen, thus preventing shadowing" to the
 acceptable answers makes local code review harder.
 
 :-/ But that is how it is, no?
 
 Though not very constructive, I'd say this is the case where
 
 a.{
   push(elem) {
 ...
   }
 };
 
 is definitely missing.
 I remembered that .{ semantics was a [[Put]] semantic, so it wouldn't
 solve the problem. Did I remember something wrong?
 
 Of course. Mustache has the same semantics as extended literal, so it was 
 [[DefineProperty]] with appropriate enum/conf/writ (and setting home context 
 for methods, so in fact it did defineMethod).

I still think a dynamic property definition syntax can be based on something 
like mustache. Two months ago, there was interest at the TC39 meeting in 
further exploring mustache.  Some of that interest was motivated by these 
definitional issues.  However, our attempt to do this crashed and burned badly 
because we tried to accommodate a desire to make the same syntactic construct 
also serve as a cascade.  However, cascades require [[Put]]/[[Get]] semantics 
and this is in direct conflict with the requirements of dynamic property 
definition. Confusion about this is reflected in the quotes immediately above.  
We should have recognized before we even tried that trying to combine those 
two semantics just won't work.

However, here is the sketch of a proposal for something that might work.

We introduce a new operator that looks like  :=

This is the define properties operator.  Both the LHS and RHS must be objects 
(or ToObject convertible).  Its semantics is to [[DefineOwnProperty]] on the 
LHS obj a property corresponding to each RHS own property.  It does this with 
all reflectable own properties. It includes non-enumerable properties 

Re: July 25, 2012 - TC39 Meeting Notes

2012-07-28 Thread Rick Waldron
On Sat, Jul 28, 2012 at 9:04 PM, Allen Wirfs-Brock al...@wirfs-brock.comwrote:


 snip


I snipped, but I agree with all of your claims. While evangelizing our
intention to try for a .{} that supported [[Put]] and [[DefineOwnProperty]],
given something like this...

.{
  a: apple,
  b = banana
};

...the number one resistance to the mixed use of : and = was that most
developers did not realize there was a semantic difference and actually
expected us to assume the burden of specifying the magic that would make
this work correctly with just :

I submit the following survey results to support the above claim,
https://docs.google.com/spreadsheet/ccc?key=0Ap5RnGLtwI1RdDN3dm92aVJwWEZCMEU3RUN5OTdRTWc


(The live form with the survey question is here:
https://docs.google.com/spreadsheet/viewform?formkey=dDN3dm92aVJwWEZCMEU3RUN5OTdRTWc6MQ)


Pay specific attention to the comments where object literal syntax is
frequently suggested as preferential.

...more below



 We introduce a new operator that looks like  :=


I like this.



 This is the define properties operator.  Both the LHS and RHS must be
 objects (or ToObject convertible).  Its semantics is to
 [[DefineOwnProperty]] on the LHS obj a property corresponding to each RHS
 own property.  It does this with all reflectable own properties. It includes
 non-enumerable properties and unique named properties but not
 non-reflectable *private* name properties.  It rebinds methods with super
 bindings to the RHS to new methods that are super bound to the LHS.

 The above example would then be written as:

 a := {
   push(elem) {
 ...
   }
 };

 rather than, perhaps incorrectly as:

 a.push = function (elem) {
 ...
 };

 or, correctly but very inconveniently as:

 Object.defineProperty(a, "push", {writable: true, configurable: true,
 enumerable: true,
 value: function (elem) {
 ...
 }
 }
 );


Is there a mechanism for customizing writable: true, configurable: true,
enumerable: true?



 Note that while the above example uses an object literal as the RHS, it
 could be any object.  So, := is essentially an operator-level definition of
 one plausible semantics for an Object.extend function. Using an operator has
 usability advantages and it also makes it easier to optimize the very
 common case where the RHS will be a literal.
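
For concreteness, a rough sketch of that semantics as a plain function
(simplified; super rebinding and private name handling omitted):

function defineProperties(lhs, rhs) {
    Object.getOwnPropertyNames(rhs).forEach(function(name) {
        Object.defineProperty(lhs, name,
            Object.getOwnPropertyDescriptor(rhs, name));
    });
    return lhs;
}

// so   a := { push(elem) { /* ... */ } }   would act roughly like:
// defineProperties(a, { push: function(elem) { /* ... */ } });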

 := is used because it is suggestive of both property definition (the use
 of : in object literals) and of assignment (the = operator).  := also has a
 long history of use as an assignment-like operator in programming
 languages. The visual similarity of = and := should push ES programmers to
 think about them as situational alternatives whose semantic differences
 must be carefully considered.  The simple story is that one is used for
 assigning a value to an existing property and the other is used to define
 or over-ride the definition of properties.

 I really think in a language where we have both [[Put]] and
 [[DefineOwnProperty]] semantics that we really need both = and :=


As noted above, I feel there is sufficient evidence of the existing
confusion, and I agree that a syntactic distinction would help
reshape understanding as we move forward.


Rick



 Allen


___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss