Re: Block lambda is cool, its syntax isn't

2012-01-22 Thread Gavin Barraclough
As well as pragmas, macro-expansion would be a possibility!

define fn function
undef fn

G.

On Jan 20, 2012, at 6:40 PM, Brendan Eich wrote:



   	   
Allen Wirfs-Brock
January 20, 2012 5:41 PM

At the TC39 meeting we were trying to think of pragma candidates. It
wouldn't surprise me if JS programmers would happily trade one "use fn;"
per file in exchange for being able to use "fn" as a synonym for
"function". In terms of character counts, you come out ahead starting
with the second function definition.

Allen


I have considered this in the past. It always seemed too little, 
due to return. If it 
enabled another production:

 AssignmentExpression :
   fn Identifier_opt ( FormalParameterList_opt ) AssignmentExpression

(or we just added this unconditionally, without the pragma -- but 
the pragma is good too) then I'd be happy, finally.
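
(For illustration only -- hypothetical syntax based on the production above,
not anything agreed -- such an expression-bodied fn form would read like:)

// Hypothetical usage of the proposed expression-bodied `fn` form:
//   let double = fn (x) x * 2;
// which is roughly equivalent to today's:
var double = function (x) { return x * 2; };
double(3); // 6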
  
/be
  
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: shortcuts for defining block-local private names, plays nicely with @foo syntax

2012-01-22 Thread Herby Vojčík

Brendan Eich wrote:

Herby Vojčík mailto:he...@mailbox.sk
January 21, 2012 2:21 PM
Oh. I favor 1. Inspired by latest notes and for(let...) I would see
{
private foo;
...
}
desugared to
{
let foo = _the_real_foo;


Er, const, I hope -- not let. Right?

Of course. My mistake.


/be

Herby
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: shortcuts for defining block-local private names, plays nicely with @foo syntax

2012-01-22 Thread Herby Vojčík

Brendan Eich wrote:

And _the_real_foo should be expanded:

{
private foo;
...
}

desugars to

{
const foo = Name.create("foo");
...
}

with Name.create imported appropriately.


No, I thought more along the lines of:

// singleton code, in module level / program level
// generated from all private occurrences:
const __prvTable__ = Name.create();
// __prvTable__ being hardwired somehow so it does not clash
// and of course hidden from user
@__prvTable__ = [
  Name.create(),
  Name.create(),
  ... // n times when n is number of privates
];

// and the blocks desugared to
{
  const foo = module.@__prvTable__[42];
  // module.@__prvTable__ is just a hint for
  // somehow, get to the private table
  // it is up to implementation
  // 42 is just the example index
  ...
}

with Name.create imported appropriately.

/be

Herby
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Shouldn't timers be specified?

2012-01-22 Thread Andrea Giammarchi
process.nextTick has already landed in browsers as a concept; it's called
setImmediate:

http://msdn.microsoft.com/en-us/library/ie/hh673556(v=vs.85).aspx#setimmediate

About Rhino: that implementation is not equivalent to what we have in
browsers, where the most useful feature ever is rarely used out there even
though it is cross-browser everywhere but IE, and there the de-facto standard
is shimmable without problems.

I am talking about the extra arguments that setInterval/setTimeout accept, i.e.

setTimeout(callback, 1, some, thing);

where callback is

function callback(some, thing) { ... }

and those arguments will be passed to it when the timeout fires.
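
(For engines that ignore the extra arguments -- the IE case mentioned above --
a minimal shim along these lines is the usual workaround; the wrapper name
below is just for illustration:)

// Hypothetical wrapper: forward extra setTimeout arguments by closing over them.
function setTimeoutWithArgs(callback, delay) {
  var extra = Array.prototype.slice.call(arguments, 2);
  return setTimeout(function () {
    callback.apply(null, extra);
  }, delay);
}

setTimeoutWithArgs(callback, 1, "some", "thing");
// callback receives "some" and "thing" when the timer fires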

Here are the fully node.js-compatible Rhino timers I have used in wru:
https://github.com/WebReflection/wru/blob/master/src/rhinoTimers.js


That said, the fact that node.js and browsers are sometimes badly
misaligned is something I have already blogged about, and in this case
node.js does not even return an Int32 as browsers do; it returns an
object, still unique, but not a Number.

It would be nice to have these methods well defined across all platforms.

br


On Sun, Jan 22, 2012 at 6:19 AM, Brandon Benvie
bran...@brandonbenvie.com wrote:

 Absolutely agree. I don't see a place for Node's 1ms resolution in
 browsers, which was the impetus for raising the issue. I see a place for
 Node (and other non-browser platforms) to implement their own host timers
 that provide higher resolution (In fact Node's process.nextTick(callback)
 is a good example of host functionality that's useful but wouldn't belong
 in a JS spec). But the point is that the lack of specification has already
 resulted in incompatible implementations of ostensibly the same basically
 required core language functionality.
 ___
 es-discuss mailing list
 es-discuss@mozilla.org
 https://mail.mozilla.org/listinfo/es-discuss


___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Fork of classes proposal, @ for private members, class as operator

2012-01-22 Thread Herby Vojčík
I forked the classes proposal and made changes I would really like to 
see there. The proposal is here:


  http://dw.herby.sk/doku.php?id=es.next:classes.herby

and the diffs (side-by-side and inline) to the actual proposal are here:


http://dw.herby.sk/doku.php?id=es.next:classes.herby&rev=1327097746&do=diff&difftype=sidebyside

http://dw.herby.sk/doku.php?id=es.next:classes.herby&rev=1327097746&do=diff&difftype=inline

Main changes:
  - private(expr) not used, foo.@bar private-name syntax instead
  - private keyword as a shortcut to declare private names
  - class is an operator on full-fledged object literal
  - | used for inheritance
  - parallel prototype-prototype and constructor-constructor chains
  - shorter (class body definition and private(foo) section removed)

Herby
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Questions on Map and Set based on SpiderMonkey implementation

2012-01-22 Thread David Bruant
Hi,

SpiderMonkey implementation landed today. I was looking over the commit
[1] and had a question.
There is a test:

assertEq(Object.prototype.toString.call(new Map), "[object Map]");

I can't see anything about it either in the latest spec draft or in
the wiki proposal. Is this how Object.prototype.toString should behave? I
assume yes, but since there is no mention anywhere, it's probably worth
discussing it.
Likewise for Set.

Also, what is the benefit of Map.prototype being a map and Set.prototype
being a set? This is the kind of thing that is true for all object
types and, apparently, it has bitten (Date.prototype and
WeakMap.prototype offer a communication channel for potential attackers)
more than it has helped.

David

[1] https://hg.mozilla.org/mozilla-central/rev/6a5e20a0f741
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Questions on Map and Set based on SpiderMonkey implementation

2012-01-22 Thread Herby Vojčík

David Bruant wrote:

Hi,

SpiderMonkey implementation landed today. I was looking over the commit
[1] and had a question.
There is a test:

assertEq(Object.prototype.toString.call(new Map), "[object Map]");

I can't see anything about it neither in the latest spec draft nor in
the wiki proposal. Is it how Object.prototype.toString should behave. I
assume yes, but since there is no mention somewhere, it's probably worth
discussing it.
Likewise for Set.


It is how Object.prototype.toString has always (at least since ES5) behaved. 
It was in the spec. The newest spec has:


15.2.4.2 Object.prototype.toString ( )
When the toString method is called, the following steps are taken:
1. If the this value is undefined, return "[object Undefined]".
2. If the this value is null, return "[object Null]".
3. Let O be the result of calling ToObject passing the this value as the 
argument.
4. If O has a [[NativeBrand]] internal property, let tag be the 
corresponding value from Table 23.
5. Else, let tag be the string value "Object".
6. Return the String value that is the result of concatenating the three 
Strings "[object ", tag, and "]".


And Table 23 that follows contains strings for native object types, 
like Number, Math, Array, JSON etc. There is nothing for Map nor Set, 
but then there is no Map or Set in the spec in the first place (or I failed 
to find them).



David

[1] https://hg.mozilla.org/mozilla-central/rev/6a5e20a0f741


Herby
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Non-extensible WeakMaps

2012-01-22 Thread Tab Atkins Jr.
On Sun, Jan 22, 2012 at 10:28 AM, David Bruant bruan...@gmail.com wrote:
 Hi,

 In Firefox Aurora as well as in Chromium 18, running the following
 -
 var wm = new WeakMap();
 var o = {};

 Object.preventExtensions(wm);

 wm.set(o, 1);
 console.log(wm.get(o)); // 1
 -

 Is this something that is wanted?
 Same question for Maps and Sets.

Calling set() on a WeakMap doesn't add any properties to the WeakMap
object, so yes, it's expected that preventExtensions has no effect.

~TJ
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Non-extensible WeakMaps

2012-01-22 Thread David Bruant
On 22/01/2012 20:16, Tab Atkins Jr. wrote:
 On Sun, Jan 22, 2012 at 10:28 AM, David Bruant bruan...@gmail.com wrote:
 Hi,

 In Firefox Aurora as well as in Chromium 18, running the following
 -
 var wm = new WeakMap();
 var o = {};

 Object.preventExtensions(wm);

 wm.set(o, 1);
 console.log(wm.get(o)); // 1
 -

 Is this something that is wanted?
 Same question for Maps and Sets.
 Calling set() on a WeakMap doesn't add any properties to the WeakMap
 object, so yes, it's expected that preventExtensions has no effect.
I agree that Object.preventExtensions is defined as preventing addition
of new properties. Likewise, Object.freeze and Object.seal only act on
object properties (extended to private properties?).
But the broader problem they are addressing is reducing the mutability
of objects. WeakMaps, maps and sets bring a new form of mutability which
cannot be implemented in the form of private properties (I think at
least). So the question is:

Should Object.preventExtensions be extended to reduce WeakMaps, Maps and
Sets mutability? Likewise for Object.seal|freeze?

David
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Non-extensible WeakMaps

2012-01-22 Thread Herby Vojčík

David Bruant wrote:

I agree that Object.preventExtensions is defined as preventing addition
of new properties. Likewise, Object.freeze and Object.seal only act on
object properties (extended to private properties?).
But the broader problem they are addressing is reducing the mutability
of objects. WeakMaps, maps and sets bring a new form of mutability which
cannot be implemented in the form of private properties (I think at
least). So the question is:

Should Object.preventExtensions be extended to reduce WeakMaps, Maps and
Sets mutability? Likewise for Object.seal|freeze?


It would be a special case. I'd say no.
Collections should probably get their own API for this, analogous to 
Object.preventExtensions|seal|freeze.


Where to put it is the question. It should probably be callable like 
Map.seal(aMap), Array.freeze(anArray), etc. If a collection hierarchy has a 
common ancestor, the constructor functions can inherit in parallel with the 
prototypes, so it may in fact reside in the common base class(es).
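
(No such API exists today; a rough userland approximation, with a hypothetical
helper name, would be to shadow the mutating methods on the particular
instance:)

// Hypothetical sketch: "seal" one Map instance by shadowing its mutators.
// Note Map.prototype.set.call(map, ...) would still bypass this.
function sealMap(map) {
  map.set = function () { throw new TypeError("map is sealed"); };
  map["delete"] = function () { throw new TypeError("map is sealed"); };
  return map;
}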



David


Herby

P.S.: The Array.xxx version should work only on indexed elements, 
not on other properties (that is Object.xxx's job). Of course the same goes 
for Map etc., but there it is natural.

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Fixing instanceof (Array.isArray() etc.)?

2012-01-22 Thread Brendan Eich

Axel Rauschmayer mailto:a...@rauschma.de
January 21, 2012 10:13 PM
Duck typing is an interesting consideration. If one could express

 x follows ArraySpec

then that would work for both array-like objects and Array instances 
from other frames.


You're talking about contracts here. Not ready for standardization, but see

http://disnetdev.com/contracts.coffee/

/be
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Deep cloning objects defined by JSON.

2012-01-22 Thread Jake Verbaten
Most of the time the purpose of deep cloning objects is deep cloning data
structures.

It's been discussed that generically deep cloning proxies, privates and
functions is a non-trivial problem.

However it would be of value to have a mechanism to deep clone anything
that would be valid JSON (limiting to JSON is arbitrary, but it's a well-defined
subset and none of it involves difficult points to resolve).

This gives us

 - An efficient deep clone implementation in the js engine
 - Solves difficulties with deep cloning by disallowing difficult objects
from being deep cloned.
 - Gets rid of every clone function that every library has.

I presume this would literally be a highly optimised version of
JSON.parse(JSON.stringify(o)).

Potential issues

 - subset of JSON is too restricted to be useful
 - Proxies/private state may cause issues (the same issues would apply to
JSON.stringify ?)
 - What's the value of the [[Prototype]] of the clone? (JSON.parse uses the
standard [[Prototype]] for the stringified object)
 - Do we expect anything sensible to happen with host objects? (JSON.parse
returns objects with few or no properties for host objects)
 - Do we solve cyclic references? (JSON.parse fails on them)
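
(For concreteness, a minimal self-hosted sketch of the JSON-restricted deep
clone being described -- no cycle detection, mirroring the
JSON.parse(JSON.stringify(o)) behaviour for JSON-compatible input; the
function name is made up:)

// Sketch: deep clone of JSON-compatible values only
// (objects, arrays, strings, numbers, booleans, null).
function cloneJSON(value) {
  if (value === null || typeof value !== "object") {
    return value;                        // primitives are returned as-is
  }
  if (Array.isArray(value)) {
    return value.map(cloneJSON);         // clone each element
  }
  var out = {};
  for (var key in value) {
    if (Object.prototype.hasOwnProperty.call(value, key)) {
      out[key] = cloneJSON(value[key]);  // clone own enumerable properties
    }
  }
  return out;
}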
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Shouldn't timers be specified?

2012-01-22 Thread Brendan Eich

Brandon Benvie mailto:bran...@brandonbenvie.com
January 21, 2012 9:59 PM
Sorry to spam this thread but I wanted to get the relevant points in 
up front:


'Actually, wait a minute -- I think I disagree with you here.


On what? Being past the deadline? Not rushing a de jure standard before 
we have synthesized the right semantics from relevant JS embeddings?


Spec the unofficial agreement, including the minimal(/maximal if it 
exists) time constraints, and go from there. This is needed.


Why? What goes wrong if we go light on execution model one more time? I 
think nothing.


But in fact we are going to get a little more into execution model in 
ES6. How much remains to be seen. We discussed it at last week's meeting.


But this is not an all-or-nothing proposition, and I do not see the 
do-or-die requirement. Reality is what it is. HTML5 captures a lot. 
Node.js conforms. ES6 saying more doesn't alter these facts.


/be
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: shortcuts for defining block-local private names, plays nicely with @foo syntax

2012-01-22 Thread Brendan Eich
I suspect your very complicated translation with __prvTable__ etc. is 
intended to hoist private somehow, once per declaration in source rather 
than once per evaluation of declaration. That's too restrictive, since 
private declarations can be placed in outer blocks or closures for 
singleton naming, or moved into inner constructor-like functions for 
per-instance private names.


Users should be able to declare class-private and instance-private 
names, in other words.


Your block examples, which I modified, are not complete enough to judge 
what's wanted. By asserting singleton private name per source 
declaration you are deciding prematurely and overconstraining the 
feature. Let the user put the block or closure at the right inner or 
outer level and declare there. There are an arbitrary number of 
generative layers (generations): class static private, class instance 
private, inner closure private, etc. etc.
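
(To illustrate the placement point, using the Name.create desugaring from
elsewhere in the thread -- a sketch, not part of any proposal:)

// Singleton (class-private) name: declared once, shared by every instance.
const classFoo = Name.create("foo");

function Point(x) {
  this[classFoo] = x;                  // same key for all Point instances

  // Per-instance private name: a fresh key on every construction.
  const instanceFoo = Name.create("foo");
  this[instanceFoo] = x;               // a different key for each instance
}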


/be


Herby Vojčík mailto:he...@mailbox.sk
January 22, 2012 10:16 AM
Brendan Eich wrote:

And _the_real_foo should be expanded:

{
private foo;
...
}

desugars to

{
const foo = Name.create("foo");


BTW, would this not mean it is different in every run?


...
}

with Name.create imported appropriately.

/be


Herby
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss

Brendan Eich mailto:bren...@mozilla.org
January 21, 2012 4:41 PM

And _the_real_foo should be expanded:

  {
private foo;
...
  }

desugars to

  {
const foo = Name.create("foo");
...
  }

with Name.create imported appropriately.

/be
Brendan Eich mailto:bren...@mozilla.org
January 21, 2012 4:39 PM

Er, const, I hope -- not let. Right?

/be
Herby Vojčík mailto:he...@mailbox.sk
January 21, 2012 2:21 PM
Brendan Eich wrote:

Herby Vojčík mailto:he...@mailbox.sk
January 21, 2012 1:56 PM
Brendan Eich wrote:

private foo;

@foo = bar; // this-relative private foo

return @foo === other.@foo;

return {@foo: bar};


This helps a lot, but there still _is_ an
identifier foo having that private name in its value.


This was not decided, as far as I know. There are two choices:

1. private foo; defines a lexical binding used to denote the private
name object, as well as after @ to use it to access a property in an
object.

2. Rather, the *only* places foo would be allowed after private foo;
above are those after an @. IOW it would be fine to use let foo = 42;
and private foo; without conflict. Some further syntax, a la the old
#.foo proposal (obsoleted in terms of # now), would be required to
reflect foo from lexical-to-the-right-of-@ space into a first-class
private name object reference.


Oh. I favor 1. Inspired by latest notes and for(let...) I would see
  {
private foo;
...
  }
desugared to
  {
let foo = _the_real_foo;
...
  }
where _the_real_foo is defined somewhere at the module or program 
level such that it will not clash (hardwired private name or index to 
a table or whatever) and the rest is just reusing existing rules.


2. is too magical (for me).


/be


Herby

Brendan Eich mailto:bren...@mozilla.org
January 21, 2012 2:11 PM

Herby Vojčík mailto:he...@mailbox.sk
January 21, 2012 1:56 PM
Brendan Eich wrote:

Herby Vojčík mailto:he...@mailbox.sk
January 21, 2012 1:33 PM
Brendan Eich wrote:
http://wiki.ecmascript.org/doku.php?id=strawman:private_names#private_declarations_exist_in_a_separate_name_space_parallel_to_the_variable_binding_environment 



The last really was too much for some folks. It makes the meaning 
of an
identifier after . or before : in an object literal depend on a 
binding

declaration, possibly far above.


Thank you. I did not know of these. The problem in the third one (and
the solution) are really crazy... I would do the early error if there
would be a clash (akin to double let).


The way to resolve the two-lexical-binding-chains issue for private
declarations is not to overload . (member expression; also : in object
literals), by requiring @ instead:

private foo;

@foo = bar; // this-relative private foo

return @foo === other.@foo;

return {@foo: bar};


This helps a lot, but there still _is_ (I only proposed a convenient 
shortcut, not some magic special names for private names) an 
identifier foo having that private name in its value.


This was not decided, as far as I know. There are two choices:

1. private foo; defines a lexical binding used to denote the private 
name object, as well as after @ to use it to access a property in an 
object.


2. Rather, the *only* places foo would be allowed after private foo; 
above are those after an @. IOW it would be fine to use let foo = 
42; and private foo; without conflict. Some further syntax, a la 
the old #.foo proposal (obsoleted in terms of # now), would be 
required to reflect foo from lexical-to-the-right-of-@ space into a 
first-class private name object reference.


So it _would_ clash if foo was defined in code. But I believe 

Re: Non-extensible WeakMaps

2012-01-22 Thread Brendan Eich

Herby Vojčík mailto:he...@mailbox.sk
January 22, 2012 11:42 AM
David Bruant wrote:

I agree that Object.preventExtensions is defined as preventing addition
of new properties. Likewise, Object.freeze and Object.seal only act on
object properties (extended to private properties?).


No, we agreed the property visiting under freeze and seal does *not* 
affect private-object-named properties.



But the broader problem they are addressing is reducing the mutability
of objects. WeakMaps, maps and sets bring a new form of mutability which
cannot be implemented in the form of private properties (I think at
least). So the question is:

Should Object.preventExtensions be extended to reduce WeakMaps, Maps and
Sets mutability? Likewise for Object.seal|freeze?


It would be special case. I'd say no.


I agree so far as this goes.

Mark has thought deeply about this topic, with the use-case of 
preventing extensions being the closing of covert or side channels in JS 
objects. For private names and weak maps, there is no channel to close 
via preventExtensions, since the attacker by definition doesn't have the 
key. For Maps and Sets, which support enumeration, the thread model is 
different.


/be
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Rick Waldron

 Potential issues

  - subset of JSON is too restricted to be useful


This alone seems like a deal-breaker/non-starter. How would you copy
methods? Forgetting about cyclic reference exceptions for a moment:


var o = {
  s: "string",
  n: 1,
  a: [ 1, 2, 3, 4 ],
  o: {
method: function( prop ) {
      return "stuff";
},
n: null,
u: undefined
  },
  bt: true,
  bf: false
},
clone = JSON.parse(JSON.stringify(o));

 clone

{
  s: 'string',
  n: 1,
  a: [ 1, 2, 3, 4 ],
  o: {
n: null
  },
  bt: true,
  bf: false
}


While it has the benefit of losing all of its references to the original
object, it also loses any methods or initialized-but-unassigned (undefined)
properties. Security concerns trump the inclusion of methods in valid JSON.

Rick





  - Proxies/private state may cause issues (the same issues would apply to
 JSON.stringify ?)
  - What's the value of the [[Prototype]] of the clone? (JSON.parse uses
 the standard [[Prototype]] for the stringified object)
  - Do we expect anything sensible to happen with host objects? (JSON.parse
 returns objects with few or no properties for host objects)
  - Do we solve cyclic references? (JSON.parse fails on them)

 ___
 es-discuss mailing list
 es-discuss@mozilla.org
 https://mail.mozilla.org/listinfo/es-discuss


___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: shortcuts for defining block-local private names, plays nicely with @foo syntax

2012-01-22 Thread Herby Vojčík

Brendan Eich wrote:

I suspect your very complicated translation with __prvTable__ etc. is
intended to hoist private somehow, once per declaration in source rather
than once per evaluation of declaration. That's too restrictive, since


Exactly. By design.


private declarations can be placed in outer blocks or closures for
singleton naming, or moved into inner constructor-like functions for
per-instance private names.


I proposed private to really only do the conventional per-source 
singleton aliasing (I dubbed 'const foo = perSourceTable[index]' 
aliasing for this discussion).


The motivation for this is exactly the "can be placed in outer blocks or 
closures" case. My proposal for 'private' is precisely and only to create a 
conventional way to do these singletons without the need for those annoying 
wrappers.


Maybe I should include an example to illustrate how hard and 
inconceivable it is to create such a singleton for use a few { levels 
deeper, going there and back, creating more of them for many nested levels 
in one outer place, and so forth, but I believe you can imagine.



Users should be able to declare class-private and instance-private
names, in other words.


Instance-private:
These cases do not need any special syntax. They can use const foo = 
Name.create(). It is very little hassle. I proposed 'private' only for 
singletons, because for them the convenience is really useful.


Class-private:
Use singleton in class block scope.


Your block examples, which I modified, are not complete enough to judge
what's wanted. By asserting singleton private name per source
declaration you are deciding prematurely and overconstraining the
feature. Let the user put the block or closure at the right inner or


It was meant as a convenience only for singletons, as they are the hard 
case. The other cases are easy without help from 'private'.


So the user can handle the local case there by normal means.
Or he can handle the hard case (a singleton with a wrapper) with the convenient 
'private'.



outer level and declare there. There are an arbitrary number of
generative layers (generations): class static private, class instance
private, inner closure private, etc. etc.


And I say (and hopefully I am not mistaken) that the runtime-local ones are 
easy to do by playing with let, const and Name.create(); and the 
lexically-local ones are hard, so the convenience is aimed especially at them.


Also, I'd say, the semantics of private can then be straightforward: for 
_any_ {...} block, the private key is only created once and visible only 
in the block. Never mind which block it is; you have a spatially 
local, temporally shared key for your use.
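
(Illustration only, using the same Name.create desugaring: the key is created
once per source block, so every evaluation of the block shares it.)

// The desugared key is created once per source block,
// so every call of make() uses the same private key.
const __foo = Name.create("foo");      // hoisted once for the block below
function make(v) {
  // original source:  { private foo; ... }
  var obj = {};
  obj[__foo] = v;
  return obj;
}
// make(1) and make(2) both key their property with the single __foo.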



/be


Herby
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Shouldn't timers be specified?

2012-01-22 Thread Rick Waldron
This is a perfect use case for the forthcoming module system (similar to
the way Globalization is being developed). Dave Herman and I had a brief
exchange over Twitter that began with my desire for a migration of
parseInt and parseFloat to Math, which I followed with a suggestion to do
the same with setTimeout and setInterval (despite those currently existing
in the realm of DOM APIs), moving them to an imaginary Timer object.

Rick



On Sun, Jan 22, 2012 at 3:00 PM, Brendan Eich bren...@mozilla.org wrote:

 Brandon Benvie mailto:bran...@brandonbenvie.com
 January 21, 2012 9:59 PM

 Sorry to spam this thread but I wanted to get the relevent points in up
 front:

 'Actually, wait a minute -- I think I disagree with you here.


 On what? Being past the deadline? Not rushing a de-jure standard before we
 have synthesized the right semantics from relevant JS embeddings?


  Spec the unofficial agreement, including the minimal(/maximal if it
 exists) time constraints, and go from there. This is needed.


 Why? What goes wrong if we go light on execution model one more time? I
 think nothing.

 But in fact we are going to get a little more into execution model in ES6.
 How much remains to be seen. We discussed it at last week's meeting.

 But this is not an all-or-nothing proposition, and I do not see the
 do-or-die requirement. Reality is what it is. HTML5 captures a lot. Node.js
 conforms. ES6 saying more doesn't alter these facts.


 /be
 ___
 es-discuss mailing list
 es-discuss@mozilla.org
 https://mail.mozilla.org/listinfo/es-discuss

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Non-extensible WeakMaps

2012-01-22 Thread David Bruant
On 22/01/2012 21:11, Brendan Eich wrote:
 Herby Vojčík mailto:he...@mailbox.sk
 January 22, 2012 11:42 AM
 David Bruant wrote:
 I agree that Object.preventExtensions is defined as preventing addition
 of new properties. Likewise, Object.freeze and Object.seal only act on
 object properties (extended to private properties?).

 No, we agreed the property visiting under freeze and seal does *not*
 affect private-object-named properties.
Ok.

 But the broader problem they are addressing is reducing the mutability
 of objects. WeakMaps, maps and sets bring a new form of mutability
 which
 cannot be implemented in the form of private properties (I think at
 least). So the question is:

 Should Object.preventExtensions be extended to reduce WeakMaps, Maps
 and
 Sets mutability? Likewise for Object.seal|freeze?

 It would be special case. I'd say no.

 I agree so far as this goes.

 Mark has thought deeply about this topic, with the use-case of
 preventing extensions being the closing of covert or side channels in
 JS objects. For private names and weak maps, there is no channel to
 close via preventExtensions, since the attacker by definition doesn't
 have the key.
I'm not sure I understand what you're calling /the/ key.
WeakMap.prototype, being itself a weakmap, does provide a covert channel
(since everyone has access to the same primordial object identities).
Of course, it can be repaired [1], but it's quite unfortunate to have to
repair a feature that isn't part of a standard yet, isn't it?
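
(For concreteness, the covert channel being described is roughly this,
assuming WeakMap.prototype really is a working weakmap as in the current
implementations:)

// Two parties that share only the primordials could communicate
// through WeakMap.prototype itself:
var sharedKey = Object.prototype;            // any primordial identity will do
WeakMap.prototype.set(sharedKey, "secret");  // party A writes
WeakMap.prototype.get(sharedKey);            // party B reads "secret"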

I'm seeing that it is suggested [2] that WeakMap.prototype.set(k,v) and
WeakMap.prototype.delete(k) throw a TypeError (instead of trying to
mutate WeakMap.prototype). Instead of WeakMap.prototype being a
non-mutable weakmap, what about it being not a weakmap at all? (and
throw for .has and .get as well)

 For Maps and Sets, which support enumeration, the threat model is
 different.
Allowing any type to be a key also changes the threat model, because it
means that prior arrangement can be enough to communicate.

However, besides the {WeakMap|Map|Set}.prototype, I do not see anything
that cannot be controlled with a revocable caretaker.

David

[1]
http://code.google.com/p/es-lab/source/browse/trunk/src/ses/initSES.js#2237
[2] https://bugzilla.mozilla.org/show_bug.cgi?id=656828
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Jake Verbaten
On Sun, Jan 22, 2012 at 8:29 PM, Rick Waldron waldron.r...@gmail.com wrote:

 Potential issues

  - subset of JSON is too restricted to be useful


 This alone seems like a deal-breaker/non-starter. How would you copy
 methods? Forgetting about cyclic reference exceptions for a moment:


The idea here is that methods do not belong in data structures (clone
should efficiently clone data). A possible solution would be to allow
you to set the [[Prototype]] of the returned clone through the API somehow
and then store methods on prototypes.

It does gain the benefit of not having to document the edge-case behaviour
for cloning methods. It would presumably also be an API that can
efficiently clone the new binary data types. The main purpose is efficient
in-memory copies of data, not generic cloning of things.
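
(Purely as an illustration of that shape -- no such API exists, and the helper
below just stands in for a hypothetical native clone:)

// Hypothetical: clone the JSON-compatible data, then supply behaviour
// via [[Prototype]] instead of own methods.
function cloneWithProto(data, proto) {
  var copy = JSON.parse(JSON.stringify(data)); // stand-in for a native deep clone
  var result = Object.create(proto);           // behaviour lives on the prototype
  Object.keys(copy).forEach(function (k) { result[k] = copy[k]; });
  return result;
}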

If we add a clone, we probably want to add support for cloning binary data
types to the list as well.
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Shouldn't timers be specified?

2012-01-22 Thread Jorge
On 22/01/2012, at 21:00, Brendan Eich wrote:
 Brandon Benvie mailto:bran...@brandonbenvie.com
 January 21, 2012 9:59 PM
 Sorry to spam this thread but I wanted to get the relevent points in up 
 front:
 
 'Actually, wait a minute -- I think I disagree with you here.
 
 On what? Being past the deadline? Not rushing a de-jure standard before we 
 have synthesized the right semantics from relevant JS embeddings?
 
 Spec the unofficial agreement, including the minimal(/maximal if it exists) 
 time constraints, and go from there. This is needed.
 
 Why? What goes wrong if we go light on execution model one more time? I think 
 nothing.
 
 But in fact we are going to get a little more into execution model in ES6. 
 How much remains to be seen. We discussed it at last week's meeting.
 
 But this is not an all-or-nothing proposition, and I do not see the do-or-die 
 requirement. Reality is what it is. HTML5 captures a lot. Node.js conforms. 
 ES6 saying more doesn't alter these facts.

Now isn't that ~ the opposite of what you said on 2011-03-18 in David Bruant's 
"Bringing setTimeout to ECMAScript" thread?

quote
Add to that the fact that Netscape and Microsoft failed, or chose not to, 
standardize the DOM level 0, and we have the current split where setTimeout is 
in HTML5 but the core language is embedded with increasing success in 
non-browser, no-DOM host environments *that want setTimeout*.

I'm open to Ecma TC39 absorbing setTimeout and the minimum machinery it 
entrains. We should ping Hixie.
/quote

Why ?
What has changed ?

P.S.
Node.js does *not* conform. Not at all. Not only does it not clamp to 4ms (which 
happens to be a good thing, IMO), but its timers often fire out of order!
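
(A minimal ordering check one might run -- it only observes behaviour, and the
results are environment-dependent:)

var order = [];
setTimeout(function () { order.push(1); }, 5);
setTimeout(function () { order.push(2); }, 5);
setTimeout(function () { console.log(order); }, 50); // [1, 2] if same-delay timers fire in scheduling order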
-- 
Jorge.
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Rick Waldron
On Sun, Jan 22, 2012 at 4:05 PM, Jake Verbaten rayn...@gmail.com wrote:

 On Sun, Jan 22, 2012 at 8:29 PM, Rick Waldron waldron.r...@gmail.com wrote:

 Potential issues

  - subset of JSON is too restricted to be useful


 This alone seems like a deal-breaker/non-starter. How would you copy
 methods? Forgetting about cyclic reference exceptions for a moment:


 The idea here is that methods do not belong in data structures (clone
 should be to efficiently clone data).


This is already too much of an unfortunate restriction.

What about calculated get properties:

 var o = {
...   get foo() {
... return "foo";
...   }
... },
... clone = JSON.parse(JSON.stringify(o));

 clone
{ foo: 'foo' }




 a possible solution would be allow you to set the [[Prototype]] of the
 returned clone through the API somehow and then store methods on prototypes.

 It does gain the benefit of not having to document the edge-case behaviour
 for cloning methods. It would presumably also be an API that can
 efficiently clone the new binary data types. The main purpose is efficient
 in memory copies of data and not generic cloning of things.

 If we add a clone, we probably want to add support for cloning binary data
 types to the list as well.

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Jake Verbaten

 The idea here is that methods do not belong in data structures (clone
 should be to efficiently clone data).


 This is already too much unfortunate restriction.

 What about calculated get properties:

  var o = {
 ...   get foo() {
 ... return "foo";
 ...   }
 ... },
 ... clone = JSON.parse(JSON.stringify(o));

  clone
 { foo: 'foo' }


I don't know what the sensible choice here is. Could be either way.

You're right, restrictions are annoying.
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: The global object as the global scope instance object

2012-01-22 Thread Gavin Barraclough
On Jan 20, 2012, at 4:23 PM, Allen Wirfs-Brock wrote:
 I'm not yet convinced that you need to publicly expose such global object 
 properties accessor properties.  I think it is possible to hide the 
 mechanisms.

I'm strongly inclined to agree with this sentiment – engine implementors might 
choose internally to utilize accessors in their implementations, but from a 
user perspective it seems to make much more sense that these remain as regular 
data properties.  It is worth bearing in mind that 'const' is already available 
outside of strict mode and we need to be considerate of any breaking changes we 
make to existing semantics.  (We could define incompatible semantics for 
strict-mode only, but this would likely lead to a confusing divergence in 
semantics for users – it could be horribly confusing if const variables in 
non-strict code were to be implemented as data properties whilst those in 
strict code were defined as accessor properties).

alert('value' in Object.getOwnPropertyDescriptor(this, 'x'));        // Firefox: true (data property)
alert(Object.getOwnPropertyDescriptor(this, 'x').writable === true); // Firefox: false (non-writable)

alert(Object.getOwnPropertyDescriptor(this, 'x').value);             // Firefox: undefined (not yet initialized)
x = 1;
alert(Object.getOwnPropertyDescriptor(this, 'x').value);             // Firefox: undefined (assignment had no effect)
const x = 2;
alert(Object.getOwnPropertyDescriptor(this, 'x').value);             // Firefox: 2 (initialized by the declaration)
x = 3;
alert(Object.getOwnPropertyDescriptor(this, 'x').value);             // Firefox: 2 (assignment to non-writable ignored)

This script tests the presence and writability of a property on the global 
object, and the effect of attempting to assign to it in non-strict code.  It 
demonstrates a non-strict const behaving as a regular non-writable property in 
non-strict code.

On Firefox the above script outputs true, false, undefined, undefined, 2, 2, 
which to my mind makes a lot of sense.  It perfectly matches, as far as one can 
test, any other regular non-writable property that you could create on an 
object (of course there is not normally the opportunity to split creation and 
initialization of a non-writable property).

This behaviour is also sensibly consistent with that for let and var.  Running 
the above script, changing 'const' to 'var' outputs true, true, undefined, 1, 
2, 3 (indicating the property is writable, and all assignments succeed), and 
for 'let' currently also outputs true, true, undefined, 1, 2, 3 on FireFox.  
To my mind this behaviour should probably change slightly.  With a temporal 
dead zone implemented I would expect an attempt to [[DefineOwnProperty]] to an 
uninitialized value to be rejected (though I would not expect an exception to 
be thrown from non-strict code) so I would expect the output for 'let' to be 
true, true, undefined, undefined, 2, 3 (the assignment of 'a' silently fails 
leaving the value unchanged).

I would suggest that specifying this as a piece of additional hidden internal 
state on a data descriptor (rejecting attempts to define or access the value 
prior to initialization, throwing in strict-mode and silently ignoring in 
non-strict) makes most sense on a number of grounds:

 – compatibility with existing non-strict const implementations.
 – consistency with appearance of var properties on the global object.
 – consistency with appearance of regular non-writable properties on other 
objects.
 – consistency with behaviour of access to regular non-writable properties on 
other objects.
 – data properties typically have higher performance in implementations, 
specifying all global let and const properties to be accessors may introduce 
unnecessary overhead.

cheers,
G.

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread gaz Heyes
I was pondering about this on Twitter. At first I thought of using cyclic
variables to resolve references to objects within the JSON object, but
actually we just need 'this' to work within object literals and be allowed
in the specification. For example this works currently:
({a:function(){
  return this.b;
},b:123}).a()

But it would be nicer to resolve 'this' inside an object literal property to
be the literal itself rather than window or undefined:
({a:this,b:123}).a.b

This would make JSON much smaller and allow circular references without
losing data.
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Questions on Map and Set based on SpiderMonkey implementation

2012-01-22 Thread Andrea Giammarchi
Random thoughts: it still looks weird to me to prefix undefined and null
with "object", plus there is no Undefined [[Class]], nor a Null one, so
despite the good intention I believe null, NaN, and undefined should
return "[not an object]", and primitives should return, for example,
"[primitive String]" rather than "[object String]", since all primitives are
metaphorically frozen with no extensions allowed ... i.e.

var s = "";
s.whatever = 123;
s.whatever; // undefined

br

On Sun, Jan 22, 2012 at 7:06 PM, Herby Vojčík he...@mailbox.sk wrote:

 David Bruant wrote:

 Hi,

 SpiderMonkey implementation landed today. I was looking over the commit
 [1] and had a question.
 There is a test:

 assertEq(Object.prototype.toString.call(new Map), "[object Map]");

 I can't see anything about it neither in the latest spec draft nor in
 the wiki proposal. Is it how Object.prototype.toString should behave. I
 assume yes, but since there is no mention somewhere, it's probably worth
 discussing it.
 Likewise for Set.


 It is how Object.prototype.toString always (at least since ES5) behaved.
 It was in the spec. The newest spec has:

 15.2.4.2 Object.prototype.toString ( )
 When the toString method is called, the following steps are taken:
 1. If the this value is undefined, return "[object Undefined]".
 2. If the this value is null, return "[object Null]".
 3. Let O be the result of calling ToObject passing the this value as the
 argument.
 4. If O has a [[NativeBrand]] internal property, let tag be the
 corresponding value from Table 23.
 5. Else, let tag be the string value "Object".
 6. Return the String value that is the result of concatenating the three
 Strings "[object ", tag, and "]".

 And the Table23 that follows contains strings for native object types,
 like Number, Math, Array, JSON etc. There is nothing for Map nor Set, but
 there is no Map or Set there in the first place (or I failed in searching
 them).

  David

 [1] 
 https://hg.mozilla.org/mozilla-central/rev/6a5e20a0f741


 Herby

 ___
 es-discuss mailing list
 es-discuss@mozilla.org
 https://mail.mozilla.org/listinfo/es-discuss

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Allen Wirfs-Brock

On Jan 22, 2012, at 11:58 AM, Jake Verbaten wrote:

 Most of the time the purpose of deep cloning objects is deep cloning data 
 structures. 
 
 It's been discussed that generically deep cloning proxies, privates and 
 functions is a non-trivial problem. 
 
 However it would be of value to have a mechanism to deep clone anything that 
 would be valid JSON (Limiting to JSON is arbitary, it's a well defined subset 
 and none of the subset would involve difficult points to resolve).

So why should we expect a native deep clone function to be significantly 
faster than a JavaScript version of the same function running on a modern 
high-performance JS engine? After all, a clone basically just does function calls, 
property lookup, object creation and property creation. These really are the 
foundation operations of most data-structure-intensive applications.  If a 
pure JS deep clone is too slow then many other data-structure-driven functions 
are also going to be too slow.  If, in fact, a pure JS implementation of deep 
clone on an optimizing engine is still significantly slower than a native code 
implementation on the same engine, then perhaps we would be better served by 
focusing on eliminating the bottlenecks that slow down the JS version of deep 
clone instead of putting the effort into creating a native version of that 
particular function.

The following is just speculation... One possible such bottleneck might be whole-
object allocation. A JS clone function probably would have to allocate an empty 
object and then dynamically populate it by adding properties one at a time.  A 
native implementation is more likely to have the ability to examine a complete 
object and create, in a single primitive operation, a new object with all of 
the same properties as the original object.  In other words, a native 
implementation of deep clone is likely to use some sort of shallow clone 
operation that is not available to pure JS code.  This suggests that a better 
way to get faster deep cloning functions is to make a native shallow clone 
function available to JS code. 
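
(As a point of reference, the closest a pure ES5 program can get to such a
shallow clone is roughly the following, copying one descriptor at a time --
exactly the per-property cost in question; the helper name is made up:)

// Pure-ES5 shallow clone: copies own property descriptors one at a time.
function shallowClone(obj) {
  var clone = Object.create(Object.getPrototypeOf(obj));
  Object.getOwnPropertyNames(obj).forEach(function (name) {
    Object.defineProperty(clone, name,
      Object.getOwnPropertyDescriptor(obj, name));
  });
  return clone;
}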

Allen





___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Wes Garland
On 22 January 2012 16:05, Jake Verbaten rayn...@gmail.com wrote:

 The idea here is that methods do not belong in data structures (clone
 should be to efficiently clone data).


Method vs. Property is a false dichotomy in functional languages, IMO.  A
method is merely a property whose value is a function instead of some other
type.

-- 
Wesley W. Garland
Director, Product Development
PageMail, Inc.
+1 613 542 2787 x 102
___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: shortcuts for defining block-local private names, plays nicely with @foo syntax

2012-01-22 Thread Allen Wirfs-Brock

On Jan 21, 2012, at 12:07 PM, Brendan Eich wrote:

 This was already proposed. See the whole strawman, but in particular these 
 sections:
 
 http://wiki.ecmascript.org/doku.php?id=strawman:private_names#the_private_declaration
 http://wiki.ecmascript.org/doku.php?id=strawman:private_names#private_declaration_scoping
 http://wiki.ecmascript.org/doku.php?id=strawman:private_names#private_declarations_exist_in_a_separate_name_space_parallel_to_the_variable_binding_environment
 
 The last really was too much for some folks. It makes the meaning of an 
 identifier after . or before : in an object literal depend on a binding 
 declaration, possibly far above.
 
 We could revive this, but deferring it and simplifying led to
 
 http://wiki.ecmascript.org/doku.php?id=harmony:private_name_objects
 
 which is in ES6.
 

There have also been a number of discussion threads here about syntax for 
private name access.

I personally have come to the conclusion that

   obj.@foo

would be better than 

   obj[foo]

for accessing a property of obj that is keyed by the private name that is the 
value of the foo binding.

My impression is that a number of other participants in these discussions share 
this opinion.  There are various reasons for this preference, including 
pleasantness, experience from CoffeeScript and a desire (rationalized in 
http://wiki.ecmascript.org/doku.php?id=strawman:object_model_reformation ) to 
strongly distinguish routine structural property access from dynamically 
computed data key accesses.

The plan of record is that ES6 will support the creation of private named 
properties in object literals using syntax like this:

const foo = name.create();
let obj = {
   [foo]: 42
};

However, if @foo is going to be used for private named member accesses instead 
of [foo] then it also makes sense to use @ instead of [ ] in object literal 
property definitions. In that case, we should replace the above with:

const foo = name.create();
let obj = {
   @foo: 42
};

Note that this doesn't run into any of the scoping or multiple name space 
issues that were raised as objections to the original private name proposals 
linked above.  Also it doesn't preclude use of [ ] to access private names. You 
could still say either
   obj[foo] or obj.@foo to access the properties whose key is the value of foo

 I plan on proposing at the next TC39 meeting that we support .@ member 
accesses and that we replace the use of [expr] to define private named 
properties in object literals ( 
http://wiki.ecmascript.org/doku.php?id=harmony:object_literals#object_literal_computed_property_keys
 ) with @identifier to define such properties.

Regardless of whether this proposal flies we could consider supporting:

private foo,bar;

as a shorthand for:

//assume already done: import name as @names;  
const foo=name.create(), bar=name.create();

I think this would be a desirable addition, but I don't want it to be a make or 
break issue for the .@ proposal.

There are a couple of decisions that still need to be made for this proposal:

1) should .@ member access and @ object literal property definitions permit the 
property key to be any toString-able value and not just private name values?  
The current plan of record does not require a private name value in the 
analogous contexts.
I'm slightly inclined towards requiring private name values, but would be happy 
either way.

2)  elimination of arbitrary expression as direct keys in object literal 
property definitions:

The current computed property keys proposals allows things like:

for (var n=0;n<10;) {
   a.push( {
  [prop+n]: n++
   });
}

Do we really need to support this sort of computed property name definition?  
If so, we could probably allow something such as:

for (var n=0;n<10;) {
   a.push( {
 @(prop+n): n++
   });
}

I'm inclined not to support such arbitrary expressions in property 
definitions, particularly if 1) above is decided as no.  Then this could be 
expressed as

for (var n=0;n<10;) {
   let k = prop+n;
   a.push( {
 @k: n++
   });
}

3) should @foo as a primary expression be interpreted as this.@foo

I think it should, but note that this means that 

const foo = name.create();
let obj = {
   @foo: @foo
};

would mean the same as:

const foo = name.create();
let obj = {
   @foo: this.@foo  //key and value probably different values
};

rather than:

const foo = name.create();
let obj = {
   @foo: foo  //key and value are the same value
};

This might be a source of confusion for some JS programmers.

Thoughts?

Allen___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Rick Waldron
On Sun, Jan 22, 2012 at 8:13 PM, Wes Garland w...@page.ca wrote:

 On 22 January 2012 16:05, Jake Verbaten rayn...@gmail.com wrote:

 The idea here is that methods do not belong in data structures (clone
 should be to efficiently clone data).


 Method vs. Property is a false dichotomy in functional languages, IMO.  A
 method is merely a property whose value is a function instead of some other
 type.


Right, as it is defined in 4.3.27 (see: http://es5.github.com/#x4.3.27) and
they shouldn't be lost as a result of a copy/cloning process.





 --
 Wesley W. Garland
 Director, Product Development
 PageMail, Inc.
 +1 613 542 2787 x 102

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Shouldn't timers be specified?

2012-01-22 Thread Brendan Eich

Rick Waldron mailto:waldron.r...@gmail.com
January 22, 2012 12:50 PM
This is a perfect use case for the forth-coming module system (similar 
to the way Globalization is being developed). Dave Herman and I had a 
brief over Twitter exchange that began with my desire for a 
migration of parseInt and parseFloat to Math,


or Number -- IIRC, Crock proposed making better-behaved parse methods 
live there.


which I followed with a suggestion to do the same with setTimeout and 
setInterval (despite those currently existing in the realm of DOM 
APIs) to the imaginary Timer object.


Right :-P.

/be

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: shortcuts for defining block-local private names, plays nicely with @foo syntax

2012-01-22 Thread Tab Atkins Jr.
On Sun, Jan 22, 2012 at 4:31 PM, Allen Wirfs-Brock
al...@wirfs-brock.com wrote:
 I personally have come to the conclusions that

    obj.@foo

 would be a better than

    obj[foo]

 for accessing a property of obj that is keyed by the private name that is
 the value of the foo binding.

 My impression, is that a number of other participants in these discussion
 share this opinion.  These are various reasons for  this preference,
 including pleasantness, experience from CoffeeScript and a desire
 (rationalize
 in http://wiki.ecmascript.org/doku.php?id=strawman:object_model_reformation )
 to strongly distinguish routine structural property access from dynamically
 computed data key accesses.

I have a question that may be an objection.  In my blog post
http://www.xanthir.com/blog/b4FJ0, I demonstrate several example
usages of Names.  Most of them involve using a Name that's hung off of
another object.  Is there a way to support this pattern without a
temporary variable?

For example, from my post:

myCoolObject.prototype[Iterator.getIterator] = function(){...}

Using @ for access, would
myCoolObject.prototype.@Iterator.getIterator = function(){...} work,
or would that attempt to retrieve a property using Iterator as a
Name, then retrieve the getIterator property of that?

If the latter, this is rather inconvenient for what I expect will be
common patterns.

 The plan of record is that ES6 will support the creation of private named
 properties in object literals using syntax like this:

 const foo = name.create();
 let obj = {
    [foo]: 42
 };

 However, if @foo is going to be used for private named member accesses
 instead of [foo] then it also makes sense to use @ instead of [ ] in object
 literal property definitions. In that case, we should replace the above
 with:

 const foo = name.create();
 let obj = {
    @foo: 42
 };

 Note that this doesn't run into any of the scoping or multiple name space
 issues that were raised as objections to the original private name proposals
 liked above.  Also it doesn't preclude use of [ ] to access private names.
 You could still say either
    obj[foo] or obj.@foo to access the properties whose key is the value of
 foo

Regardless of the answer above, this seems to allow the actual usage
in my post, which is nice.  But it appears that it will be impossible
to use the pattern I cite in an object literal without assigning it to
a temporary variable.  Is this correct?


 1) should .@ member access and @ object literal property definitions permit
 he property key to be any toString-able value and not just private name
 values?  The current plan of record does not require a private name value in
 the analogous contexts.
 I'm slightly inclined towards requiring private name values, but would be
 happy either way.

I don't understand what the use of a toString-able value would be in
the context of a private variable, since you can only store private
things with Names.


 2)  elimination of arbitrary expression as direct keys in object literal
 property definitions:

 The current computed property keys proposals allows things like:

 for (var n=0;n<10;) {
    a.push( {
       [prop+n]: n++
    });
 }

 Do we really need to support this sort of computed property name definition?
  If so, we could probably allow something such as:

 for (var n=0;n<10;) {
    a.push( {
      @(prop+n): n++
    });
 }

 I'm include to not supporting the such arbitrary expressions in such
 property definitions, particularly if 1) above is decided as no.  Then this
 could be expressed as

 for (var n=0;n<10;) {
    let k = prop+n;
    a.push( {
      @k: n++
    });
 }

This seems related to my concern above, except that this example uses
string-valued variables rather than Names.


 3) should @foo as a primary expression be interpreted as this.@foo

 I think it should, but note that this means that

 const foo = name.create();
 let obj = {
    @foo: @foo
 };

 would mean the same as:

 const foo = name.create();
 let obj = {
   @foo: this.@foo  // key and value probably different values
 };

 rather than:

 const foo = name.create();
 let obj = {
    @foo: foo  //key and value are the same value
 };

 This might be a source of confusion for some JS programmers.

I suspect that either would be confusing.  I agree with you that it's
better for @foo to be interpreted as this.@foo.

~TJ


Re: Shouldn't timers be specified?

2012-01-22 Thread Rick Waldron

 Node.js does *not* conform. Not at all. Not only does it not clamp to 4ms
 (which happens to be a good thing, IMO), but its timers often fire out of
 order!


Is there a reference or test case you can cite for this?  Thanks!

Rick


Re: Shouldn't timers be specified?

2012-01-22 Thread Mikeal Rogers

On Jan 22, 2012, at 1:35 PM, Jorge wrote:
 ... Not at all. Not only does it not clamp to 4ms (which happens to be a good 
 thing, IMO), but its timers often fire out of order!

node.js does not conform to the 4ms clamp because that would be silly. It does 
not fire timers out of order, that I know of. If you have a case where that is 
not true then it's a bug in libuv (setTimeout's event system is in libuv now) 
that we need to have fixed.

-Mikeal



Re: shortcuts for defining block-local private names, plays nicely with @foo syntax

2012-01-22 Thread Brendan Eich

Allen Wirfs-Brock <al...@wirfs-brock.com>
January 22, 2012 4:31 PM

I personally have come to the conclusion that

   obj.@foo

would be a better than

   obj[foo]

for accessing a property of obj that is keyed by the private name that 
is the value of the foo binding.


My impression is that a number of other participants in these 
discussions share this opinion.  There are various reasons for this 
preference, including pleasantness, experience from CoffeeScript, and a 
desire (rationalized in 
http://wiki.ecmascript.org/doku.php?id=strawman:object_model_reformation ) 
to strongly distinguish routine structural property access from 
dynamically computed data key accesses.


If we require only private name objects on the right of @, then there's 
another benefit: no misspelled references to a public name where a private 
one is required. @ means private, not computed.


Regardless of whether this proposal flies we could consider supporting:

private foo,bar;

as a short hand for:

//assume already done: import name as @names;
const foo=name.create(), bar=name.create();

I think this would be a desirable addition, but I don't want it to be 
a make or break issue for the .@ proposal.


I'm game. Without it the Name.create() overhead is onerous.


There are a couple of decisions that still need to be made for this proposal:

1) should .@ member access and @ object literal property definitions 
permit the property key to be any toString-able value and not just 
private name values?  The current plan of record does not require a 
private name value in the analogous contexts.
I'm slightly inclined towards requiring private name values, but would 
be happy either way.


As noted above, I'm inclined toward requiring private name objects on 
the right of @.


2)  elimination of arbitrary expression as direct keys in object 
literal property definitions:


The current computed property keys proposal allows things like:

for (var n=0;n<10;) {
   a.push( {
  [prop+n]: n++
   });
}

Do we really need to support this sort of computed property name 
definition?


Not obviously at this point. We might want [] and @ but we can certainly 
defer [] if we do include @ for private names.



3) should @foo as a primary expression be interpreted as this.@foo

I think it should, but note that this means that

const foo = name.create();
let obj = {
   @foo: @foo
};

would mean the same as:

const foo = name.create();
let obj = {
   @foo: this.@foo  // key and value probably different values
};

rather than:

const foo = name.create();
let obj = {
   @foo: foo  //key and value are the same value
};

This might be a source of confusion for some JS programmers.


It's not different from let obj = {foo: foo} which uses foo two 
different ways. We agreed on the shorthand from object destructuring 
being necessary (due to the cover grammar technique we are using -- 
Supplemental Syntax) and sometimes desirable, e.g.


  function Point(x, y) {
return {x, y};
  }

So I do not think @foo meaning something different in a property name 
context in an object literal from what it means in an expression is 
either new or necessarily confusing. In both the @ and @-free cases, the 
property name means something different from the expression form.
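
For comparison, the plain ES5 spelling of that Point example writes out both
sides of each property; the shorthand only removes the repetition:

function Point(x, y) {
  return { x: x, y: y };
}
console.log(Point(1, 2)); // { x: 1, y: 2 }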


/be


Question about the “full Unicode in strings” strawman

2012-01-22 Thread Mathias Bynens
http://wiki.ecmascript.org/doku.php?id=strawman:support_full_unicode_in_strings#unicode_escape_sequences
states:

 To address this issue, a new form of UnicodeEscapeSequence is added that is 
 explicitly tagged as containing a variable number (up to 8) of hex digits. 
 The new definition is:

 UnicodeEscapeSequence ::
     u HexDigit HexDigit HexDigit HexDigit
     u{ HexDigit HexDigit_opt HexDigit_opt HexDigit_opt HexDigit_opt HexDigit_opt HexDigit_opt HexDigit_opt }

 The \u{ } extended UnicodeEscapeSequence is a syntactic extension that is 
 only recognized after explicit versioning opt-in to the extended “Harmony” 
 syntax.

Why up to 8 hex digits? Shouldn’t 6 hex digits suffice to represent
every possible Unicode character (in the range from 0x0 to 0x10FFFF)?

Is this a typo or was this done intentionally to be future-compatible
with potential Unicode additions?
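
For what it's worth, a quick sketch of the status quo and of why six digits
are enough:

// Today an astral code point needs a surrogate pair; U+1D306 becomes:
var s = '\uD834\uDF06';
console.log(s.length);                // 2 code units for 1 code point
// The highest code point, U+10FFFF, fits in six hex digits:
console.log((0x10FFFF).toString(16)); // '10ffff'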


Re: Shouldn't timers be specified?

2012-01-22 Thread Andrea Giammarchi
var d = new Date, i = setInterval(function () {
  console.log(new Date - d);
  d = new Date;
}, 1);

most likely gonna fire a sequence of

10
0
11
0
12
0
10
0
11
0
10
0

... not really reliable; even if the delay is specified as 10 or 20, it does
not look that consistent

with setTimeout, I have tried delay 3 and it's never less than 10 or 11,
but with delay 1 it is almost always 0:
var d = new Date, delay = 3, i = setTimeout(function t() {
  console.log(new Date - d);
  d = new Date;
  i = setTimeout(t, delay);
}, delay);
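
A small ordering check (just a sketch, not a spec test): schedule a few
timeouts with increasing delays and see whether they run in the order they
were scheduled:

var order = [];
for (var i = 1; i <= 5; i++) {
  (function (n) {
    setTimeout(function () {
      order.push(n);
      if (order.length === 5) {
        console.log(order.join(',')); // in-order firing would print 1,2,3,4,5
      }
    }, n);
  })(i);
}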

br

On Mon, Jan 23, 2012 at 4:43 AM, Rick Waldron waldron.r...@gmail.com wrote:

 Node.js does *not* conform. Not at all. Not only does it not clamp to 4ms
 (which happens to be a good thing, IMO), but its timers often fire out of
 order!


 Is there a reference or test case you can cite for this?  Thanks!

 Rick





Re: shortcuts for defining block-local private names, plays nicely with @foo syntax

2012-01-22 Thread Brendan Eich

Tab Atkins Jr. <jackalm...@gmail.com>
January 22, 2012 7:36 PM
myCoolObject.prototype[Iterator.getIterator] = function(){...}
Using @ for access, would 
myCoolObject.prototype.@Iterator.getIterator = function(){...} work, 
or would that attempt to retrieve a property using Iterator as a 
Name, then retrieve the getIterator property of that?


To quote D. Duck, pronoun trouble. By your final "that", you mean the 
iterator Name instance? If so, no way -- that doesn't make any sense. A 
dot operator in JS accesses a property value, not a key.


So rest assured: the former.

/be


Re: shortcuts for defining block-local private names, plays nicely with @foo syntax

2012-01-22 Thread Tab Atkins Jr.
On Sun, Jan 22, 2012 at 11:25 PM, Brendan Eich bren...@mozilla.org wrote:
 Tab Atkins Jr. <jackalm...@gmail.com>
 January 22, 2012 7:36 PM

 myCoolObject.prototype[Iterator.getIterator] = function(){...}
 Using @ for access, would myCoolObject.prototype.@Iterator.getIterator =
 function(){...} work, or would that attempt to retrieve a property using
 Iterator as a Name, then retrieve the getIterator property of that?


 To quote D. Duck, pronoun trouble. By your final "that", you mean the
 iterator Name instance? If so, no way -- that doesn't make any sense. A dot
 operator in JS accesses a property value, not a key.

 So rest assured: the former.

Allow me to be clearer.

Given foo.bar = new Name();, is baz.@foo.bar equivalent to
baz[foo.bar] or baz[foo].bar?  Normal property-access semantics
would give the latter.  If so, then we need to preserve the [] form
for use with private names in both the baz[foo.bar] form and the
{[foo.bar]: true} form, unless we find it acceptable for authors to
be forced to use a local variable to store the Name every time.
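
A plain-ES5 illustration of why the two readings differ, with a string
standing in for the Name (hypothetical keys, just to show the two access
paths diverge):

var foo = { bar: 'someUniqueKey' };             // foo.bar stands in for the Name
var baz = {};
baz[foo.bar] = 'stored under foo.bar';          // the pattern I want
baz[foo] = { bar: 'stored under String(foo)' }; // what baz[foo].bar would hit
console.log(baz[foo.bar]); // 'stored under foo.bar'
console.log(baz[foo].bar); // 'stored under String(foo)' -- a different property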

~TJ


Re: Shouldn't timers be specified?

2012-01-22 Thread Jorge
On 23/01/2012, at 04:57, Mikeal Rogers wrote:
 On Jan 22, 2012, at 1:35 PM, Jorge wrote:
 ... Not at all. Not only does it not clamp to 4ms (which happens to be a good 
 thing, IMO), but its timers often fire out of order!
 
 node.js does not conform to the 4ms clamp because that would be silly.

Exactly http://groups.google.com/group/nodejs-dev/msg/788492357732e93e

 It does not fire timers out of order, that I know of.

http://groups.google.com/group/nodejs-dev/browse_thread/thread/922a30cf88a1b784

 If you have a case where that is not true then it's a bug in libuv 
 (setTimeout's event system is in libuv now) that we need to have fixed.

The test that's been disabled:

https://github.com/joyent/node/blob/master/test/simple/test-next-tick-ordering.js#L50-54
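
Roughly the kind of ordering it exercises -- a minimal node.js sketch in the
same spirit, not the disabled test itself: process.nextTick callbacks should
run before any previously scheduled timer:

var seen = [];
setTimeout(function () { seen.push('timeout'); }, 0);
process.nextTick(function () { seen.push('nextTick'); });
setTimeout(function () {
  console.log(seen); // expected: [ 'nextTick', 'timeout' ]
}, 10);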
-- 
Jorge.