Re: [webcomponents]: Of weird script elements and Benadryl

2013-04-16 Thread Allen Wirfs-Brock

On Apr 16, 2013, at 3:13 PM, Dimitri Glazkov wrote:

 On Tue, Apr 16, 2013 at 3:07 PM, Daniel Buchner dan...@mozilla.com wrote:
 One thing I've heard from many of our in-house developers, is that they
 prefer the imperative syntax, with one caveat: we provide an easy way to
 allow components import/require/rely-upon other components. This could
 obviously be done using ES6 Modules, but is there anything we can do to
 address that use case for the web of today?
 
 Yes, one key ability we lose here is the declarative quality -- with
 the declarative syntax, you don't have to run script in order to
 comprehend what custom elements could be used by a document.


My sense is that the issues of concern (at least on this thread) with 
declaratively defining custom elements are all related to how custom behavior (i.e., 
script stuff) is declaratively associated. I'm not aware of (though I'm also not very 
familiar with) similar issues relating to template and the other possible 
element subelements.  I also imagine that there is probably a set of use cases 
that don't actually need any custom behavior.

That suggests to me that a possible middle ground, for now, is to still have 
declarative custom element definitions but not provide any declarative 
mechanism for associating script with them.  Imperative code could presumably 
make that association if it needed to.

I've been primarily concerned about approaches that would be future-hostile 
toward the use of applicable ES features that are emerging.  I think we'll 
see those features in browsers within the next 12 months. Deferring just the 
script features of element would help with the timing and probably allow a 
better long-term solution to be designed.

Allen


Re: [webcomponents]: Of weird script elements and Benadryl

2013-04-16 Thread Allen Wirfs-Brock

On Apr 16, 2013, at 4:08 PM, Daniel Buchner wrote:

 Deferring just the script features of element would help with the timing 
 and probably allow a better long term solution to be designed.
 
 If the callbacks are not mutable or become inert after registration (as I 
 believe was the case), how would a developer do this -- Imperative code 
 could presumably make that association, if it needed to.

Here is what I suggested earlier on this thread for what to do if a 
constructor= attribute wasn't supplied, when we were talking about that 
scheme:

 1) Create a new anonymous constructor object that inherits from HTMLElement. 
 It wouldn't have any unique behavior, but it would be uniquely associated 
 with the particular element that defined it, and it might be useful for 
 doing instanceof tests.  It would be the constructor that you register with 
 the tag.

If that were done, it seems reasonable that the provided constructor object 
could be available as the value of an attribute of the HTMLElementElement that 
corresponds to the element.  So, imperative code could look up the 
HTMLElementElement based on its name property and retrieve the constructor 
object.  The constructor object would have a prototype whose value is the 
actual prototype object used for these custom element objects, and the 
imperative code could assign methods to it.  The script that assigns such 
methods would need to be placed so that it runs after the element is parsed but 
before any other imperative code that actually makes use of those methods.

Prototype objects are not normally immutable so there is no problem with 
delaying the installation of such methods even until after instances of the 
custom element have actually been created by the HTML parser.
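
To make that concrete, here is a minimal sketch of what such imperative 
association might look like.  It assumes the declarative design being discussed 
here: an <element> definition that exposes an HTMLElementElement with a property 
(called generatedConstructor below) holding the registered constructor.  All of 
the names are hypothetical; none of this is an existing API.

   // Hypothetical: look up the HTMLElementElement for the declaratively defined tag.
   var elementElement = document.querySelector('element[name="x-foo"]');
   var XFoo = elementElement.generatedConstructor;   // the constructor registered for x-foo

   // Install behavior after parsing, before any code that uses these methods runs.
   XFoo.prototype.doSomething = function () {
     this.textContent = "done";
   };

   document.createElement("x-foo") instanceof XFoo;  // true; useful for instanceof tests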

Allen






 



Re: [webcomponents]: Of weird script elements and Benadryl

2013-04-14 Thread Allen Wirfs-Brock

On Apr 14, 2013, at 10:49 AM, Scott Miles wrote:

   the challenge with creating a normal constructor
 
 Forgive me if my language is imprecise, but the basic notion is that in 
 general one cannot create a constructor that creates a DOM node because 
 (most? all?) browsers make under the hood mappings to internal code (C++ for 
 Blink and Webkit). For example, note that HTMLElement and descendents are not 
 callable from JS. 
 
 Erik Arvidsson came up with a strategy for overcoming this in Blink, but to 
 my recollection Boris Zbarsky said this was a non-starter in Gecko. 
 
 Because of this constraint Dimitri's current system involves supplying only a 
 prototype to the system, which hands you back a generated constructor.

I addressed this issue in a follow-up message.

For background on the problem and general solution see 
http://wiki.ecmascript.org/lib/exe/fetch.php?id=meetings%3Ameeting_jan_29_2013cache=cachemedia=meetings:subclassing_builtins.pdf
 

Also http://www.2ality.com/2013/03/subclassing-builtins-es6.html 

Allen




Re: [webcomponents]: Of weird script elements and Benadryl

2013-04-14 Thread Allen Wirfs-Brock

On Apr 14, 2013, at 11:40 AM, Scott Miles wrote:

  Here are four ways to avoid the subclassing problem for custom elements
  1)  Only allow instances of custom DOM elements to be instantiated using 
  document.createElement("x-foo"). 
 
 Wearing web developer hat, I never make elements any other way than 
 createElement (or HTML), so this would be standard operating procedure, so 
 that's all good if we can get buy in.

However, I believe that some people, such as Alex Russell, have been advocating 
that WebIDL should allow constructor-like interfaces to support expressions 
such as:
   new HTMLWhateverElement()

It would be future-hostile to make that impossible, but such support could 
reasonably wait for ES6.

 
  2, 3, 4
 
 I believe these have been suggested in one form or another, but as I mentioned, 
 were determined to be non-starters for Gecko. I don't think we've heard 
 anything from the IE team.

Well, #4 has been accepted for ES6 by all TC39 participants, including Mozilla 
and Microsoft, and is going to happen.  The basic scheme was actually suggested 
by a member of the SpiderMonkey team, so I'm sure we'll get it worked out for 
Gecko. 
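
For reference, this is roughly what subclassing a built-in looks like with ES6 
class syntax.  This is only a sketch of the general direction, not necessarily 
the exact mechanism referred to as #4 above, and Array stands in for a DOM 
element constructor purely for illustration:

   class ElementList extends Array {}

   var list = new ElementList();
   list.push("a", "b");
   list.length;              // 2
   list instanceof Array;    // true
   Array.isArray(list);      // true: subclass instances are genuine Arrays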

Allen




Re: [webcomponents]: Platonic form of custom elements declarative syntax

2013-04-11 Thread Allen Wirfs-Brock

On Apr 11, 2013, at 9:32 AM, Boris Zbarsky wrote:

 On 4/11/13 12:23 PM, Allen Wirfs-Brock wrote:
 So why don't you make register a static method on HTMLElement and then
 define the element semantics so it automatically does:
 MyElement.register()
 
 This would normally invoke the inherited static method
 
 I lost track of something here.  Why would it do that?  Does MyElement have 
 Element on its proto chain?  MyElement.prototype certainly has 
 Element.prototype on _its_ proto chain, but that's a separate concern from 
 what happens with the interface objects...  Is this something that ES6 
 classes define differently from current WebIDL and implementations, and if 
 so, do we need to align the two somehow?

Yes, ES6 has class-side inheritance.  The ES5 equivalent for:

   class Sub extends Super {
       constructor() { /* constructor body */ }
       method1() {}
       static method2() {}
   }

is:

   function Sub() { /* constructor body */ }
   Sub.__proto__ = Super;
   Sub.prototype = Object.create(Super.prototype);
   Sub.prototype.method1 = function method1() {};
   Sub.method2 = function method2() {};

A reference like Sub.foo first looks for an own property on Sub, then on Super, etc.
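
That is what makes the MyElement.register() idea above work.  A minimal sketch, 
using a stand-in base class rather than the real HTMLElement interface object:

   class HTMLElementBase {                       // hypothetical stand-in for HTMLElement
     static register() {
       console.log("registering " + this.name);  // 'this' is the subclass constructor
     }
   }
   class MyElement extends HTMLElementBase {}

   MyElement.register();   // "registering MyElement", found via class-side inheritance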

Allen





Re: [webcomponents]: Blocking custom elements on ES6, was: Platonic form of custom elements declarative syntax

2013-04-11 Thread Allen Wirfs-Brock

On Apr 11, 2013, at 10:59 AM, Dimitri Glazkov wrote:

 Hello, TC39 peeps! I am happy to have you and your expertise here.
 
 On Wed, Apr 10, 2013 at 11:14 PM, Allen Wirfs-Brock
 al...@wirfs-brock.com wrote:
 
 This can all be expressed, but less clearly and concisely, using ES3/5 
 syntax.  But since we are talking about a new HTML feature, I'd recommend 
 being the first major HTML feature to embrace ES6 class syntax.  The class 
 extensions in ES6 are quite stable and quite easy to implement.  I'm pretty 
 sure they will begin appearing in browsers sometime in the next 6 months. If 
 webcomponents takes a dependency upon them, it would probably further speed 
 up their implementation.
 
 We simply can't do this :-\ I see the advantages, but the drawbacks of
 tangled timelines and just plain not being able to polyfill custom
 elements are overwhelming. Right now, there are at least two thriving
 polyfills for custom elements
 (https://github.com/toolkitchen/CustomElements and
 https://github.com/mozilla/web-components), and they contribute
 greatly by both informing the spec development and evangelizing the
 concepts with web developers.
 
 To state it simply: we must support both ES3/5 and ES6 for custom elements.
 
 :DG
 

ES6 classes can be polyfilled:

   class Sub extends Super {
       constructor() { /* constructor body */ }
       method1() {}
       static method2() {}
   }

is:

   function Sub() { /* constructor body */ }
   Sub.__proto__ = Super;
   Sub.prototype = Object.create(Super.prototype);
   Sub.prototype.method1 = function method1() {};
   Sub.method2 = function method2() {};
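
A quick check that the ES5 form behaves the same way to callers (assuming Super 
is some already defined constructor):

   var s = new Sub();
   s instanceof Sub;                        // true
   s instanceof Super;                      // true, via Object.create(Super.prototype)
   s.method1();                             // found on Sub.prototype
   Sub.method2();                           // a "static", found directly on Sub
   Object.getPrototypeOf(Sub) === Super;    // true: class-side inheritance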

Allen 




Re: [webcomponents]: Platonic form of custom elements declarative syntax

2013-04-11 Thread Allen Wirfs-Brock

On Apr 11, 2013, at 12:04 PM, Boris Zbarsky wrote:

 On 4/11/13 12:55 PM, Boris Zbarsky wrote:
 On 4/11/13 12:50 PM, Allen Wirfs-Brock wrote:
 Yes ES6 has class-side inheritance
 
  OK.  Should we be doing that with WebIDL interface objects, perhaps?  It
  would certainly make sense to me to do that, as long as we don't run
  into web compat issues.
 
 I've filed https://bugzilla.mozilla.org/show_bug.cgi?id=860841 with a patch 
 to do this.  The specific behavior I'm implementing is that the prototype of 
 the interface object for interface X is the interface object of the nearest 
 ancestor of X that has one, and Function.prototype if there is no such 
 ancestor.
 
 So for example, with that patch Object.getPrototypeOf(HTMLElement) == 
 Element, and Object.getPrototypeOf(XMLHttpRequest) == EventTarget.
 
 Note that this doesn't quite match the proto chain, because 
 Object.getPrototypeOf(XMLHttpRequest.prototype) is the prototype object for 
 the XMLHttpRequestEventTarget interface, but that interface is 
 [NoInterfaceObject].
 
 -Boris
 

That sounds about right.  In ES6 you will still be able to wire up arbitrary 
[[Prototype]] chains on both the instance and constructor side using 
Object.create, __proto__ (now part of the standard), and possibly crazy things 
via Proxy.  But the most concise way to define class-like abstractions is going 
to be via a class declaration.  It seems quite desirable that the normal case 
for such abstractions specified via WebIDL is that they simply follow the ES6 
class pattern. Exceptions are fine for legacy or special circumstances. 

Allen






Re: IndexedDB: undefined parameters

2012-10-11 Thread Allen Wirfs-Brock

On Oct 10, 2012, at 10:57 PM, Jonas Sicking wrote:

 On Wed, Oct 10, 2012 at 7:15 PM, Brendan Eich bren...@mozilla.org wrote:
 Boris Zbarsky wrote:
 
 Should undefined, when provided for a dictionary entry, also be treated
 as not present?  That is, should passing a dictionary like so:
 
  { a: undefined }
 
 be equivalent to passing a dictionary that does not contain a at all?
 
 ES6 says no. That's a bridge too far. Parameter lists are not objects!
 
 I thought the idea was that for something like:
 
 function f({ a = 42 }) {
  console.log(a);
 }
 obj = {};
 f({ a: obj.prop });

According to the ES6 spec, this evaluates to exactly the same call as:

f({a: undefined});

According to ES6, all of the following will log 42:

f({});
f({a: undefined});
f({a: obj.prop});

 
 that that would log 42.
 
 What is the reason for making this different from:
 
 function f(a = 42) {
  console.log(a);
 }
 obj = {};
 f(obj.prop);

same as:
 f(undefined);
and
 f();

Again, all log 42 according to the ES6 rules.


Finally, note that in ES6 there is a way to distinguish between an absent 
parameter and an explicitly passed undefined and still destructure an arguments 
object:

function f(arg1) {
   if (arguments.length < 1) return f_0arg_overload();
   var [{a = 42} = {a: 42}] = arguments;  // if arg1 is undefined, destructure {a: 42};
                                          // otherwise destructure arg1, defaulting property a to 42
   ...
}
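
Some illustrative calls against that sketch (f_0arg_overload is a hypothetical 
helper standing in for whatever the zero-argument overload does):

   f();              // arguments.length is 0, so the zero-argument overload runs
   f(undefined);     // destructures the {a: 42} fallback, so a is 42
   f({});            // destructures the passed object; property a defaults to 42
   f({a: 7});        // a is 7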

 
 It seems to me that the same it'll do the right thing in all
 practical contexts argument applied equally to both cases?

It really seems counterproductive for WebIDL to try to have defaulting rules 
for options objects that are different from the ES6 destructuring rules for 
such objects. 

Allen




Re: What type should .findAll return

2011-11-14 Thread Allen Wirfs-Brock

On Nov 12, 2011, at 12:07 PM, Yehuda Katz wrote:

 
 Yehuda Katz
 (ph) 718.877.1325
 
 
 On Sat, Nov 12, 2011 at 11:51 AM, Allen Wirfs-Brock al...@wirfs-brock.com 
 wrote:
 
 On Nov 12, 2011, at 10:27 AM, Boris Zbarsky wrote:
 
  On 11/13/11 6:10 AM, Allen Wirfs-Brock wrote:
 
 
  I think you're drawing a distinction between language level semantics and 
  library routine behavior which is for practical purposes irrelevant to 
  everyone outside the ES5 committee...
 
  It's relevant to this discussion because you have to decide what web 
  developers actually mean when they say Array.  The starting point was that 
  you want NodeArray to be something more than just an Array.  So you have to 
  look at all aspects of Array behavior and decide which you care about.  The 
  language semantics vs. library behavior distinction is relevant because it is 
  often much easier for engine implementers to change or extend library behavior 
  than it is to extend language semantics.
 
 
  In practice, at the moment, if you want something that needs to act like an 
  array as far as a web developer is concerned, it's [[Class]] better be 
  Array.  In the future, as you note, that might change.
 
 The most important point is that [[Class]] is neither the only nor the most 
 important distinguishing characteristic of ECMAScript built-in Arrays.  If 
 you are just arguing about [[Class]] you are missing the point.
 
 I think it's worth noting that [[Class]] is actually used by jQuery and other 
 implementations to identify whether an object is a real Array. It may be 
 the case that we could revisit some of those cases, but the technique of 
 using [[Class]] to get a less buggy picture of what an object is (compared to 
 typeof etc.) is pretty common. We use it in SproutCore as well.
 
 The jQuery.type function:
 https://github.com/jquery/jquery/blob/master/src/core.js#L491-495
 
 The class2type map:
 https://github.com/jquery/jquery/blob/master/src/core.js#L877-879
 
 toString in that function is declared above as Object.prototype.toString.
 
 That said, of course other aspects of the observed behavior, such as its 
 exposed methods, matter as well.

Those functions are not using [[Class]].  They are using the standard 
built-in Object.prototype.toString method.  Now, it so happens that the 
specification of toString makes use of [[Class]], but that is simply an artifact 
of the ES5.1 specification.  It is not a language feature.  The technique that 
is used to specify toString can be changed without changing the actual behavior 
of the toString method.  All that is really required is that existing ES code 
that depends upon the ES5.1 toString behavior will continue to work without 
modification in future ES implementations that may use a different 
specification for toString.  However, it doesn't constrain future code that 
operates upon new kinds of objects that didn't exist in the ES5.1 specification.
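
A minimal sketch of the technique being described, to make the distinction 
concrete (the observable behavior comes from Object.prototype.toString, not from 
[[Class]] as something reachable in the language):

   var toString = Object.prototype.toString;

   function isRealArray(value) {
     return toString.call(value) === "[object Array]";
   }

   isRealArray([1, 2, 3]);                          // true
   isRealArray({0: "a", length: 1});                // false
   isRealArray(document.querySelectorAll("div"));   // false: a NodeList is not an Array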

BTW, when the ES5.1 spec talks about objects whose [[Class]] has a specific 
value, it means precisely such objects as actually specified in the ES5.1 spec. 
[[Class]] is not an implementation extension point.  In particular, ES5.1 
clause 8.6.2 says: 
  The value of the [[Class]] internal property of a host object may be 
any String value except one of "Arguments", "Array", ...

In other words, host object providers (such as a DOM implementation) are not 
allowed to define new kinds of objects whose [[Class]] is "Array".

It's fine to want to define a new kind of host object that is behaviorally very 
similar to (but slightly different from) instances of the built-in Array 
constructor.  But characterizing such objects by saying they have 
[[Class]] == "Array" is not meaningful from an ES5.1 specification perspective.

Allen

Re: What type should .findAll return

2011-11-14 Thread Allen Wirfs-Brock

On Nov 14, 2011, at 3:32 PM, Yehuda Katz wrote:

 Sorry,
 
 I was making a joke (referencing 1.5.2 of the HTML5 spec), not intending to 
 be confrontational.
 
 The underlying issue here is just making it possible for Array.isArray to 
 return true for an Array of DOM nodes that is also enhanced with extra 
 features. Jonas had specifically said that he wanted isArray to work. Rick 
 then pointed out that the spec seems to disallow host objects from claiming 
 that their [[Class]] is Array, and that isArray specifically requires that 
 [[Class]] be Array.

Ultimately you have to decide what it is you are asking for.  It's been stated that 
you (DOM API designers) want this kind of object to be a real ECMAScript Array. 
But you also want to deviate from some aspects of what it currently means to 
be a real ECMAScript Array.  A real ECMAScript Array has a specific 
[[Prototype]] value.  It also has specific behaviors for methods like concat 
and filter, and other specific distinguishing behavioral characteristics, all of 
which are defined in the ES5.1 spec.  If you change any of those for some 
object, it is something other than a real ECMAScript Array.

Array.isArray was introduced into ES5 to provide an API for testing whether or 
not an object actually was a real ECMAScript Array as defined by section 15.4 
of the ES5 spec.  If Array.isArray starts answering true for objects that 
aren't described by 15.4, then it is no longer useful for its intended purpose.  

The language in 8.6.2 limiting host object use of certain [[Class]] values is 
there to ensure that host objects can't define things that violate important 
invariants about the inner workings of ECMAScript.  

Nobody is saying that it isn't useful to define new objects (host or otherwise) 
that share some (but not all) of the characteristics of ECMAScript Arrays.  
However, such objects aren't just ECMAScript Arrays as defined by 15.4, so 
don't expect Array.isArray to work for them.  Perhaps other discriminators are 
needed, but we will all need to decide which specific subset of Array 
characteristics we want to discriminate. 

TC39 recognizes that ES needs better support for defining collections, 
including variants of Array.  This includes supporting better collections 
defined both in ES code and via host objects (in general, TC39 doesn't like 
designs that depend upon a host object being able to do something that can't be 
done using only ES code). We have features to support better collection 
definition in advanced stages of design for ES6.  Some of these features 
might be accelerated into implementations ahead of the completion of ES6.  
However, I'm not sure you would want to normatively specify a DOM feature that 
depended upon them. Maybe you could...

For right now, there are two ways you could quickly go that don't conflict with 
ES5.1 at all:

1) You can specify that .findAll returns a plain vanilla ECMAScript Array 
object.
2) You can define a new kind of host object that is a close approximation of a 
real ECMAScript Array object.  Such an object could indirectly inherit from 
Array.prototype, override some inherited methods (such as concat and filter), 
and define additional DOM-related methods.  However, its [[Class]] may not be 
"Array", and anything in the ES spec that requires [[Class]] === "Array" (such as 
Array.isArray) won't recognize it as anything special.
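
A small sketch of the observable consequences of option 2, using placeholder 
names for the hypothetical NodeArray-style object:

   var NodeArrayProto = Object.create(Array.prototype);   // the intermediate prototype
   var nodeArray = Object.create(NodeArrayProto);         // an instance of the new kind of object

   nodeArray instanceof Array;    // true: Array.prototype is on its prototype chain
   Array.isArray(nodeArray);      // false: it is not a real ECMAScript Array per 15.4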

We can work together on something between these two extremes, but the further 
away from them we go, the more we get into the land of ES6 features and the 
problem of how to advance them into current engines.

Allen


Re: What type should .findAll return

2011-11-12 Thread Allen Wirfs-Brock

On Nov 12, 2011, at 1:29 AM, Boris Zbarsky wrote:

 On 11/12/11 10:22 AM, Allen Wirfs-Brock wrote:
 Note that the only specialness of Array instances relates to what happens 
 when you add new array elements or dynamically change the value of the 
 length property.
 
 1)  In ES5 this is just not true; there are various parts of the spec that 
 check [[Class]].  Yes, I know we're working on getting rid of them, but we 
 haven't gotten to that future world yet.

[[Class]]-related distinctions are covered in the document I referenced: 
https://docs.google.com/document/d/1sSUtri6joyOOh23nVDfMbs1wDS7iDMDUFVVeHeRdSIw/edit?authkey=CI-FopgC
and are generally secondary issues related to various library routines.  For 
example, whether JSON outputs the properties of an object using [ ] or { } 
notation.  The only language-level semantic specialness of Array is related to 
the length invariant.
 
 2)  In implementations the above may or may not be true.

If it isn't, the implementations are out of conformance with the standard that 
applies to them.  That means they are buggy and should be fixed.

 
 So, if you want the objects to be an immutable, array-like object that 
 inherits from array.prototype through an intermediate prototype there really 
 is no problem.  A JS programmer could express this today in ES5:
 
  var DOMFindResultProto = Object.create(Array.prototype);  // make it inherit from Array.prototype
  DOMFindResultProto.someMethod = function someMethod() { /* ... */ };
  // other prototype methods
  // ...
  
  function FindResultFactory(nodes) {
     var obj = Object.create(DOMFindResultProto);
     for (var i = 0; i < nodes.length; ++i) obj[i] = nodes[i];
     obj.length = nodes.length;  // own length so inherited Array.prototype methods see the elements
     return Object.freeze(obj);
  }
 
 The result will not have the same performance characteristics as an actual 
 array in many of today's JITs, for what it's worth.  You can consider this a 
 bug in those JITs, of course.

It's an expected variance in optimization strategies that I don't think is 
particularly relevant to this discussion.  BTW, an equally valid statement 
would be: the result will have the same performance characteristics as an 
actual array in many of today's JITs that optimize all integer-indexed 
properties, regardless of whether or not an object is an actual Array instance.

Allen




Re: What type should .findAll return

2011-11-12 Thread Allen Wirfs-Brock

On Nov 12, 2011, at 10:27 AM, Boris Zbarsky wrote:

 On 11/13/11 6:10 AM, Allen Wirfs-Brock wrote:
 
 
 I think you're drawing a distinction between language level semantics and 
 library routine behavior which is for practical purposes irrelevant to 
 everyone outside the ES5 committee...

It's relevant to this discussion because you have to decide what web 
developers actually mean when they say Array.  The starting point was that 
you want NodeArray to be something more than just an Array.  So you have to 
look at all aspects of Array behavior and decide which you care about.  The 
language semantics vs. library behavior distinction is relevant because it is 
often much easier for engine implementers to change or extend library behavior 
than it is to extend language semantics.

 
 In practice, at the moment, if you want something that needs to act like an 
 array as far as a web developer is concerned, it's [[Class]] better be 
 Array.  In the future, as you note, that might change.

The most important point is that [[Class]] is neither the only nor the most 
important distinguishing characteristic of ECMAScript built-in Arrays.  If you 
are just arguing about [[Class]] you are missing the point.
 
 If it isn't the implementation are out of conformance with the standard
 that applies to them. That means they are buggy and should be fixed.
 
 The standard does not specify many aspects of implementation behavior, 
 including but not limited to performance characteristics.

Such as?  While there are still non-performance aspects of implementations 
that are not yet fully specified, we are working hard to eliminate (or at least 
minimize) them.  WRT Array, other than performance (including space efficiency) 
and some aspects of the sort function, what do you think isn't fully specified 
in ES5.1?  I want to know so I can fix that in ES6.

 However, to language _consumers_ (e.g. web developers) those 
 not-specified-in-the-standard aspects are still part of what it means to be 
 an array.

What are they? TC39 really doesn't want to have such aspects.

 
  It seems to me that there is a serious disconnect here between the way people 
  are thinking about the standard for arrays and the simple "it needs to act 
  just like an array in all observable ways" request from web developers.

"All observable ways" means no methods that aren't already on Array.prototype.  
I might think it would be fine for .findAll to just return an actual Array 
instance.  But others seem to want to augment that behavior, so "all observable 
ways" does not seem to apply.

 
 For purposes of the ES spec, all that matters is the precise specification of 
 arrays.  For purposes of web developers and web specs trying to return 
 array-like objects, these things the standard doesn't care about matter.

I have to say that I think you are totally mischaracterizing the ES spec and 
the position of TC39.  I don't understand why.
 
 
 It's an expected variance on optimization strategies that I don't think
 is particularly relevent to this discussion.
 
 See above.  It's 100% relevant to the public-webapps aspects of this 
 discussion.

Still not clear: are you saying that all implementations are expected to apply 
the same optimizations?  That clearly isn't the case today.

 
 BTW, an equally valid
 statement would be: the result will have the same performance
 characteristics as an actual array in many of todays JITs that optimize
 all integer-indexed properties, regardless of whether or not an object
 is an actual Array instance.
 
 Sure.  So? 

You brought up this implementation-specific performance point for some reason.  
I'm just pointing out that your argument goes both ways.  Personally, it sounds 
to me like design by premature optimization.

Allen


Re: What type should .findAll return

2011-11-11 Thread Allen Wirfs-Brock

On Nov 11, 2011, at 7:46 AM, Tab Atkins Jr. wrote:

 On Fri, Nov 11, 2011 at 1:05 AM, Jonas Sicking jo...@sicking.cc wrote:
 And to ensure that the object acts as much as possible as an array it
 should also have it's [[Class]] set to that of an array. This has
 subtle effects on a number of functions. For example it affects what
 Object.toString() and Array.isArray returns, it affects how
 Array.concat behaves, and it affects the JSON serializer.
 
 Could you point me to an explanation of what [[Class]] represents in
 ecmascript?  It's a little hard to search for.
 

This turns out to not be such a simple question; see 
https://docs.google.com/document/d/1sSUtri6joyOOh23nVDfMbs1wDS7iDMDUFVVeHeRdSIw/edit?authkey=CI-FopgC

[[Class]] has been (mis-?)used for many things.  This is why we want to 
clean it up for the future.

Allen 



Re: What type should .findAll return

2011-11-11 Thread Allen Wirfs-Brock
Note that the only specialness of Array instances relates to what happens when 
you add new array elements or dynamically change the value of the length 
property.

If the array instance is immutable you can't do any of those things, so its 
specialness essentially disappears.

So, if you want the objects to be immutable, array-like objects that inherit 
from Array.prototype through an intermediate prototype, there really is no 
problem.  A JS programmer could express this today in ES5:

var DOMFindResultProto = Object.create(Array.prototype);  // make it inherit from Array.prototype
DOMFindResultProto.someMethod = function someMethod() { /* ... */ };
// other prototype methods
// ...

function FindResultFactory(nodes) {
   var obj = Object.create(DOMFindResultProto);
   for (var i = 0; i < nodes.length; ++i) obj[i] = nodes[i];
   obj.length = nodes.length;  // own length so inherited Array.prototype methods see the elements
   return Object.freeze(obj);
}
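
Hypothetical use of the factory above (illustrative only):

   var divs = FindResultFactory(document.querySelectorAll("div"));
   divs instanceof Array;       // true, via DOMFindResultProto
   Array.isArray(divs);         // false: it is not a real ECMAScript Array
   divs[0] = null;              // ignored (throws in strict mode): the object is frozen
   var ids = divs.map(function (d) { return d.id; });   // inherited Array.prototype.map works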


However, if you want the object to be mutable and to act like a real array, 
then it has to have the array specialness.  The specialness comes not from the 
[[Class]] property but from Array's alternative definition of 
[[DefineOwnProperty]] (see ES5.1 spec 15.4.5.1).

In ES.next a JS programmer will be able to easily define such an object.  But 
for ES5 it takes special implementation-level intervention.  Since this 
capability is ultimately going to be in ES.next, I don't see why you couldn't 
do it now, assuming the engine implementers are all willing to cooperate.

Basically, you would specify that the [[Prototype]] of the instances inherits 
from Array.prototype and that the instances use the [[DefineOwnProperty]] 
specification from ES5 section 15.4.5.1.

In either case, you would be specifying a new kind of ES native object rather 
than a 'host object'.

BTW, I think that either the immutable or mutable approach would work.  
However, since the collection is not live, I don't see why you would really 
care whether or not a client mutated it.  If they want to process it by 
deleting elements after they are examined, so what?

Allen






On Nov 11, 2011, at 12:20 PM, Boris Zbarsky wrote:

 On 11/11/11 10:05 PM, Jonas Sicking wrote:
 In other words, the returned object is exactly what you'd get if you did:
 
 a = new Array;
 a.__proto__ = [some type].prototype;
 [some type].prototype.__proto__ = Array.prototype;
 
 For what it's worth, at least some JITs deoptimize |a| if you do that. We'd 
 probably need to do something to make sure that _doesn't_ happen in this 
 case, right?
 
 -Boris
 




Re: What type should .findAll return

2011-11-11 Thread Allen Wirfs-Brock

On Nov 11, 2011, at 2:16 PM, Jonas Sicking wrote:

 On Fri, Nov 11, 2011 at 1:22 PM, Allen Wirfs-Brock
 al...@wirfs-brock.com wrote:
 
 BTW, I think that either the immutable or mutable approach would work.  
 However, since the collection is not live I don't see why you would really 
 care whether or not a client mutated it.  If they want to process it by 
 deleting elements after they are examined, so what?
 
 Exactly, this is why I'm proposing that it should be mutable.
 
 This does still leave the problem of making Array.filter(myNodeArray,
 function(el) { ... }) work though. I.e. I think we'd like it to
 return a NodeArray rather than a plain Array.

This is a problem for ES<=5.  Filter and all the other similar Array.prototype 
functions are specified to produce an object created as if by calling 
new Array().

I have a scheme that we can probably get in place for ES.next that would allow 
filter and friends to produce NodeArrays for you, but I don't see how that 
helps right now.

 
 More importantly, we want myNodeArray.filter(function(el){ ... }) to
 return a NodeArray. This would be doable by putting a special version
 of filter on NodeArray.prototype which would shadow
 Array.prototype.filter.


It isn't just filter that creates new instances that you would probably want to 
be NodeArrays. There are also at least (I might have missed others when I just 
checked): concat, slice, splice, and map.

Overriding them explicitly for NodeArray would be an immediate fix, but 
wouldn't solve the problem for future additions to Array.prototype.  However, 
if you assume that ES.next is going to provide the needed extension mechanism, 
then you should also assume that it will be used for any new Array.prototype 
methods, and they should pretty much just work.
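
A sketch of what the explicit override approach might look like, with NodeArray 
as a hypothetical type and fromNodes as an assumed helper that builds a NodeArray 
from an array of nodes (neither exists today):

   NodeArray.prototype.filter = function (callback, thisArg) {
     // Run the generic Array algorithm, then rewrap the plain Array result.
     var plain = Array.prototype.filter.call(this, callback, thisArg);
     return NodeArray.fromNodes(plain);
   };

   // myNodeArray.filter(fn) now yields a NodeArray; the Array.prototype version
   // still yields a plain Array, which is the limitation discussed below.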

 This would make myNodeArray.filter work, but
 not Array.filter.

That's an inherent problem with this approach. But if your NodeArray supplies 
correctly working overrides, there is probably little reason somebody would try 
to use the Array.prototype versions with NodeArrays.

I don't see a way around this short of modifying the specification of the 
Array.prototype methods.  That seems like a job for ES.next rather than a DOM 
spec.

 
 I'm happy to start a separate thread on es-discuss about this, but I'm
 worried that it'll fragment the current thread.

In theory, public-script-coord exists for exactly this sort of discussion, and 
the es-discuss people who care should be subscribed. Rather than starting a new 
thread, perhaps you should just post to es-discuss a pointer to this thread.

Allen




Re: What type should .findAll return

2011-11-11 Thread Allen Wirfs-Brock

On Nov 11, 2011, at 3:57 PM, Jonas Sicking wrote:

 On Fri, Nov 11, 2011 at 3:07 PM, Allen Wirfs-Brock
 al...@wirfs-brock.com wrote:
 
 ...
 
  This is a problem for ES<=5.  Filter and all the other similar 
 Array.prototype functions are specified to produce an object created as if 
 by calling: new Array();
 
 I have a scheme that we can probably get in place for ES.next that would 
 allow filter and friends to produce NodeArray's for you, but I don't see how 
 that helps right now.
 
 Well, if we can get implementations to implement this new scheme for
 the existing filter-like functions at the same time as they implement
 .findAll, then we should be golden.

The scheme depends upon other ES.next features, including private names. It 
isn't clear, if you start pulling that thread, how far it extends. But 
perhaps it might fly...


 ...
 
 This would make myNodeArray.filter work, but
 not Array.filter.
 
 An inherent problem with this approach. But if your NodeArrays supplies 
 correctly working over-rides there is probably little reason somebody would 
 try to use the Array.prototype versions  with NodeArrays.
 
 Note that I was saying Array.filter and not Array.prototype.filter. My
 assumption was that if people call Array.prototype with an Array as
 the first argument, they would also do so with a NodeArray as first
 argument.

Array.filter and friends (in contrast to Array.prototype.filter) are not in ES5 
and, so far, have not made the ES.next cut.  Personally, I'm not a big fan of 
them, for exactly this reason.  If you are building object-oriented class 
hierarchies you want to use methods, not global functions that can't easily be 
virtually dispatched.  

Allen

Re: Global variables and id lookup for elements

2011-07-20 Thread Allen Wirfs-Brock
To follow up, this issue is https://bugs.ecmascript.org/show_bug.cgi?id=78 

On Jul 19, 2011, at 7:44 PM, Allen Wirfs-Brock wrote:

 
 On Jul 19, 2011, at 6:47 PM, Boris Zbarsky wrote:
 
 On 7/19/11 7:43 PM, Ian Hickson wrote:
 On Thu, 28 Apr 2011, Magnus Kristiansen wrote:
 
 Context: http://krijnhoetmer.nl/irc-logs/whatwg/20110428#l-707
 
  Current browsers disagree about how to handle <div
  id=x></div><script>var x;</script>. WebKit browsers leave x pointing to
 the div, whereas IE, Firefox and Opera make x undefined [1]. (There is
 content that depends on x being undefined, but I don't have any links
 handy right now.)
 
 My reading of the relevant specs (es5 section 10, WebIDL 4.5.3, HTML
 6.2.4) supports the Webkit behavior
 
 Is this still something I should do, or did this get resolved using
 another solution?
 
 Unclear.  For one thing, the ES5 spec on this has changed, and there is no 
 public draft with the errata yet; they're only present in the form of 
 e-mails.  That makes it hard for me to say at this point whether the above 
 claim is even true.
 
 That said, even if we ignore the behavior of var, that leaves open questions 
 about what happens on assignment, etc.  I do think Cameron has done a bunch 
 of testing of this stuff recently, and there was a data table that 
 summarized the results somewhere.  You should probably just talk to him 
 about it.
 
  There were two bugs in this regard in the original ES5 spec, one related to 
  function declarations and the other related to var declarations.  Functions 
  were partially fixed in the ES5.1 specification, but that fix still had a 
  problem in that it did a GetProperty rather than a GetOwnProperty when checking 
  to see whether a function has been declared on the global object.  None of the 
  corresponding fixes for vars made it into the 5.1 spec. Both 5.1 corrections 
  were described by me on the es5-discuss mailing list this past January.  
  Those fixes will be included in the first errata for ES5.1.
 
 The relevant message with the algorithm correction is 
 https://mail.mozilla.org/pipermail/es5-discuss/2011-January/003882.html 
 
 Allen



Re: Global variables and id lookup for elements

2011-07-19 Thread Allen Wirfs-Brock

On Jul 19, 2011, at 6:47 PM, Boris Zbarsky wrote:

 On 7/19/11 7:43 PM, Ian Hickson wrote:
 On Thu, 28 Apr 2011, Magnus Kristiansen wrote:
 
 Context: http://krijnhoetmer.nl/irc-logs/whatwg/20110428#l-707
 
  Current browsers disagree about how to handle <div
  id=x></div><script>var x;</script>. WebKit browsers leave x pointing to
 the div, whereas IE, Firefox and Opera make x undefined [1]. (There is
 content that depends on x being undefined, but I don't have any links
 handy right now.)
 
 My reading of the relevant specs (es5 section 10, WebIDL 4.5.3, HTML
 6.2.4) supports the Webkit behavior
 
 Is this still something I should do, or did this get resolved using
 another solution?
 
 Unclear.  For one thing, the ES5 spec on this has changed, and there is no 
 public draft with the errata yet; they're only present in the form of 
 e-mails.  That makes it hard for me to say at this point whether the above 
 claim is even true.
 
 That said, even if we ignore the behavior of var, that leaves open questions 
 about what happens on assignment, etc.  I do think Cameron has done a bunch 
 of testing of this stuff recently, and there was a data table that summarized 
 the results somewhere.  You should probably just talk to him about it.

There were two bugs in this regard in the original ES5 spec, one related to 
function declarations and the other related to var declarations.  Functions 
were partially fixed in the ES5.1 specification, but that fix still had a 
problem in that it did a GetProperty rather than a GetOwnProperty when checking 
to see whether a function has been declared on the global object.  None of the 
corresponding fixes for vars made it into the 5.1 spec. Both 5.1 corrections were 
described by me on the es5-discuss mailing list this past January.  Those fixes 
will be included in the first errata for ES5.1.

The relevant message with the algorithm correction is 
https://mail.mozilla.org/pipermail/es5-discuss/2011-January/003882.html 

Allen

Re: [WebIDL] Exceptions

2011-07-06 Thread Allen Wirfs-Brock

On Jul 6, 2011, at 5:05 PM, Jonas Sicking wrote:

 On Wed, Jul 6, 2011 at 2:23 PM, Aryeh Gregor simetrical+...@gmail.com wrote:
 On Wed, Jul 6, 2011 at 7:06 AM, Anne van Kesteren ann...@opera.com wrote:
 So with Web IDL going to Last Call does this mean that the exception model
 outlined in http://www.w3.org/Bugs/Public/show_bug.cgi?id=10623#c8 is the
 way forward? I.e. we introduce new exception interfaces in DOM Core for all
 the different exception types and update all other specifications that use
 DOM Core to dispatch those exceptions instead (and they are somewhat
 backwards compatible because they inherit from DOMException and therefore
 still have the code member).
 
 I guess there is no particular rush on this; I am mainly wondering whether
 other editors are aware of this change and agree with it.
 
 The thing I don't like about this proposal is that it encourages
 authors to use e instanceof IndexSizeError or similar.  This will
 work 98% of the time and then fail in an extremely mysterious way when
 multiple globals are involved.  All you need is the exception to be
 thrown by something in an iframe for whatever reason.
 
 Moreover, I don't even think behavior in that case is defined.  If I
 call foo.appendChild(bar) and it throws, is the exception from the
 window where the method was called, or the one foo is associated with,
 or the one bar is associated with?  Browsers other than Gecko seem to
 agree it's the one foo is associated with
 (http://software.hixie.ch/utilities/js/live-dom-viewer/saved/1064),
 and Gecko is just buggy, but is this specced anywhere?  I don't see it
 in DOM Core.
 
 I don't see why we need the extra classes.  What's the advantage over
 just adding the .name attribute, or something equivalent, and not
 adding new classes?  Just consistency with ES, or something else too?
 
 This is indeed a good point. The main reason for me was consistency
 with ES. But I'm not sure why ES was designed the way it is. Generally
 it seems like multiple globals wasn't kept in mind a lot when ES was
 designed, though obviously this is a problem for ES on the web.
 
 Would love to hear from ES people that surely has spent more time
 thinking about exceptions than me.
 
 / Jonas
 


From an object-oriented design perspective I always discourage use of 
instanceof tests.  The problem mentioned here by Aryeh is a good example of why: 
you may have semantically equivalent objects that are instances of different 
classes in different hierarchies.  This is particularly true for dynamic 
languages such as ES.  Given that DOM Core is explicitly an environment that 
provides multiple global objects (and hence multiple semantically equivalent 
objects), it seems particularly unwise for it to depend upon or even recommend 
using such tests. 

ECMAScript itself does not have a very rich set of exception classes, and its 
hierarchy is essentially flat. If you ignore instanceof testing, then all the 
specialized exception constructors (TypeError, RangeError, ReferenceError, 
etc.) really provide is the ability to say:
 throw RangeError("my message");
instead of
 var excpt = Error("my message");
 excpt.name = "RangeError";
 throw excpt;
In other words, a more concise way to set the name property of a new exception 
that is about to be thrown.  Given that the various DOMErrors are thrown by the 
DOM implementation, I don't think that ease of throwing is a relevant issue. 

As a further point of reference, I've designed several exception hierarchies for 
a couple of languages and used them in many others.  My experience is that 
while some developers love to put time into designing elaborate hierarchies of 
exceptions, in practice hierarchical exception structures are seldom exploited 
by actual application code. A few flat exceptions are more useful than a 
complex hierarchy that nobody ever remembers.

In the case of the DOM, I think a single exception type where different 
kinds of exceptions are discriminated by their name property would be much 
better than a bunch of frame-dependent exception constructors.

I'd much prefer to see code that looks like:

 try { doSomeDOMStuff(); }
 catch (e) {
   switch (e.name) {
     case "NoNotificationAllowedError": /* ... */ break;
     case "HierarchyRequestError": /* ... */ break;
     default: throw e;
   }
 }

rather than:

 try { doSomeDOMStuff(); }
 catch (e) {
   if (e instanceof NoNotificationAllowedError) { /* ... */ }
   else if (e instanceof HierarchyRequestError) { /* ... */ }
   else throw e;
 }

The latter invites cross-frame multiple-global issues.  The former is totally 
immune to them. 







RE: Web IDL Garden Hose

2009-09-28 Thread Allen Wirfs-Brock
-Original Message-
From: es-discuss-boun...@mozilla.org [mailto:es-discuss-
boun...@mozilla.org] On Behalf Of Robin Berjon

 There is no old version.

Right, this is v1. What previous W3C API specifications had relied on
was either OMG IDL, or the common lore understanding that people were
familiar with this way of expressing APIs, so they'd get it right.
We're trying to do a bit better than that.


The primary concern of TC39 members is with the WebIDL ECMAScript bindings.  I 
haven't yet heard any particular concerns from TC39 about WebIDL as an abstract, 
language-independent interface specification language. Since W3C seems 
committed to defining language-independent APIs, I would think that the 
language-independent portion of the WebIDL spec would be the only possible 
blocker to other new specs.

It seems like this might be a good reason to decouple the specification of the 
actual WebIDL language from the specification of any of its language bindings.

Allen Wirfs-Brock
Microsoft




RE: Web IDL Garden Hose (was: ECMA TC 39 / W3C HTML and WebApps WG coordination)

2009-09-26 Thread Allen Wirfs-Brock
-Original Message-
From: es-discuss-boun...@mozilla.org [mailto:es-discuss-
boun...@mozilla.org] On Behalf Of Yehuda Katz

Another way to put my earlier concern is: It's impossible to write a
conforming JS engine that browsers will want to use by only following
the ES spec - since there's additional, un-speced, behavior that isn't
in ES that is necessary in order to construct a browser's DOM.

Consider the following scenario: I write an ECMAScript engine that is
significantly faster than any existing engine by simply following the
ECMAScript spec. A browser maker then wishes to use this engine. This
would be impossible without adding additional (hidden) features to the
engine to support the DOM. There is nothing in the ECMAScript spec
that requires the ability (at the very least) to add native extensions
with arbitrary behavior to the engine.

Is this a requirement ECMA is comfortable with?


No, we are not.  This is exactly the heart of our concern. The WebIDL
ECMAScript binding is not simply a mapping of IDL interfaces onto
standard language features (such as is done for the Java binding).
While it has some of that, it also defines an extended ECMAScript language
with new semantics (and I understand this is mostly a reflection
of past (present?) practice of browser implementers).  Essentially,
the semantics of browser ECMAScript have been arbitrarily split into
two independently maintained standards. 

Language design is not primarily about designing individual isolated features.
The hard parts of language design involve the interactions among such
features and typically require making design trade-offs and alterations to
ensure that all features compose coherently.

If the language specification responsibilities are arbitrarily broken into 
two uncoordinated activities, then it is impossible for either to do
the global design that is necessary to have a complete and sound language and
specification.

TC39 has the language design expertise.  W3C has Web API design expertise.
If there are language design issues that must be addressed in order to fully
specify browser ECMAScript (and there are), then those issues need to be
addressed by TC39. Perhaps TC39 has been remiss in the past in addressing
these browser-specific language design issues.  If so, it was probably for
historic political and competitive reasons that don't necessarily apply today.
That is what we want to fix.

Allen Wirfs-Brock
Microsoft




RE: Web IDL Garden Hose (was: ECMA TC 39 / W3C HTML and WebApps WG coordination)

2009-09-26 Thread Allen Wirfs-Brock


-Original Message-
From: Maciej Stachowiak [mailto:m...@apple.com]

I expect there are relatively few such capabilities, and little
interest in depending on new ones, and therefore we do not really have
a general ongoing problem of language design.

 
We have an ongoing problem of language design in that all new language
features must integrate with existing features. Combinatorial feature
interaction is one of the larger challenges of language design.

 From a quick scan of WebIDL, I see the following:

1) Catchall getters, putters, deleters, definer.
- Variants that can make the catchall check happen either before
or after normal property lookup.
- General string-based name access and index-only versions.
No comment, I need to come up to speed on the detailed semantic requirements

- Note: I think catchall deleters are used only by Web Storage and
not by other new or legacy interfaces.

Seems like a strong reason to change the proposed API to eliminate the need for 
a new ES language extension.

2) Ability to support being called (via [[Call]]) without being a
Function.

Not an issue with the core ES5 semantics.  Most ES3/5 section 15 functions have 
this characteristic. As long as such WebIDL objects are defined similarly to the 
built-in functions, they too can have this characteristic. It may well be useful 
to introduce a mechanism for defining such pure functions in the language, but it 
probably isn't necessary in order to proceed with the WebIDL binding.  The 
important thing to try to avoid is specifying a custom [[Call]].


3) Ability to support being invoked as a constructor (via [[Construct]])
without being a Function.

Essentially the same as 2, although the standard [[Construct]] requires a 
[[Call]], so this may need some more thought.

4) Ability to support instanceof checking (via [[HasInstance]])
without being a constructor (so myElement instanceof HTMLElement works).

Possibly the specification of the instanceof operator needs to be made 
extensible

5) Ability to have [[Construct]] do something different than [[Call]]
instead of treating it as a [[Call]] with a freshly allocated Object
passed as this.

Similar to 4 regarding extensibility.  At least one recent Harmony strawman 
proposal is moving in a direction that may be relevant to 4 and 5.
See http://wiki.ecmascript.org/doku.php?id=strawman:obj_initialiser_constructors
 



Tentatively, I think all other semantics of Web IDL interfaces can be
implemented in pure ES5.

Regards,
Maciej





RE: ECMA TC 39 / W3C HTML and WebApps WG coordination

2009-09-25 Thread Allen Wirfs-Brock
+1

-Original Message-
From: es-discuss-boun...@mozilla.org [mailto:es-discuss-
boun...@mozilla.org] On Behalf Of Brendan Eich
Sent: Friday, September 25, 2009 9:56 AM
To: Anne van Kesteren
Cc: public-webapps@w3.org; HTML WG; es-discuss
Subject: Re: ECMA TC 39 / W3C HTML and WebApps WG coordination

Three distinct topics are being mixed up here:

1. Whether to use WebIDL or some unproposed alternative.

2. Whether to use catchall patterns in new WebIDL-defined interfaces.

3. Whether the JS WebIDL bindings should be standardized by Ecma or W3C.

The straw man (0. Whether to remove catchall patterns from existing
WebIDL interfaces required for backward compatibility) is nonsense and
I'm going to ignore it from here on.

My positions are:

1. WebIDL, the bird in the hand (I agree with Sam: go invent something
better, come back when you're done).

2. Don't keep perpetuating catchall patterns, they are confusing for
developers and costly for implementors and static analysis tools, even
if implementable in some future ES edition.

3. Don't care.

I differ from Mark on 3, but that's ok. What is not ok is to waste a
lot of time arguing from divergent premises that need to be unpacked
or else let alone for now, when we could be collaborating on concrete
issues such as split windows, execution model, catchall policing, etc.

Mark's Joe with his JoeLang bindings for WebIDL vs. Anne's point about
the primacy of JavaScript bindings for WebIDL-defined interfaces is
not going to lead to rapid agreement on putting the ES WebIDL bindings
in Ecma vs. leaving them in W3C. It's a rathole, IMHO.

Both points of view have merit, but precedent and possession matter
too, and Ecma can't plausibly fork or steal the binding spec. We're
trying to collaborate, so let's get on with that hard work instead of
trying to assail one another with principles that can't encompass the
whole picture.

Hope this helps,

/be




RE: ECMA TC 39 / W3C HTML and WebApps WG coordination

2009-09-25 Thread Allen Wirfs-Brock
-Original Message-
From: es-discuss-boun...@mozilla.org [mailto:es-discuss-
...
But ECMAScript doesn't have a way to distinguish normal property
access from property access via lexical scoping.

In the ES5 specification it does.  References that resolve to property accesses 
are explicitly distinguished from those that resolve to environment records.  
This includes object environments such as the global environment and 'with' 
environments.

It's unclear whether
you could say an object is actually the same but happens to give
different answers for scope chain access and direct property access,
and possibly even different answers depending on which scope chain it
was found in. I would think that strains host object exemptions to the
breaking point.

Accesses to the global object are mediated through an object environment record, 
but the actual accesses to the global object's properties take place using the 
internal methods [[Get]], [[Put]], [[DefineOwnProperty]], etc., regardless of 
whether the access was initiated via a direct property reference or via an 
environment record.  However, because neither ES3 nor ES5 (except for only a 
couple of new requirements) really defines or requires specific semantics for 
host object internal methods, virtually anything goes.  Even behavior that 
differs depending upon the calling context of the internal method is allowed 
(although internal methods aren't real and aren't actually called).

When ECMAScript says "host object" it is really saying "arbitrary 
implementation-dependent magic could happen here".

Allen