Re: yield and Promises

2011-10-21 Thread Kris Zyp

On 10/21/2011 11:01 AM, David Herman wrote:

[snip]
But there's more to it than just the interface. You fix a particular 
scheduling semantics when you put deferred functions into the 
language. I'm still learning about the difference between the Deferred 
pattern and the Promises pattern, but the former seems much more 
stateful than the latter: you enqueue another listener onto an 
internal mutable queue.
At least in the Dojo community (and I think Kowal does with Q as well), 
we define a Deferred, a producer-side constructor for creating promises, 
with an API that can resolve() or reject() the generated promise. The 
promise is then the consumer-side interface that allows consumers to 
register a callback for the fulfillment or rejection of the promise 
(with a then() method or a variety of other convenience functions). The 
mutable-state pattern was used in earlier versions of Dojo, but we later 
switched to an API that keeps promises immutable except for legacy 
methods, as we reached consensus that mutable promises are bad. 
Thus for us the terminology difference between mutable state and 
immutable state is simply wrong vs. right ;).
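A minimal sketch of the producer/consumer split described above (illustrative only, not Dojo's actual implementation):

```javascript
// The Deferred is the producer side; the promise it exposes is the
// consumer side, offering nothing but then().
function Deferred() {
  var callbacks = [];                  // listeners registered before settlement
  var settled = false, rejected = false, value;

  function then(onFulfill, onReject) {
    if (settled) {
      // same-turn delivery, matching the Dojo scheduling described below
      (rejected ? onReject : onFulfill)(value);
    } else {
      callbacks.push({ onFulfill: onFulfill, onReject: onReject });
    }
  }
  function settle(isRejection, v) {
    if (settled) { throw new Error("promise already settled"); }
    settled = true; rejected = isRejection; value = v;
    callbacks.forEach(function (cb) {
      (rejected ? cb.onReject : cb.onFulfill)(value);
    });
  }
  this.promise = { then: then };       // consumers only ever see this
  this.resolve = function (v) { settle(false, v); };
  this.reject  = function (err) { settle(true, err); };
}
```

The promise object itself carries no mutators; only the Deferred that created it can settle it, which is the immutability property described above.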


There are indeed different scheduling semantics to consider. With Dojo 
(and I think jQuery as well), we considered enqueuing callbacks onto 
the event queue unviable, because historically the only mechanism 
available in the browser has been setTimeout (there is no setImmediate 
or process.nextTick), which has a rather large minimum delay that can 
easily add up to a noticeable, unacceptable amount of latency across a 
chain of promises. Consequently our implementations do not enqueue any 
callbacks for future turns; all callbacks are executed in the same turn 
as the resolution of the promise, and due to latency concerns we 
haven't really felt the freedom to explore other scheduling semantics. 
This scheduling semantic has worked fine for us, but I don't mind an 
alternate one. It looks like kriskowal/q does enqueue, using a 
postMessage hack to enable faster enqueuing on newer browsers.
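A sketch of the two notification policies in question (illustrative only): same-turn delivery runs the callback inside the resolution call, while setTimeout-based enqueuing defers it to a later turn, paying the browser's clamped minimum timer delay on every link of a promise chain.

```javascript
// Same-turn policy (Dojo/jQuery style): the callback runs inside the
// resolution call itself, adding no latency.
function notifySameTurn(callback, value) {
  callback(value);
}

// Next-turn policy with only setTimeout available: every notification
// waits out the timer clamp, so a chain of N promises accrues roughly
// N times the minimum delay.
function notifyNextTurn(callback, value) {
  setTimeout(function () { callback(value); }, 0);
}
```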


On 10/21/2011 10:34 AM, John J Barton wrote:
Can anyone summarize how these proposals relate to Kris Kowal / Kris 
Zyp / Mark Miller Q library:

https://github.com/kriskowal/q

The proposal was designed such that it should work smoothly with 
promises originating from Kowal's Q as well (acting like Q.when). For 
example, using the opening delay-function example from the kriskowal/q 
readme, one could write:


function(){
  return afterOneSecond(yield delay(1000));
}

and it would be effectively the same as (with the possible exception of 
scheduling policies):


function(){
   return Q.when(delay(1000), afterOneSecond);
}

In my experience, reasoning about the code was much easier with Q than 
without Q.  (Not something I found in trying generators).  I found the 
documentation hard to follow since it assumes background I don't have 
and the examples were not interesting to me, but once I tried it I was 
pleasantly surprised. It does have ergonomic issues (an undefined 
promise and a resolved promise work equally well), but I think this may 
be inexperience on my part.




Yes, kriskowal/q is an excellent library.
Thanks,
Kris

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


yield and Promises

2011-10-19 Thread Kris Zyp
The topic of single-frame continuations has been discussed here before, 
with the current state of ES.next moving towards generators that are 
based on and similar to Mozilla's JS1.7 implementation. Generators are a 
powerful addition to the language, and a welcome one (at least by me). However, 
I believe that this still leaves a distinct gap in functionality that 
forces the majority use case for single-frame continuations to be 
handled by libraries. A language feature that out of the box must rely 
on libraries to fulfill the most wanted use cases seems less than ideal.


I believe one could separate these single-frame continuations (or 
coroutines, looking at it from the perspective of the behavior of the 
function) into two categories. There are bottom-up controlled 
continuations, where the caller of the coroutine function controls when 
the function will resume execution. I think this is equivalent to a 
generator. Generator functions return an object with an interface for 
resuming the execution (and providing values for the continuation) of 
the coroutine function.


There are also top-down controlled continuations. Here coroutine 
functions can suspend execution when given an object (typically from one 
of the functions it calls) that provides the interface to resume 
execution. Resuming execution therefore is controlled by values returned 
from callees instead of from the caller. It is worth noting that 
a bottom-up controller can be turned into a top-down controller and vice 
versa with the use of libraries (one can go either way).


I believe that the overwhelming need that is continually and constantly 
expressed and felt in the JS community in terms of handling asynchronous 
activity is fundamentally a cry for top-down controlled single-frame 
continuations (obviously not always stated in such terms, but that is 
the effective need/solution). In terms of an actual code example, 
essentially what is desired is to be able to write functions like:


element.onclick = function(){
// suspend execution after doSomethingAsync() to wait for result
var result = some operator doSomethingAsync();
// resume and do something else
alert(result);
};

Generators directly solve a problem that is much less significant in 
normal JS coding. While it is exciting that generators coupled with 
libraries give us a much better tool for asynchronous use cases (the 
above can be coded as libraryFunction(function*(){...})), my concern is 
that it is the majority use case, rather than the minority one, that 
requires libraries, and that this does not promote interoperability.
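The libraryFunction shape referred to above can be sketched as a small driver that pumps a generator, subscribing to each yielded thenable and resuming with the value it settles to (names are assumed for illustration; libraries such as Q ship a similar helper):

```javascript
// Drive a generator whose yields are thenables: subscribe to each one
// and resume the generator with the settled value.
function run(genFn) {
  var gen = genFn();
  function step(input) {
    var r = gen.next(input);
    if (r.done) { return; }
    var v = r.value;
    if (v && typeof v.then === "function") {
      v.then(step, function (err) { gen.throw(err); });
    } else {
      step(v);   // plain yielded values resume immediately
    }
  }
  step(undefined);
}
```

With such a driver, the onclick example above becomes run(function*(){ var result = yield doSomethingAsync(); alert(result); }).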


Now, why should we consider something now when previous alternatives to 
generators have failed to gain traction? Previous proposals have avoided 
specifying a direct interface on top-down objects, to leave the door 
open for different possible interfaces for resuming execution, or 
different possible promise styles. We have wisely deferred to 
libraries while different possible approaches have yet to be explored 
within the JS community. A couple years ago there were numerous 
approaches being explored. However, to truly follow through with this 
strategy we should then proceed with language design when convergence 
does in fact take place. A couple years later, I believe the landscape 
has dramatically changed, and we indeed do have significant convergence 
on a promise API with the thenable interface. Dojo, jQuery, several 
server-side libraries, and apparently even Windows 8's JS APIs (from 
what I understand) all share an intersection of APIs that includes a 
then() method to define a promise and register a callback for when a 
promise is fulfilled (or fails). This is an unusual level of 
convergence for a JS community that is so diverse. I believe this gives 
evidence of a well-substantiated and tested interface that can be used 
for top-down controlled single-frame continuations, and that can easily 
be specified, understood, and used by developers.
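The shared intersection amounts to duck typing on then; a consumer that wants to accept promises from Dojo, jQuery, Q, or anything else only needs a check like the following sketch:

```javascript
// Treat any object (or function) with a callable then() as a promise.
function isThenable(value) {
  return value !== null &&
         (typeof value === "object" || typeof value === "function") &&
         typeof value.then === "function";
}

// Normalize: wait on thenables, pass plain values straight through.
function when(value, onFulfill, onReject) {
  if (isThenable(value)) {
    value.then(onFulfill, onReject);
  } else {
    onFulfill(value);
  }
}
```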


My proposal is to allow the use of the yield keyword in standard 
functions (not just generator function*'s) with the following semantics: 
The yield operator is a prefix operator that takes a single operand 
(a variant of AssignmentExpression, just as within generator function*s). 
When a yield operator is encountered in the execution of a standard 
function (not a generator), the operand value is examined. If the value 
is an object with a then property that is a function (i.e., the object 
is a promise), execution will suspend, preserving the context for 
when execution is resumed. The operand's then function will be 
called with a resume function as the first argument and a fail 
function as the second argument. If and when the resume function is 
called, execution of the suspended function will resume, and the first 
argument of the call to the resume function will be provided as the 
result of evaluating the yield operator within the resumed execution. If 
the fail function is called, 

Re: yield and Promises

2011-10-19 Thread Kris Zyp

On 10/19/2011 12:29 PM, Dean Landolt wrote:

This is a really great idea, Kris! A few comments inline...

[snip]


If the value is not an object with a then property that is a
function, the operand value is the immediate result of the
evaluation of the yield expression and execution is not suspended.

Here is a simple example of usage:
"use strict";
function delay(ms){
  // a function that will yield for the given milliseconds
  yield {
    then: function(resume){
      setTimeout(resume, ms);
    }
  };
}



IIUC you're proposing language-level support for promises, right? 
There's no getting around it -- you're spec'ing an interface for the 
unary yield operator to interact with. So why not go all out and have 
the language stratify `then` for you with private names?


That's fine with me.

[snip]


Obviously one could choose a different keyword to use. I'd imagine
await is one alternative. The drawback of yield is that it
does behave differently than a yield in a generator. However,
generators behave quite differently anyway, and top-controlled
yield shares a very important commonality. Using one keyword
means there is only a single operator that can result in a
suspension of execution within a function body, making it easier to
spot such occurrences and to look for points where certain
invariants cannot be assumed. Of course it is also nice to
avoid proliferation of keywords and introducing new previously
unreserved keywords.



Any thoughts on how this should interplay with generators? One 
side-effect of overloading yield is that it becomes impossible to wait 
for a promise inside a generator -- is this a feature or a bug?


I think it is a feature, as I don't believe both forms can be used 
very coherently together in the same function. Consider a separate 
await operator inside a generator. If you execute this operator with 
an unresolved promise, the function is supposed to return (a promise), 
but in a generator, when a return is encountered it throws a 
StopIteration. It hardly seems useful to have an (await somePromise) 
immediately halt the generator. If you want to use promises within a 
generator, I believe the correct usage would be to propagate the promise 
out to the generator controller and then yield from there:

function* slowGenerator(){
  while(true){
    yield delay(50);
  }
}
let seq = slowGenerator();
yield seq.next();
yield seq.next();


There have also been suggestions about potentially having
language support for queuing different operations onto promises
(gets, puts, and posts). I don't think this proposal precludes such
possibilities (it only precludes the possibility of opaque
promises that have no interface, but the majority of devs I have
talked to are pretty opposed to that idea, so that doesn't seem
like a likely possibility).



I assume that a function that yields, when called, will return a 
promise -- is this correct?


Yes.
What if there exists a yield in the function but the function returns 
without hitting a yield in the codepath? No promise then?

No promise will be returned.

I believe it is critical that we maintain a principle of locality 
such that:

(function(){
  if(false){
    // ...any valid statement...
  }
  else return true;
})();

will always return true, regardless of the operators placed within the 
if statement's body.
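In ES.next the generator transformation is at least marked syntactically by function*, but the effect being guarded against is easy to demonstrate: a yield on a dead code path still changes what calling the function produces.

```javascript
// Identical bodies except for one yield on a dead code path.
function plain() {
  if (false) { /* any valid statement */ }
  else return true;
}

function* generatorized() {
  if (false) { yield 0; }   // never executed, yet it changes everything:
  else return true;         // callers now get an iterator, not true
}
```

plain() returns true; generatorized() returns a generator object, and the return value only surfaces as the iterator's final result. Under JS 1.7, where no * marker existed, the same change happened invisibly, which is exactly the locality violation at issue.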


Thanks,
Kris


Re: promises | Communicating Event-Loop Concurrency and Distribution

2011-01-29 Thread Kris Zyp
On 1/28/2011 8:43 PM, Mark S. Miller wrote:
 On Fri, Jan 28, 2011 at 6:05 AM, Kris Zyp k...@sitepen.com wrote:


 Exactly. On the NodeJS mailing list there is a constant,
 never-ending stream of messages from (generally new) users who are
 complaining about the pain of callbacks (not about typing some API;
 I have yet to ever get a request for shorter syntax for
 node-promise, which is one of the commonly used helper libraries).
 In fact I think someone recently created another continuation
 library (in the same vein as Narrative, generators, etc.),
 continuing to add to our collective experience in this area. I 
 could look up the address if desired.


 Please do. More collective experience is good.

https://github.com/laverdet/node-fibers

Looking at the language in the docs, this may just use event-loop
stacking to wait or yield.


 However, I think your comparisons show you're missing the point of this
 strawman. AFAIK, all the systems you cite, including node-promise
 which implements CommonJS promises/A and your proposed use of shallow
 continuations, are all only for helping express asynchrony within a
 single vat. None of these are about supporting communication to
 objects in separate or remote vats. So none of these should be taken
 as competitors for this strawman. This strawman includes means for
 spawning new vats, talking between the spawner and the spawned, and an
 underspecified-at-the-moment (but see qcomm's Q.makePromise())
 pluggable extension point for extending this semantics over networks.

So Web Workers are the competition? You can build far references on Web
Workers, right?
Kris


Re: promises | Communicating Event-Loop Concurrency and Distribution

2011-01-28 Thread Kris Zyp
On 1/27/2011 10:37 PM, Mark S. Miller wrote:
 On Thu, Jan 27, 2011 at 8:23 PM, Kris Zyp k...@sitepen.com wrote:

 This looks like a case of creating in-language support for a
 library. This was done with json2.js because it was one of the most
 widely used libraries in existence and similar JSON handlers were
 used in numerous libraries. The ref_send library and sibling Q-style
 implementations are not anywhere close to that level of adoption.
 It seems like there are a large number of library functions that
 have proven useful enough (by real usage) to be ripe for
 incorporation into the language before the Q API. In fact, even
 within the realm of JavaScript promise libraries, this
 implementation/API, which has been discussed and known about for a
 number of years, has not become the dominant interface API; there
 are other approaches that have found more use. While I certainly
 like and appreciate the Q API, I don't think it has proven itself
 worthy to be added to the language.

 The real pain point with promises isn't that it takes too many
 characters to type Q.post. How many bytes does the average
 JavaScript program spend on promise API calls? It's negligible; this
 proposal isn't solving any real-world problem.


 Hi Kris, I appreciate your other points. However, from my own
 experience having programmed in this style with syntactic support, and
 attempting to bring myself to program in this style without syntactic
 support, I disagree. In this particular case, the syntactic difference
 between

 Q.post(a, foo, [b, c])

 and

a ! foo(b, c)

 makes a tremendous difference in one's willingness to use these
 abstractions, and in one's ability to read and understand code
 expressed in these abstractions. I believe this difference is adequate
 to deter most usage -- even in my experience by people like me, who
 already like these abstractions.

Of course syntax makes a difference, but doesn't that same argument
apply to the thousands of other JavaScript libraries out there? If I
just had syntactical support for my cool library, I know people would
start using it...

On 1/27/2011 10:47 PM, Mike Shaver wrote:
 On Thu, Jan 27, 2011 at 9:37 PM, Mark S. Miller erig...@google.com wrote:
 I don't understand why single-frame continuations are not at least as
 vulnerable to the criticism you make above. They are even more unproven
 among JavaScript programmers than are the concurrency abstractions in the
 strawman. The main benefit of moving support for single-frame
 continuations into the language is notational, which you dismiss above. I
 don't get it.
 We have some experience with single-frame continuations in the Firefox
 code base (and elsewhere such as threads.js from Neil Mix), based on
 our years-old implementation.  The thing *I* like most about it,
 versus working from a callback-based model, is that you can use the
 language's flow control constructs.  Having to turn everything inside
 out via function chaining hurts a lot, even with helper libraries that
 are popular in the node.js community.

 for (var i in hash) {
   doSomethingAsyncAndWait();
 }

 versus gross hackery like breaking out of the enumeration every time
 and deleting the just-processed property.  Nothing that requires code
 to be written out of order, or shattered by function decomposition and
 glued back together in the programmer's mind, will be as satisfying, I fear.

Exactly. On the NodeJS mailing list there is a constant, never-ending
stream of messages from (generally new) users complaining about
the pain of callbacks (not about typing some API; I have yet to ever get
a request for shorter syntax for node-promise, which is one of the
commonly used helper libraries). In fact I think someone recently
created another continuation library (in the same vein as Narrative,
generators, etc.), continuing to add to our collective experience in
this area. I  could look up the address if desired.

Thanks,
Kris


Re: promises | Communicating Event-Loop Concurrency and Distribution

2011-01-27 Thread Kris Zyp
This looks like a case of creating in-language support for a library. 
This was done with json2.js because it was one of the most widely used 
libraries in existence and similar JSON handlers were used in numerous 
libraries. The ref_send library and sibling Q-style implementations are 
not anywhere close to that level of adoption. It seems like there are a 
large number of library functions that have proven useful enough (by 
real usage) to be ripe for incorporation into the language before the 
Q API. In fact, even within the realm of JavaScript promise libraries, 
this implementation/API, which has been discussed and known about for a 
number of years, has not become the dominant interface API; there are 
other approaches that have found more use. While I certainly like 
and appreciate the Q API, I don't think it has proven itself worthy to 
be added to the language.

The real pain point with promises isn't that it takes too many
characters to type Q.post. How many bytes does the average JavaScript
program spend on promise API calls? It's negligible; this proposal isn't
solving any real-world problem. The main challenge and overhead of
promises, or any CPS-esque code flow, is the complexity and overhead of
continuing flow through callbacks. This is why there are proposals for
shorter anonymous functions and single-frame continuations. Single-frame
continuations provide the tools for building cleaner, simpler code flow
with promises or any other callback-style mechanism without
standardizing on a single library. Like Brendan mentioned in his last
blog post (well, I guess it came from Guy Steele), a good language
empowers its users to build on it, rather than stifling them with a
single-library approach.
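The burden of "continuing flow through callbacks" is easiest to see with a loop; a sketch of the two styles (helper names are invented for illustration):

```javascript
// Callback style: the for-loop must be turned inside out into recursion.
function processAllCallbacks(items, processOne, done) {
  var i = 0;
  (function next() {
    if (i >= items.length) { return done(); }
    processOne(items[i++], next);   // continue only when this item is done
  })();
}

// Single-frame-continuation style: the ordinary loop survives intact,
// suspending at each yield until the item completes.
function* processAllYield(items, processOne) {
  for (var i = 0; i < items.length; i++) {
    yield processOne(items[i]);
  }
}
```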

Thanks,
Kris

On 1/27/2011 5:09 PM, Mark S. Miller wrote:
 On Thu, Jan 27, 2011 at 8:00 AM, Irakli Gozalishvili rfo...@gmail.com wrote:

 Hi,

 I was curious to know what is the state of the following proposal:
 http://wiki.ecmascript.org/doku.php?id=strawman:concurrency


 It's on the agenda for the upcoming March meeting. I was planning to
 do some more work on it before declaring it ready for discussion. But
 since you raise it, I'm happy enough with it in its current state. I
 can clarify my remaining issues with it as discussion proceeds. So

  This page is now ready for discussion.

 I do expect that this strawman is too large to be accepted into
 ES-next as a whole. To get some of it accepted -- syntactic sugar +
 some supporting semantics -- we need to find a clean boundary between
 what kernel and extension mechanism the platform needs to provide vs
 the remaining functionality that should be provided by libraries using
 these extension mechanisms. 

 For example, Tyler's web_send uses such an extension API in ref_send
 to stretch these operations onto the network, by mapping them onto
 JSON/RESTful HTTPS operations. Kris Kowal's qcomm library uses
 Q.makePromise (like that proposed above) to stretch these operations
 over WebSockets, whose connection-oriented semantics enables a better
 mapping at the price of more specialized usage. I hope that Kevin
 Reid's caja-captp can also be reformulated as a library extending the
 Q API.

 (See links to ref_send, web_send, qcomm, and caja-captp at the bottom
 of the strawman page.)
  



 I do believe that having ES-native promises could provide a
 drastically better alternative for writing async code in
 comparison to the currently popular nested-callback style. Also,
 even though there are a few implementations of the Q API, adoption
 is still low; IMO that's due to non-obvious and verbose syntax.
 The syntactic sugar described in the proposal really makes a lot of
 difference. Also, I don't see a proposal for Q.when syntax and
 would love to know what the plan is for that.


 Given a lightweight closure syntax, I don't think Q.when actually
 needs any further syntactic sugar. For example, here's the asyncAnd
 example from Figure 18.1 of http://erights.org/talks/thesis/ in JS
 with this strawman + destructuring + the lightweight closure syntax
 from HOBD (Harmony of Brendan's Dreams):

 #asyncAnd(answerPs) {
 let countDown = answerPs.length;
 if (countDown === 0) { return true; }
 const {promise, resolver} = Q.defer();

 answerPs.forEach(#(answerP) {
 Q.when(answerP, #(answer) {
 if (answer) {
 if (--countDown === 0) { resolver(true); }
 } else {
 resolver(false);
 }
 }, #(ex) {
 resolver(Q.reject(ex));
 });
 });
 return promise;
 }

 The original asyncAnd in Figure 18.1 is in E, whose syntax was
 designed without legacy burden to make such code pleasant.
 Nevertheless, I don't think the code above suffers much in comparison.

 Of course, if you have a suggestion of how a sugared Q.when can
 improve on this enough to be worth the cost 

Re: New private names proposal

2010-12-16 Thread Kris Zyp

This sounds great, but doesn't this kind of violate referential
transparency? The following function has always worked as expected:
function foo(){
  var obj = {bar: "hello"}; // assuming unquoted names are strings
  alert(obj.bar);
}
foo();
until it is put in a context (or even just concatenated) with a
private bar; declaration. Changing the behavior of property
identifiers seems like an awkwardly complicating addition to ECMAScript.

Couldn't the goals of this be achieved by having a Name constructor
(albeit with less convenient syntax, since you have to use obj[name];
perhaps that is what you are addressing), or by having private name
create a Name (to be used like obj[name])?
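The Name-constructor idea can be sketched concretely (using Symbol, the closest modern analogue, to stand in for the hypothetical Name):

```javascript
// A unique, unforgeable key: only code holding `nameKey` can reach the
// property it names, while ordinary string-keyed access is untouched.
var nameKey = Symbol("bar");     // stands in for the strawman's `new Name()`

function foo() {
  var obj = { bar: "hello" };    // ordinary property, behaves as always
  obj[nameKey] = "private bar";  // name-keyed property, no collision
  return obj;
}
```

Because obj.bar is unaffected, referential transparency of existing code is preserved; the hidden property is reachable only through obj[nameKey].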
Kris


On 12/11/2010 4:58 PM, Allen Wirfs-Brock wrote:
 On the wiki, I've posted a new strawman Private Name
 proposal http://wiki.ecmascript.org/doku.php?id=strawman:private_names .
 It replaces Dave and Sam's original Names strawman and was developed
 in consultation with them. This is a significant revision to the
 original proposal but builds upon some of the same base ideas.

 In reading this strawman it's important to understand that its
 concept of private is quite different from what you may be used to
 from C++ or Java. In those languages private is an attribute of
 a member (field or method) of an class. It means that the member is
 only accessible to other members of the same class (ignoring what
 can be done via reflection). This model is not a particularly good
 match to the JavaScript object model where the structure of an
 object is much more dynamic and method functions can be dynamically
 associated or disassociated with an object and shared by many
 different kinds of objects.

 In this proposal, private is an attribute of a property name,
 rather than of an actual property. Any code that has access to a
 private name can use that name to create or access a property of
 any object. It is accessibility to the name that is controlled
 rather than accessibility to the property. This seems to fit more
 naturally with JavaScript's dynamic object construction patterns
 and without really changing the fundamental JavaScript object
 model it enables JavaScript programmers to create a number of
 different information hiding abstractions.

 Please read the proposal and let's start the discussion.

 Allen





- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com




Re: natively negotiating sync vs. async...without callbacks

2010-12-09 Thread Kris Zyp

The generators from JS 1.7 are too specific to provide much help with
promises, IMO. Adding a yield operator fundamentally alters the
semantics of the entire surrounding function, violating the principle
of locality. Consequently you need special mechanisms (like Neil's
library) to even call an async function, and you can't return values
either (with the return operator). Solving these problems was the
point behind the shallow continuation proposal that we worked on. Does
the generator strawman build on that work, or does it still fail to
preserve locality?

My understanding of Kyle's proposal was that he did not want to
introduce real continuations, but that this was more of a syntax for a
specialized callback chain, reducing the verbosity of anonymous
functions and callback registration. I'd prefer more of a generalized
solution for
reducing the size of anon functions (which has been discussed to great
lengths here in the lambda threads). However, my belief on async is
that the verbosity is only part of the pain, working with async within
various control flows (loops, branches, etc) is more of a burden,
hence the need for continuations (shallow/single frame, of course).
Kris

On 12/9/2010 8:55 AM, David Herman wrote:
 I pretty much abandoned that line of investigation with the
 conclusion that generators:

 http://wiki.ecmascript.org/doku.php?id=strawman:generators

 are a good (and well-tested, in Python and SpiderMonkey) design for
 single-frame continuations. They hang together well; in particular,
 they don't have the issues with `finally' that some of the
 alternatives I talked about do. Moreover, the continuation-capture
 mechanisms based on call/cc or shift/reset require additional power
 in the VM to essentially tail-call their argument expression. When I
 tried prototyping this in SpiderMonkey, I found this to be one of
 the biggest challenges -- and that was just in the straight-up
 interpreter, not in the tracing JIT or method JIT.

 Generators work well for lightweight concurrency. As a proof of
 concept, I put together a little library of tasks based on generators:

 http://github.com/dherman/jstask

 Somebody reminded me that Neil Mix had written a very similar
 library several years ago, called Thread.js:

 http://www.neilmix.com/2007/02/07/threading-in-javascript-17/

 and there's another library called Er.js that built off of that to
 create some Erlang-like abstractions:

 http://www.beatniksoftware.com/erjs/

 Dave

 On Dec 8, 2010, at 11:36 PM, Tom Van Cutsem wrote:

 The spirit of the proposal is that this special type of
 statement be a linear sequence of function executions (as
 opposed to nested function-reference callbacks delegating
 execution to other code).

 The special behavior is that in between each part/expression of
 the statement, evaluation of the statement itself (NOT the rest
 of the program) may be suspended until the previous
 part/expression is fulfilled. This would conceptually be like a
 yield/continuation localized to ONLY the statement in question,
 not affecting the linear execution of the rest of the program.


 This reminds me of a proposal by Kris Zyp a couple of months ago
 (single frame continuations)
 https://mail.mozilla.org/pipermail/es-discuss/2010-March/010865.html

 I don't think that discussion lead to a clear outcome, but it's
 definitely related, both in terms of goals as well as in mechanism.
 I also recall it prompted Dave Herman to sketch the design space of
 (single-frame) continuations for JS:
 https://mail.mozilla.org/pipermail/es-discuss/2010-April/010894.html

 Cheers,
 Tom




- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Re: Simple Modules and Current Modules

2010-11-05 Thread Kris Zyp

Another idea that might provide the extensibility to deal with loading
various resources and alternate/older modules: what if we added the
ability to specify the loader to use for a module?

module myApp {
  module Template = load "template.html" with "text-loader.js"; // load a template's text
  module Messages = load "messages.json" with "i18n.js"; // load some messages
  module OldModule = load "old-module.js" with "commonjs-loader.js";
  module NewShiny = load "a-harmony-module.js"; // use the default native harmony module loader
...

I realize this may involve providing first-class objects to a
second-class module system, but that doesn't seem impossible.
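A first-class sketch of what load ... with ... might amount to (loader names and signatures are invented for illustration, not the strawman's actual API):

```javascript
// A registry mapping loader names to functions that turn raw source
// text into a module-like value.
var loaders = {
  "text-loader":     function (src) { return { text: src }; },
  "i18n":            function (src) { return JSON.parse(src); },
  "commonjs-loader": function (src) {
    var exports = {};
    new Function("exports", src)(exports);   // CommonJS-style wrapper
    return exports;
  }
};

// Stand-in for the proposed `module M = load <resource> with <loader>`.
function loadWith(sourceText, loaderName) {
  return loaders[loaderName](sourceText);
}
```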
Thanks,
Kris



Re: Simple Modules and Current Modules

2010-11-05 Thread Kris Zyp



On 11/5/2010 12:31 PM, Sam Tobin-Hochstadt wrote:
 On Fri, Nov 5, 2010 at 2:25 PM, Kris Zyp k...@sitepen.com wrote:

 -BEGIN PGP SIGNED MESSAGE- Hash: SHA1

 Another idea that might provide the extensibility to deal with
 loading various resources and alternate/older modules, what if we
 added the ability to specify the loader to use for a module:

 This is of course possible with a module loader, and these are
 interesting and valuable use cases.

Right, but how does a module express a dependence on a particular
module loader? The problem is that this requires knowledge of which
module loader is needed, and it must be manually loaded beforehand,
which is a problem that modules are supposed to solve.

 module myApp {
   module Template = load "template.html" with "text-loader.js"; // load a template's text
   module Messages = load "messages.json" with "i18n.js"; // load some messages
   module OldModule = load "old-module.js" with "commonjs-loader.js";
   module NewShiny = load "a-harmony-module.js"; // use the default native harmony module loader
 ...

 But I don't think using the declarative syntax for this is a good
 idea - there's no static scope here; instead, there's arbitrary
 code execution and evaluation. Additionally, this makes
 everything synchronous.

Doesn't the default module loader get executed normally? Why do
alternate module loaders change that? And why does it have to be
synchronous, given that the module loader API is asynchronous?

- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com

___
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


Re: Simple Modules and Current Modules

2010-11-05 Thread Kris Zyp



On 11/5/2010 12:58 PM, Brendan Eich wrote:
 [snip]
 The declarative syntax, as I pointed out earlier in this thread, is
used to prefetch, so there is no violation of JS's run-to-completion model.

 What you seem to be thinking, in your module myApp example, is that
everything in between the nested module Template, module NewShiny, etc.
lines is either run before those modules load (so cannot use anything
loaded, so violates the JS execution model), or is automatically
transformed into callbacks or even threaded, so that one could call
OldModule.foo(); right after the module OldModule = ... line (btw, the
load pseudo-keyword is being dropped), and not get a reference error.

 But Harmony module syntax exists precisely to preserve the step at a
time, no hidden threading or CPS'ing or event loop nesting, execution
model, by enabling prefetching of static dependencies.

 Dynamic dependencies require use of the module loader API, as Sam said.
The static module declarations use the default module loader under the
hood, but ahead of execution of the script that contains those module
declarations. The script is parsed and the modules prefetched before
execution starts.

The prefetching is not performed by the default module loader provided
by the host? Can't that default module loader be overridden? When you
use a custom module loader, why isn't arbitrary code executed when the
JS VM requests a module needed by the loading module?

- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Simple Modules and Current Modules

2010-11-04 Thread Kris Zyp
 new features and focused on small
incremental changes that were desperately needed to compose important
functionality. Maybe the same principle should be applied again. At
least in the area of modules, if we focused on the small composable
feature(s) that are truly needed to build module systems on top of
EcmaScript, we could introduce a much simpler addition and ensure that
we get it right, yet still provide the building blocks for modules
with security. Namely, the true nugget from the module system is the
ability to evaluate scripts with lexical scoping, and to ensure a real
filename is associated with the script (for real stack traces,
debugging, etc.). If ES6 simply had such an API, it would be a
tremendous step forward for loading securable modules in EcmaScript,
with minimal complexity cost, no new syntax, smooth transitioning, and
the composability to build real-life module systems. When considering
the implementation cost of simple modules plus truly critical new
features like proxies, weakmaps, and so on, the complexity adds up. A
simple lexically scoped evaluation API could provide a much better
return on our investment, at least for any near-future edition.
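The kind of lexically scoped evaluation API argued for above can be roughly approximated in ES5 terms. This is purely an illustrative sketch, not anything proposed on this list: the name evalScript, its parameters, and the use of a sourceURL comment (a debugger convention, honored inconsistently) are all assumptions.

```javascript
// Illustrative only: approximates "evaluate a script with a controlled
// lexical scope and an associated filename" using the Function constructor.
function evalScript(source, scope, filename) {
  // Only the names passed in `scope` are visible as local bindings.
  var names = Object.keys(scope);
  var values = names.map(function (name) { return scope[name]; });
  // The sourceURL comment lets some debuggers attribute the code to a file.
  var body = source + "\n//# sourceURL=" + filename;
  var compiled = Function.apply(null, names.concat(body));
  return compiled.apply(null, values);
}

// Usage: the evaluated code sees `exports` and `require` as bindings.
var exports = {};
evalScript("exports.twice = function (x) { return x * 2; };",
           { exports: exports, require: function () {} },
           "twice.js");
```

This only controls which extra bindings are injected (globals still leak through, unlike a true lexically scoped eval), which is exactly why a native building block would be needed for securable modules.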

Anyway, I thought I would throw out these thoughts, if more formal
proposals on anything would help, I'd be glad to put something together.

[1] http://wiki.commonjs.org/wiki/Modules/AsynchronousDefinition

- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Re: Simple Modules and Current Modules

2010-11-04 Thread Kris Zyp



On 11/4/2010 8:02 AM, Sam Tobin-Hochstadt wrote:
 On Thu, Nov 4, 2010 at 9:22 AM, Kris Zyp k...@sitepen.com wrote:

 I believe it should be a requirement that harmony
 modules (or at least a subset of them) should be desugarable into ES5
 code, such that the desugared code is still a working module in
 harmony engines that support native modules.

 I think that this requirement would mean that modules in Harmony would
 be unable to provide all of the things we want - in particular, true
 lexical scoping of modules and the names they export and import.
For imported names, I don't understand why lexical scoping fails to
work just because a function call is involved. For exports, I thought
the key to linking was freezing objects, which is readily available in
ES5.
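As a concrete illustration of the freezing point (my own construction, not the Simple Modules proposal itself), exported names can already be linked against a frozen record in plain ES5:

```javascript
// A module desugars to a function producing a frozen exports record;
// importers then "link" by reading properties off that record.
var mathModule = (function () {
  function square(x) { return x * x; }
  var PI = 3.141592653589793;
  // Freezing makes the linkage immutable, using only ES5 facilities.
  return Object.freeze({ square: square, PI: PI });
})();

// An "import" is then just a property read from the frozen record.
var square = mathModule.square;
```

What this cannot give you, of course, is true lexical scoping of the imported names; the point is only that immutable export linkage is expressible today.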


 [lots on the integration of modules with existing systems]

 I think the area of integration with the work people are already doing
 in RequireJS, CommonJS, Dojo, and other systems is very important. As
 the Simple Modules proposal continues to evolve, Dave and I will work
 hard to make this integration and future migration as easy and
 transparent as possible.
That's great to hear. My objection below is due to the fact that I
currently don't see how this proposal brings an integration path with
real benefits to existing module systems, or provides a compelling
alternative to them (other than the security benefits of lexical
scoping and improved eval, which is awesome). If I could be
illuminated on the benefits and on how to deal with the problems I
mentioned, I am sure I would be more favorable towards it.


 But,

 At least in the area of modules, if we focused on small composable
 feature(s) that are truly needed to build module systems on top of
 EcmaScript, we can introduce a much simpler addition, and ensure that
 we get it right, yet still provide the building blocks for modules with
 security.

 I strongly disagree with this. There is no consensus whatsoever as to
 what those small composable features are.
Of course not, I just suggested it! Are you suggesting that reaching
consensus on large complicated features is easier than on small simple
features?

 Further, there is no
 precedent in other programming languages for building modules out of
 other features, especially not using runtime evaluation.

What about EcmaScript!? All current JS module systems build on
existing features.

 I think that
 trying to be way out in front of the rest of programming language
 world in this area would be a mistake.
This isn't out in front, quite the opposite, I am suggesting
conservative steps forward.

 Namely, the true nugget from the module system is the
 ability to evaluate scripts with lexical scoping, and ensure a real
 filename is associated with the script (for real stack traces,
 debugging, etc).

 I disagree that this is the true nugget. Modules are about much
 more than 'eval', and certainly much more than associating a real
 filename (hopeless in the general case anyway).

Of course there is much more. But from real-world use of today's
module systems, this is one of the key missing building blocks.

- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Re: Simple Modules and Current Modules

2010-11-04 Thread Kris Zyp



On 11/4/2010 11:13 AM, Alex Russell wrote:

 On Nov 4, 2010, at 6:22 AM, Kris Zyp wrote:


 I've been meaning to make some comments about simple modules and
 in last minute with Brendan, he noted that he wanted to work with
 CommonJS to make it really good, so... I think one of the key
 needs that seems to be unaddressed from the current simple modules
 proposal is the need to have a feasible transition from today's
 modules to native modules.

 I disagree. The design of the new system should not be
 predicated
 on what current practice looks like. We should use the entire
 available design space to construct the best module system for
 JavaScript possible. If that's incompatible with current code,
 that's OK.

I agree. Well, except that we shouldn't be blind to what has been
necessary to build real-world applications in the past; I hope you are
not suggesting that. To be clear, though, I don't have any objections
to new syntax, and I don't have any objection to taking advantage of
the entire design space. If that is incompatible, so be it. But if the
current proposal can desugar and provide a smooth transition, why
shouldn't it? So far I am not seeing the essential incompatibility.
Thanks,
Kris



Re: Single frame continuations proposal

2010-04-17 Thread Kris Zyp
. The biggest problem with the semantics you propose is that it fails
 to provide a means for resuming a continuation with a thrown error.

 I agree it's an omission, but disagree that it's the biggest problem.
It's fixable in a couple different ways. Your suggestion is one; another
is to represent a continuation as an object with both `send' and `throw'
methods, a la generators.

Yes, that sounds great to me. And actually that makes it easier to add
a function in the future for call-stack preservation.
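A minimal sketch of that continuation-object shape (my construction; the one-shot check and the thunk-based resume are assumptions layered on the discussion, not anything specified here):

```javascript
// Wrap a captured continuation `k` (modeled here as a function taking a
// thunk) in an object with generator-style send/throw methods.
function makeContinuation(k) {
  var used = false;
  function resume(thunk) {
    if (used) throw new Error("continuation already resumed"); // one-shot
    used = true;
    return k(thunk);
  }
  return {
    send: function (value) {        // resume normally with a value
      return resume(function () { return value; });
    },
    "throw": function (error) {     // resume by throwing at the capture point
      return resume(function () { throw error; });
    }
  };
}
```

Because `k` receives a thunk, the capture point can either obtain the value or have the error thrown exactly where execution resumes.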

 IMO, the biggest problem is `finally'. I have doubts whether it's
surmountable at all, at least with a form like `->' (as opposed to
something like `yield').

 2. I am opposed to the notion of appending the continuation resume
 function on to the list of arguments.

 I pretty much agree with this point. But the bigger fish to fry is the
basic control flow story.

Cool, then I think we are mostly in agreement on how -> could work.

 It still might be possible to sketch a simple semantics based on a
`yield'-like construct. But I wouldn't be surprised if it ended up
looking a lot like JS 1.7 / Python generators.

If that approach is still of interest, I'll throw out some ideas on
the other thread.

- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Re: Single frame continuations proposal

2010-04-14 Thread Kris Zyp
 ) body ~~
 function f(x1, ..., xn) {
   return new Generator(function () {
     (function f(x1, ..., xn) body).call(this, x1, ..., xn);
   });
 }

 I imagine it should be pretty easy to adapt this translation to
 work with your semantics instead, but for my sake it'd be more
 helpful to understand your semantics better first.

 Dave

 var [StopIteration, Generator] = (function() {

   var NEWBORN = 0; var DORMANT = 1; var RUNNING = 2; var CLOSED = 3;

   var StopIteration = {};

   function Yield(x) { this.value = x; }
   function Send(x) { this.sent = x; }
   function Throw(x) { this.thrown = x; }
Send and Throw are no longer needed.

   // [[function f(x1, ..., xn) body]] =
   //   function f(x1, ..., xn) {
   //     return new Generator(function() {
   //       (function f(x1, ..., xn) body).call(this, x1, ..., xn);
   //     });
   //   }
   function Generator(f) {
     this.state = NEWBORN;
     this.suspended = function(x) {
       if (x !== void(0)) throw new TypeError("newborn generator");
       // no need to bind `this' (will be this generator)
       return f.call(this);
     };
   };

   Generator.prototype = {

     // [[yield e]] = this.receive(this->yield(e))
     yield: function(x, k) {
       if ((this.state !== NEWBORN) && (this.state !== RUNNING))
         throw "yield from dormant or dead generator";
       this.state = DORMANT;
       this.suspended = k;
       return new Yield(x);
     },

     receive: function(x) {
       if (x instanceof Send) return x.sent;
       else if (x instanceof Throw) throw x.thrown;
     },

     sendOrThrow: function(x) {
       switch (this.state) {
         case RUNNING: throw "already running";
         case CLOSED: throw StopIteration;
         default:
           this.state = RUNNING;
           var result = this.suspended.call(this, x);
           // generator yielded
           if (result instanceof Yield) return result.value;
           // generator returned
           else { this.state = CLOSED; throw StopIteration; }
       }
     },

     send: function(x) { return this.sendOrThrow(new Send(x)); },
change to:

send: function(x) {
    return this.sendOrThrow(function(){
        return x;
    });
}

     next: function() { return this.send(void(0)); },

     throw: function(x) { return this.sendOrThrow(new Throw(x)); },
change to:

throw: function(x) {
    return this.sendOrThrow(function(){
        throw x;
    });
}

     close: function() {
       if (this.state === RUNNING) throw "already running";
       this.state = CLOSED;
       this.suspended = null;
     }

   };

   return [StopIteration, Generator];
 })();


Or stated in terms of translated code: if we had something like:
bar(foo->(2));
your semantics would make it roughly similar to:
foo(2, function(value){
  bar(value);
});
but with this change, I would suggest it be translated:
foo(2, function(getValue){
  bar(getValue());
});


This change enables throwing or returning normal values with minimal
effort. It would also help with providing a means for users to
preserve call stacks. There would actually need to be one more
mechanism to really do call-stack preservation with this approach: the
callback function that is executed on reentry to the continuation
would need a way to refrain from resuming the continuation. Perhaps a
special exception or return value could be provided for this.
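The value-versus-error propagation the thunk enables can be sketched as follows (my illustration; resumeWithValue/resumeWithError and makeLogger are hypothetical helper names):

```javascript
// With the thunk translation, the resumer chooses between fulfilling
// and throwing without the continuation needing a separate error channel.
function resumeWithValue(continuation, value) {
  continuation(function () { return value; });   // getValue() returns normally
}
function resumeWithError(continuation, error) {
  continuation(function () { throw error; });    // getValue() rethrows
}

// A continuation body in the style of the translated bar(getValue()):
function makeLogger(log) {
  return function (getValue) {
    try { log.push("value: " + getValue()); }
    catch (e) { log.push("error: " + e.message); }
  };
}
```

The continuation's own try/catch then behaves exactly as if the error had been thrown at the original call site.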


2. I am opposed to the notion of appending the continuation resume
function to the list of arguments. Function arguments can be variable
in length, so the callee would never know which position the
continuation function will be in, and must always look at
arguments.length to determine that. In your example, if someone were
to directly use the generator library, calling this.yield->() or
this.yield->(1,2) would cause the function to be in a different place
than the library expected. Making it the first argument is no better,
since it shifts all the other arguments. If we are going to be passing
the continuation function to the yielding callee function, let's drop
the whole notion of trying to make it look like a call, and have the
yielding call operator take a single value for the right operand
rather than an argument list. Then we could also drop the parentheses
and make it look a little nicer as well. Your translated generator-to-
harmony code would then look like:
this.yield-> expr
Now the yield function can deterministically expect to receive a value
for the first argument and the continuation function for the next
argument, and we have a cleaner syntax that avoids the woes of
expecting it to act like a normal function call that Waldemar noted.
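Under that shape, a library-level yield handler always sees a fixed signature (a sketch; the name yieldHandler is illustrative, not part of any proposal):

```javascript
// With `this.yield-> expr`, the callee always receives exactly
// (value, resume); no scanning of arguments.length is needed.
function yieldHandler(value, resume) {
  // record the yielded value and the continuation deterministically
  return { value: value, resume: resume };
}
```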

Thanks,
- -- 

Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Re: Single frame continuations proposal

2010-04-05 Thread Kris Zyp
 value.continueWith === "function"){
        var waiting = true;
        return value.continueWith(function(resume){
          if(!waiting){
            throw new Error("can't resume stack");
          }
          waiting = false;
          return handleResult(resume());
        });
      }
      else{
        return handleResult(controller.resume(function(){
          return value;
        }));
      }
    }
    return handleResult(controller.resume());
}

And the library functions defined above (Continuation, sleep, and
SimpleXHR) would work with this startCoroutine implementation and
translation.


 Do you prefer basing single-frame continuations on new non-Latin
 character syntax instead of using the yield keyword (hadn't
 realized it was already reserved in ES5 strict mode when I did
 the first proposal)?

 I don't follow you. Non-Latin?

 Dave



I just meant yield vs ->().

- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Re: Single frame continuations proposal

2010-04-04 Thread Kris Zyp


On 4/1/2010 9:45 AM, Dave Herman wrote:
 I am not exactly sure what that intent is here, but I am guessing it
 is supposed to flash several times before showing the alert.

 No, sorry that I meant window.setInterval rather than
window.setTimeout, but otherwise I think I wrote what I meant. I wanted
it to alert done setup! immediately after installing the animation,
not after the animation finishes. Similarly, I wanted setFlashing to add
itself to an array immediately after setting up the animation. Don't
worry about only doing it 10 times, just let it flash forever. :)
In order to leverage continuations with a function that executes
multiple times, we would need to eliminate the single-shot
restriction. You could then create some library function that passed
the continuation to the setInterval function, to do something like:
var toggle = true;
intervalling->(INTERVAL);
elt.style.background = ...


But in this case, using the yielding call would be confusing and
provide very little benefit. This is definitely not the type of
scenario it is designed for; normal anonymous functions work great
here. This is geared for the case where a callback is used to execute
the remainder of a sequence after the completion of an asynchronous
operation, which we see so often in JavaScript.


 This really led me to think about whether it would be possible to
 meet the goals of this proposal *and* make this fully compatible
 (syntax and default behavior) with JS 1.7 generators. That is, make
 it compatible with generators while still making it sufficiently
 extensible to meet the broad range of needs that could utilize
 single-frame continuations/coroutines. And I believe it is, and it
 could be done in a way that this proposal could be implemented in
 JavaScript, as well as generators. Do you think it would be worth
 exploring that possibility?

 I actually have already sketched such a translation, albeit using the
semantics I proposed. I'm including it below. Essentially, you have to
translate *two* things: 1) the `yield' expression form, and 2) generator
functions. The latter is necessary because generator functions don't
start executing their function body until the first `send'. So the
translation is:

 yield expr ~~ this.receive(this->yield(expr))

 and

 function f(x1, ..., xn) body ~~
 function f(x1, ..., xn) {
 return new Generator(function () {
 (function f(x1, ..., xn) body).call(this, x1, ..., xn);
 });
 }

 Actually, I think this probably gets `this' wrong; I don't have the
time ATM to look up what the semantics of `this' is within a generator
body. But that should be pretty easily fixable.

 At any rate, this is relatively clean, and I think it's a
plausibility check that single-frame continuations should be simpler
and more general than generators, and likely compatible with them.

Yes, I believe that should work.

Do you prefer basing single-frame continuations on new non-Latin
character syntax instead of using the yield keyword? (I hadn't
realized it was already reserved in ES5 strict mode when I did the
first proposal.)

- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Re: Single frame continuations proposal

2010-04-01 Thread Kris Zyp


On 3/31/2010 10:56 AM, David Herman wrote:
 Hi Kris,

 I've been poring over this for a while, and it's still really,
 really confusing. Could I ask you to show how you would write the
 following example with your proposal?

 function setup() {
   setFlashing(document.getElementById("notificationArea"));
   alert("done setup!");
 }

 var flashingElts = [];

 function setFlashing(elt) {
   var toggle = true;
   window.setTimeout(function() {
     elt.style.background = toggle ? "red" : "white";
     toggle = !toggle;
   }, INTERVAL);
   flashingElts.push(elt);
 }

I am not exactly sure what that intent is here, but I am guessing it
is supposed to flash several times before showing the alert. So maybe
we could do it with something like:

function setup() {
    setFlashing->(document.getElementById("notificationArea"));
    alert("done setup!"); // called after 10 flashes
}

function setFlashing(elt) {
    var toggle = true;
    for(var i = 0; i < 10; i++){ // toggle 10 times
        sleep->(INTERVAL); // yield for each sleep
        elt.style.background = toggle ? "red" : "white";
        toggle = !toggle;
    }
}




 And then would you mind showing a rough translation of your
 implementation to ES5 code (similar to the translation of your
 `foo' function)?
However, before getting more into the translation and implementation
of a delay function, something you said earlier...

 Your approach seems nicely Harmony-ous: small, simple, orthogonal.
 It would definitely be a cost, however, if there turned out to be
 some incompatibility with JS 1.7 generators. I don't think there is
 one, but it would be good to know-- and conversely, it'd be awesome
 if generators (minus the syntax) could really be implemented as a
 library, e.g. if yield expr could be simulated by calling:

This really led me to think about whether it would be possible to meet
the goals of this proposal *and* make this fully compatible (syntax
and default behavior) with JS 1.7 generators. That is, make it
compatible with generators while still making it sufficiently
extensible to meet the broad range of needs that could utilize
single-frame continuations/coroutines. And I believe it is, and it
could be done in a way that this proposal could be implemented in
JavaScript, as well as generators. Do you think it would be worth
exploring that possibility? I'll put a sketch together in another
email for this alternate approach (if it doesn't sound good, we can
revert to digging into this one).

 6. Resume *continuation*, using the value of *result* for
 the continued evaluation of the current expression (if the
 continuation is inside an expression).

 Is this supposed to fall through?

 Yes

 That can't be true, can it? If step 6 continues to step 7 then we
 get:

 // step 5
 if (!($result && typeof $result.continueWith === "function")) {
   // do step 6
 }
 // fall through
 var $waiting = true;
 var $continueWithResult = $result.continueWith($resume);

 But we established in step 5 that there's no $result.continueWith
 function.


You are right; once the continuation is resumed, the algorithm is
completed.

- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Single frame continuations using yield keyword with generators compatibility proposal

2010-04-01 Thread Kris Zyp
David Herman's comment about JS 1.7 generators spurred me to consider
whether we could achieve the goals stated for single-frame
continuations, as a broadly useful mechanism for replacing CPS code
with more natural flow for constructs like promises and one-time event
handlers, *and* preserve the syntax of and compatibility with JS 1.7
generators. At the risk of creating confusion around a somewhat
already complicated concept, I wanted to put forth an alternate
proposal for single-frame continuations. The tenets here are:

* Use of the yield keyword as the syntax addition for capturing
continuations
* Can default to behaving like JS 1.7 generators
* Still maintains the principles of avoiding multi-frame stack
splitting, with its interleaving hazards and VM implementation burden
* Alternate continuation handlers that control the continuation of a
function can be employed for other purposes. Unlike JS 1.7 generators,
code could have the ability to:
** Begin execution of the body of the function when it is called
** Recreate call stacks
** Exit a yielding function through a normal return (using the
return operator to return a value)

The key idea of this approach is that when a function that contains a
yield operator is called, rather than following a hard-coded
prescription to return a generator/iterator object, the call triggers
a call to the startCoroutine variable (from the current lexical
scope), which can implement various different behaviors, including
immediately executing the function and handling various values
differently, or returning an iterator. The startCoroutine function
would be called with a controller object that provides a resume
function and a suspended boolean property, which can be used by
startCoroutine to resume execution, get the result of the function
exiting through yield and return operators, and determine when the
function is complete (if it exited with a return). In addition, the
engine could define a global startCoroutine variable such that
returning iterators would be the default behavior of yielding
functions, but this could easily be overridden (globally or in the
local scope).
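Assuming the controller shape described above (a resume() function plus a suspended flag), a trivial synchronous startCoroutine handler could be sketched like this; the feed-back-each-yielded-value policy is just one illustrative choice:

```javascript
// Drives a yielding function to completion, feeding each yielded value
// straight back in as the result of its `yield` expression.
function startCoroutine(controller) {
  var result = controller.resume();            // run to first yield or return
  while (controller.suspended) {
    var value = result;                        // capture for the thunk below
    result = controller.resume(function () { return value; });
  }
  return result;                               // the function's return value
}
```

A promise-style library would instead wait for each yielded promise to resolve before calling resume() again, but the controller protocol it drives would be the same.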

This ultimately could be used in a similar fashion to the other
proposal, but with slightly different syntax (use of yield instead of
->()), like:

startCoroutine = ... some promise-style library's handler ...
showData = function(){
  var data = yield xhrGet({url: "my-data"}); // async
  if(data.additionalInformation){
    // there is additional data we need to get, load it too
    data.additionalData = yield xhrGet({url: data.additionalInformation}); // async
  }
  someElement.innerHTML = template.process(data);
};


Here is the definition, adapted from some of David Herman's
corrections of my previous algorithm (hopefully I did a little better
this time):

semantics of a function that contains the yield keyword:
1. Define a *controller* object with a suspended property set to true.
2. Set its resume property to be a function defined as *resume*.
3. Let *k* be the current function continuation (including code,
stack frame, and exception handlers).
4. Let *startCoroutine* be the result of looking up the
startCoroutine variable, starting in the current scope of the function.
5. Call *startCoroutine*, passing the *controller* object.
6. Let *returned* be the value returned from the call to
startCoroutine.
7. Return *returned* from this function.

semantics of calling *resume* with argument *v*:
1. Create a new function activation using the stack chain from the
point of capture.
2. Reinstall the function activation's exception handlers from the
point of capture (on top of the exception handlers of the rest of the
stack that has created *k*).
3. Call *v* with no arguments at the completion of the point of
capture.
4. Continue executing from the point of capture using the result
of the call to *v*.
5. If a yield operator is encountered proceed to the semantics of
yield expr.
6. If a return operator is encountered, set the *controller*
object's suspended property to false.
7. Return the value of the evaluated expression provided to the
return operator from the initiating call to *resume*.

semantics of yield expr:

1. Evaluate the expression and store the result as *result*.
2. Let *k* be the current function continuation (including code,
stack frame, and exception handlers).
3. Exit the current activation.
4. Return *result* from the initiating call to *resume*

An example of an equivalent translation, first a function that uses a
yield operator:
function foo(){
  return (yield bar()) + 2;
}

Would effectively act like the following ES5 code:
function foo(){
  var $pointer = 0; // used to keep track of where we are
  // create the controller
  var $controller = {
suspended: true,
resume: function(getValue){
  var nextValue;
  if(getValue){
nextValue = getValue();
  }
  

Single frame continuations proposal

2010-03-30 Thread Kris Zyp
 marker to indicate that it was a yield call
yield = function(value){
    // intended to be called with yielding syntax, like yield->(v);
    yieldedValue = value;
    return {
        continueWith: function(resume){
            yieldedCallback = resume;
            return yieldReturn;
        }
    };
}

generator = function(func){
    return function(){
        var args = arguments,
            self = this,
            callback, throwing;

        var executeNext = function(value){
            if(typeof value !== "undefined"){
                throw new TypeError("Can not pass a value on the initial call to a generator");
            }
            // first time, call the function
            checkReturn(func.apply(self, args));
            // all subsequent calls, we execute the next callback
            // provided by the yielding call
            executeNext = function(value){
                checkReturn(callback(function(){
                    // the value that is asynchronously returned by yield->();
                    if(throwing){
                        throwing = false;
                        throw value;
                    }
                    return value;
                }));
            }
        }
        function send(value){
            executeNext(value);
            callback = yieldedCallback;
            return yieldedValue;
        }
        function checkReturn(returnValue){
            // checks to see if the function returned (instead of calling yield)
            if(yieldReturn !== returnValue){
                throw StopIteration;
            }
        }
        // return the iterator for this generator
        return {
            send: send,
            next: function(){
                return send();
            },
            throws: function(error){
                throwing = true;
                send(error);
            }
        };
    };
};
if(typeof StopIteration === "undefined"){
    StopIteration = {};
}
})();

- -- 
Thanks,
Kris


- -- 
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Re: Single frame continuations proposal

2010-03-30 Thread Kris Zyp
 that by restricting the continuation to one-shot, it
would keep the proposal safer and more modest, and would retain the
normal invariant that statements within a function that are not in a
loop won't be executed more than once per function invocation, thus
not introducing any new coding hazards. If others don't see multi-shot
continuations as a hazard, I don't really mind if the one-shot
restriction is lifted.

2. Passing the continuation to the callee - As I mentioned before, I
think it is very important to maintain a separation of concerns
between the call arguments and continuation handling, and the call
arguments shouldn't be modified. I believe this presents a much more
consistent model to users. With my proposal, foo-(3) and foo(3) call
the foo function in exactly the same way; they only differ in how they
treat the function's return value. This also makes it possible to use
the yielding operator for sync/async normalization, allowing one to
use a yielding call on existing synchronous functions to prepare for
the possibility of a future asynchronous implementation. One can write
code that is explicitly asynchronous-ready without coupling to the
sync or async implementation of the callees. For example, if one
implemented an array list object that could potentially support
asynchronous implementations, one might write:

arrayish.push-(3)

With my implementation, this works fine with a standard push
implementation. If we modify the arguments list and add the
continuation, this call would result in the addition of two elements
to the array (3 and the continuation). Furthermore, the continuation
won't even be resumed, because a standard push() wouldn't call the
callback.
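To make the hazard concrete, here is a small sketch; the bad desugaring shown is the hypothetical one being argued against, not part of the proposal:

```javascript
// Hypothetical desugaring that appends the continuation to the argument
// list -- the approach the text argues against.
var arrayish = [];
function continuation(result){ /* would resume execution here */ }

// foo-(3) mis-desugared as foo(3, continuation):
arrayish.push(3, continuation);

var len = arrayish.length;       // 2: both value and continuation were pushed
var leaked = typeof arrayish[1]; // "function": the continuation leaked in
// and a standard push() never calls its arguments, so the continuation
// would never be resumed.
```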

3. Calling the callback/continuation (*k*/*resume*) function with the
value returned from the callee's execution of its continuation
(instead of calling a function that executes the callee's
continuation) - The primary drawback to this change is that it
eliminates the call stack. With your approach the resumed execution
basically executes in trampolining fashion. My belief is that
retaining a standard call stack is invaluable for debuggability, and
this is retained with the original algorithm.

Also, it is worth pointing out that your simplification is
insufficient to propagate errors to callers. One would need to add a
second callback function that could be called to signal an exception
to be thrown (at least in the case of the yielding operator being
called within a try block). And with that correction, I don't think
this change is really much simpler.


Also, just to clarify the behavior of NarrativeJS: it uses a much
different call procedure than either of our algorithms (which I don't
think is really suitable for a public API). Functions are called with
an extra argument that is a special frame object (I don't think you can
call it directly), and the called function can return a normal value
(hence supporting calling synchronous functions that can ignore the
extra frame object parameter and properly continuing) or a special
suspension value. It also recreates call stacks to preserve
debuggability.

Anyway, thanks for the corrections and compelling suggestions.




Re: Revisiting Decimal

2009-01-15 Thread Kris Zyp
Unfortunately, I don't have enough time to continue the point-by-point
discussion. If the group feels typeof 1.1m -> "decimal", then so be
it; we can certainly handle that. My point was to show empirical
evidence that could hopefully be considered in the decision process.

As far as JSON goes, Dojo will encode decimals to numbers; there is
really no coherent alternative (encoding to strings would be even more
bizarre, and I can't think of any other option).
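The interoperability concern is concrete: a decoder that parses JSON numbers into IEEE doubles cannot round-trip decimal values exactly. A small sketch, using the modern built-in JSON object purely for illustration:

```javascript
// A JSON document carrying a decimal quantity:
var doc = '{"price": 0.1}';

// A double-based decoder (like json2.js or the built-in JSON object)
// yields the nearest binary double, not the exact decimal 0.1:
var price = JSON.parse(doc).price;

var sum = price + 0.2;      // classic binary rounding artifact
var exact = (sum === 0.3);  // false
var shown = String(sum);    // "0.30000000000000004"
```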

Kris

Brendan Eich wrote:
 On Jan 14, 2009, at 9:32 PM, Kris Zyp wrote:

 Of course, there is no decimal support in ES3, there is no other
 option.

 This is not strictly true:

 http://code.google.com/p/gwt-math/source/browse/trunk/gwt-math/js_originals/bigdecimal.js



 The point is that JSON peers that do math on numbers, to
 interoperate in general, need to parse and stringify to the same
 number type. It may be ok if only ints that fit in a double are
 used by a particular application or widget, but the syntax allows
 for fraction and exponent, which begs representation-type precision
 and radix questions.

 One of the major incentives for JSON is interoperability
 between languages. If other implementations in other languages
 treat JSON's number as decimal, then the assertion that I
 understood you were making (that JSON numbers are universally
 expected to be treated as binary) is not true.

 It's probably a mix, with application-dependent restrictions on
 domain and/or computation so that using either double or decimal
 works, or else buggy lack of such restrictions.


 JSON's numbers are decimal, languages that support decimals
 agree. Dojo _will_ convert JS decimals to JSON numbers
 regardless of what path ES-Harmony takes with typeof, whether
 it requires a code change or not.

 That will break interoperability between current
 implementations that use doubles not decimals.

 How so? And how did all the implementations that use decimals to
 interpret JSON numbers not break interoperability?

 Not necessarily. But correctness is not a matter of hopes or
 percentages. It may be fine for JSON to leave it to the app to
 choose number type and/or operations done on the data. But  some
 layer has to care. Some apps probably already depend on json2.js
 and json.js and the like (ES3.1's JSON built-in) using double, not
 decimal. Changing a future JSON codec to use decimal instead of
 double is not a backward-compatible change.


 So you are suggesting that we shouldn't let users pass a mix of
 decimals and numbers even if they explicitly attempt to do so?

 No, I'm suggesting unintended mixed-mode bugs will be common if we
 make typeof 1.1m == "number".


 It's not beside my point. If significantly more real world code
 will break due to violating the expected invariant of a constant
 finite set of typeof results (and the expectation that numbers
 regardless of precision will be typeof "number") than those
 that break due to violating the expected invariant of
 typeof x == typeof y => (x == y <=> x === y)

 We can't measure this, realistically, but again: the breakage from
 a new typeof result is not dependent on the numeric value of the
 operand, and entails either a missing case, or a possibly
 insufficient default case, while the breakage from your proposal is
  subtly data-dependent.

 Plus, the invariant (while not holy writ) is an important property
 of JS to conserve, all else equal.


 than I think we would be negligent as language designers to
 ignore that consideration.

 It's not a consideration if it can't be quantified, and if it
 introduces value-dependent numeric bugs. Decimal and double are
 different enough that typeof should tell the truth. 1.1m != 1.1,
 1.2m != 1.2, but 1.5m == 1.5.


 I understand the logical concerns, but I would love to see real
 empirical evidence that contradicts my suspicions.

 I gave some already, you didn't reply. Here's one, about
 dojotoolkit/dojo/parser.js:

 But if typeof 1.1m == "number", then str2obj around line 52 might
 incorrectly call Number on a decimal string literal that does not
 convert to double (which Number must do, for backward
 compatibility), 

 It won't do to assume your proposal saves effort and demand me to
 prove you wrong. First, no one has access to all the extant typeof
 x == "number" code to do the analysis and prove the majority of
 such code would work with your proposal. This is akin to proving
 a negative. Second, I've given evidence based on Dojo that shows
 incompatibility if typeof 1.1m == "number".

 How about we talk about an alternative: use decimal as a way to
 make all literals, operators, and built-ins decimal never double?

 The problem with this big red switch is that it requires
 conversion from outside the lexical scope in which the pragma is
 enabled, since code outside could easily pass double data into
 functions or variables in the pragma's scope. It requires a
 decimal-based suite of Math, etc., built

Re: Revisiting Decimal

2009-01-15 Thread Kris Zyp


Bob Ippolito wrote:
 On Thu, Jan 15, 2009 at 5:49 AM, Kris Zyp k...@sitepen.com wrote:




 Bob Ippolito wrote:
 On Wed, Jan 14, 2009 at 9:32 PM, Kris Zyp k...@sitepen.com
 wrote:




 Brendan Eich wrote:
 On Jan 14, 2009, at 7:38 PM, Kris Zyp wrote:
 You need to change this in any case, since even
 though the JSON
 RFC allows arbitrary precision decimal literals,
 real-world decoders only decode into IEEE doubles.
 You'd have to encode decimals as strings and decode
 them using domain-specific (JSON schema based) type
 knowledge.
 No, every Java JSON library I have seen

 You've seen http://www.json.org/json2.js It and the
 json.js alternative JS implementation are popular.
 json2.js contains String.prototype.toJSON =
 Number.prototype.toJSON = Boolean.prototype.toJSON =
 function (key) { return this.valueOf(); };
 Of course, there is no decimal support in ES3, there is no
 other option.
 parses (at least some, if not all) numbers to Java's
 BigDecimal.

 JSON has nothing to do wth Java, and most implementations
 do not have Java BigDecimal, so I don't know how it can
 be relevant.
 One of the major incentives for JSON is interoperability
 between languages. If other implementations in other languages
 treat JSON's number as decimal, then the assertion that I
 understood you were making (that JSON numbers are universally
 expected to be treated as binary) is not true.
 JSON's numbers are decimal, languages that support decimals
 agree. Dojo _will_ convert JS decimals to JSON numbers
 regardless of what path ES-Harmony takes with typeof,
 whether it requires a code change or not.

 That will break interoperability between current
 implementations that use doubles not decimals.
 How so? And how did all the implementations that use decimals
 to interpret JSON numbers not break interoperability?
 I'm pretty sure that interoperability is broken when they do
 this, it's just very subtle and hard to debug. I have the same
 stance as Brendan here, I've even refused to implement the
 capability to directly encode decimal as JSON numbers in my
 simplejson package (the de facto json for Python). If a user of
 the library controls both ends of the wire, they can just as
 easily use strings to represent decimals and work with them
 exactly how they expect on both ends of the wire regardless of
 what their JSON implementation happens to do.

 Imagine the person at the other end of the wire is using
 something like JavaScript or PHP. If the message contains
 decimals as JSON numbers they can not accurately encode or
 decode those messages unless they write their own custom JSON
 implementation. How do they even KNOW if the document is
 supposed to have decimal precision? What if the other end
 passes too many digits (often the case if one side is actually
 using doubles)? If they are passed around as strings then
 everyone can use the document just fine without any
 compatibility issues. The lack of a de jure number precision
 and the lack of a date/datetime type are definitely my biggest
 grievances with the JSON spec.
 Specifying number representations would be far more grievous in
 terms of creating tight-couplings with JSON data. It is essential
 that implementations are free to use whatever number
 representation they desire in order to facilitate a loose coupled
 interchange.


 For decimals, I definitely disagree here. In languages that support
  both float and decimal, it's confusing at best. You can only
 decode as one or the other, and if you try and do any math
 afterwards with the wrong type it will explode. In Python's case
 anyway, you can't even convert a float directly to a decimal
 without explicitly going through string first. simplejson raises an
 exception when you try and encode a decimal unless you tell it
 differently, it makes you decide how they should get represented.

 In simplejson it's trivial to transcode decimal to float (or string
 or anything else) during encoding, or to get all numbers back as
 decimal... but you have to do it explicitly. Loosely coupled
 doesn't have to mean lossy.

 -bob

Where is the loss coming from? JSON isn't doing any computations or
coercions, and ES would only be experiencing a loss when serializing
binary floats to JSON, but not with decimals. Decoders should be
allowed to be explicit and have control over how they choose to
internally represent the numbers they receive from JSON. Decimals in
string format don't change that fact, and are far more confusing.
Kris



Re: obsoleting the new keyword

2009-01-14 Thread Kris Zyp
I certainly hope not. A fundamental requirement of class sugar should
be that it properly leverages new.
Kris

Peter Michaux wrote:
 The requirement that JavaScript needed to look like Java has long
 been lamented. One of the key looks was the new keyword.  Many
 people don't like the use of the new keyword. Although new is
 here to stay, could we obsolete it when using a class sugar?

 Peter ___ Es-discuss
 mailing list Es-discuss@mozilla.org
 https://mail.mozilla.org/listinfo/es-discuss





Re: Revisiting Decimal

2009-01-14 Thread Kris Zyp


Brendan Eich wrote:
 On Jan 9, 2009, at 3:08 PM, Kris Zyp wrote:



 The counter-argument is strong:

 typeof x == typeof y => (x == y <=> x === y)

 but 1.1 != 1.1m for fundamental reasons.
 I understand the counter-argument, but with such an overwhelming
 number of typeof uses having far easier migration with "number",

 Migration how? You'll have to change something to use decimal or
 s/1.1/1.1m/. Only once you do that can you be sure about all
 operands being decimal.

And I am sure our users will do that and pass decimals into our
library functions.
 I'm assuming it would be bad in the Dojo code you've looked at if
 1.1 came in from some standard library that returns doubles, and was
 tested against 1.1m via == or === with false result, where previous
 to decimal being added, the result would be true.
I am not aware of any situations in the Dojo codebase where this would
cause a problem. I can't think of any place where we use an
equivalence test and users would expect that decimal behave in the
same way as a double. Do you have any expected pitfalls that I could
look for in Dojo?



 I can't possibly see how the desire to preserve this property is more
 important than better usability for the majority use cases.

 You really need to show some of these use cases from Dojo. I have a
 hard time believing you've ruled out mixed-mode accidents.
Ok, sounds good, I will be glad to be corrected if I am misunderstanding
this. Here are some of the places where I believe we would probably
add extra code to handle the case of typeof checks where decimal
values may have been passed in by users, and we would want the
behavior to be the same as a number:
As I have mentioned before, we would need to change our JSON
serializer to handle decimal:
http://archive.dojotoolkit.org/nightly/dojotoolkit/dojo/_base/json.js
(line 118)
Our parser function would need to add support for decimal
http://archive.dojotoolkit.org/nightly/dojotoolkit/dojo/parser.js
(line 32)
Matrix math handling for our graphics module:
http://archive.dojotoolkit.org/nightly/dojotoolkit/dojox/gfx/matrix.js
(line 88 is one example)
Actually there are numerous situations in the graphics packages where
a decimal should be acceptable for defining coordinates, scaling, etc.:
http://archive.dojotoolkit.org/nightly/dojotoolkit/dojox/gfx/
Charting also has a number of places where decimals should be an
acceptable form of a number:
http://archive.dojotoolkit.org/nightly/dojotoolkit/dojox/charting/
For example:
http://archive.dojotoolkit.org/nightly/dojotoolkit/dojox/charting/action2d/Magnify.js
(line 22)

Again, I understand there are difficulties with typeof 1.1m returning
"number", but in practice it seems we would experience far more pain
with decimal.




Re: Revisiting Decimal

2009-01-14 Thread Kris Zyp


 You need to change this in any case, since even though the JSON
 RFC allows arbitrary precision decimal literals, real-world
 decoders only decode into IEEE doubles. You'd have to encode
 decimals as strings and decode them using domain-specific (JSON
 schema based) type knowledge.
No, every Java JSON library I have seen parses (at least some, if not
all) numbers to Java's BigDecimal. JSON's numbers are decimal,
languages that support decimals agree. Dojo _will_ convert JS
decimals to JSON numbers regardless of what path ES-Harmony takes
with typeof, whether it requires a code change or not.

 Our parser function would need to add support for decimal
 http://archive.dojotoolkit.org/nightly/dojotoolkit/dojo/parser.js
 (line 32)

 You're right, this parser would need to be extended. But if
 typeof
 1.1m == "number", then str2ob around line 52 might incorrectly call
  Number on a decimal string literal that does not convert to double
  (which Number must do, for backward compatibility), or else return
 a double NaN (not the same as a decimal NaN, although it's hard to
 tell -- maybe impossible?).

 It seems to me you are assuming that decimal and double convert
 to
 and from string equivalently. This is false.


 Actually there are numerous situations in the graphics packages
 where a decimal should be acceptable for defining coordinates,
 scaling, etc.:
 http://archive.dojotoolkit.org/nightly/dojotoolkit/dojox/gfx/

 Only if never compared to a double. How do you prevent this?
We already agree that the decimal-double comparison will always be
false. The point is that this is representative of real world code
that benefits more from the treatment of decimals as numbers.



 Charting also has a number of places where decimals should be an
 acceptable form of a number:
 http://archive.dojotoolkit.org/nightly/dojotoolkit/dojox/charting/
 For example:
 http://archive.dojotoolkit.org/nightly/dojotoolkit/dojox/charting/action2d/Magnify.js
  (line 22)

 I will look at these later as time allows, pending replies on
 above points.


 Again, I understand there are difficulties with typeof 1.1m
 returning "number", but in practice it seems we would experience
 far more pain with decimal.

 Trouble for you Dojo maintainers but savings for users. You may
 have to do a bit more work to avoid imposing bugs on your users.
 That's life in the big Dojo city.
If that's true, that's fine, I have no problem with Dojo feeling the
pain for the sake of others, but I still find it very surprising that
Dojo code would be so misrepresentative of real code out there today.
Dojo covers a very broad swath of topics. Do you really think real
world JS is that much different than Dojo's?
Kris



 /be








Re: Revisiting Decimal

2009-01-14 Thread Kris Zyp


Brendan Eich wrote:
 On Jan 14, 2009, at 7:38 PM, Kris Zyp wrote:


 You need to change this in any case, since even though the
 JSON
 RFC allows arbitrary precision decimal literals, real-world
 decoders only decode into IEEE doubles. You'd have to encode
 decimals as strings and decode them using domain-specific
 (JSON schema based) type knowledge.
 No, every Java JSON library I have seen

 You've seen

 http://www.json.org/json2.js

 It and the json.js alternative JS implementation are popular.
 json2.js contains

 String.prototype.toJSON = Number.prototype.toJSON =
 Boolean.prototype.toJSON = function (key) { return
 this.valueOf(); };


Of course, there is no decimal support in ES3, there is no other option.
 parses (at least some, if not all) numbers to Java's BigDecimal.

 JSON has nothing to do wth Java, and most implementations do not
 have Java BigDecimal, so I don't know how it can be relevant.

One of the major incentives for JSON is interoperability between
languages. If other implementations in other languages treat JSON's
number as decimal, then the assertion that I understood you were
making (that JSON numbers are universally expected to be treated as
binary) is not true.
 JSON's numbers are decimal, languages that support decimals agree.
 Dojo _will_ convert JS decimals to JSON numbers regardless of what
 path ES-Harmony takes with typeof, whether it requires a code
 change or not.

 That will break interoperability between current
 implementations that use doubles not decimals.

How so? And how did all the implementations that use decimals to
interpret JSON numbers not break interoperability?

 It's not a question of more or less. If you let decimals and
 numbers mix, you'll get data-dependent, hard to diagnose bugs. If
 you do not, then you won't (and Dojo maintainers will have to
 work a bit to extend their code to handle decimals -- which is
 the right trade. Recall Mr. Spock's dying words from STII:TWoK
 :-).

So you are suggesting that we shouldn't let users pass a mix of decimals
and numbers even if they explicitly attempt to do so?


 If that's true, that's fine, I have no problem with Dojo feeling
 the pain for the sake of others, but I still find it very
 surprising that Dojo code would be so misrepresentative of real
 code out there today.

 It's not necessarily representative. It's not necessarily
 mis-representative. But we need to agree on how decimal as
 proposed compares to number (double) first, since from what you
 wrote above I see misunderstanding.


 Dojo covers a very broad swath of topics. Do you really think real
 world JS is that much different than Dojo's?

 I have no idea, but this is completely beside the point. Breaking
 typeof x == typeof y => (x == y <=> x === y) for decimal will
 break existing code in data-dependent, hard to diagnose ways.

 Adding a new typeof code will not depend on the value of a given
 decimal: any decimal will cause control to fall into an else,
 default, or unhandled case, which is strictly easier to debug and
  fix. Plus, any future JS standard with Decimal will be a big
 enough deal that porting will be obligatory and understood, by
 the time browsers adopt decimal.
It's not beside my point. If significantly more real-world code will
break due to violating the expected invariant of a constant finite set
of typeof results (and the expectation that numbers regardless of
precision will be typeof "number") than will break due to violating
the expected invariant of typeof x == typeof y => (x == y <=> x === y),
then I think we would be negligent as language designers to ignore
that consideration. I understand the logical concerns, but I would
love to see real empirical evidence that contradicts my suspicions.
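The invariant at stake can be illustrated with the types JavaScript already has; since no shipping engine implements decimal literals like 1.1m, string vs. number stands in for decimal vs. double in this sketch:

```javascript
// When typeofs differ, == may coerce while === does not:
var n = 1, s = "1";
var sameType = (typeof n === typeof s);  // false ("number" vs "string")
var loose = (n == s);                    // true  (== coerces)
var strict = (n === s);                  // false

// When typeofs agree, == and === must agree too -- the invariant
// typeof x == typeof y => (x == y <=> x === y):
var a = 1, b = 1;
var invariantHolds = ((a == b) === (a === b));  // true
```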

Kris



Re: Revisiting Decimal

2009-01-09 Thread Kris Zyp
What is the current state of the result of typeof on decimals? Was
there consensus on this? I hope we will be using typeof 1.1m ->
"number". For a little bit of empirical evidence, I went through
Dojo's codebase and there are numerous places where we would probably
want to alter our code to include additional checks for decimal if
typeof 1.1m -> "decimal", whereas if it is "number" we would probably
leave virtually everything intact in regards to number handling, with
consideration for decimals.
Thanks,
Kris

Brendan Eich wrote:
 Sam's mail cited below has gone without a reply for over a month.
 Decimal is surely not a high priority, but this message deserves
 some kind of response or we'll have to reconstruct the state of the
  argument later, at probably higher cost.

 I was not at the Redmond meeting, but I would like to take Sam's
 word that the cohort/toString issue was settled there. I heard
 from Rob Sayre something to this effect.

 But in case we don't have consensus, could any of you guys state
 the problem for the benefit of everyone on this list? Sorry if this
  seems redundant. It will help, I'm convinced (compared to no
 responses and likely differing views of what the problem is, or
 what the consensus was, followed months later by even more painful
 reconstruction of the state of the argument).

 The wrapper vs. primitive issue remains, I believe everyone agrees.


 /be

 On Dec 4, 2008, at 2:22 PM, Sam Ruby wrote:

 2008/12/4 Brendan Eich bren...@mozilla.com:

 Sam pointed that out too, and directed everyone to his
 test-implementation results page:
 http://intertwingly.net/stories/2008/09/20/estest.html Indeed
 we still have an open issue there ignoring the wrapper one:

 [Sam wrote:] I think the only major outstanding semantic issue
 was wrapper objects; apart from that, the devil was in the
 detail of spec wording.[End Sam]

 No, the cohort/toString issue remains too (at least).

 With a longer schedule, I would like to revisit that; but as of
 Redmond, we had consensus on what that would look like in the
 context of a 3.1 edition.

 From where I sit, I find myself in the frankly surreal position
 that we are in early December, and there are no known issues of
 consensus, though I respect that David-Sarah claims that there is
 one on wrappers, and I await his providing of more detail.

 /be

 - Sam Ruby

 ___ Es-discuss mailing
 list Es-discuss@mozilla.org
 https://mail.mozilla.org/listinfo/es-discuss





Re: Revisiting Decimal

2009-01-09 Thread Kris Zyp


 The counter-argument is strong:

 typeof x == typeof y => (x == y <=> x === y)

 but 1.1 != 1.1m for fundamental reasons.
I understand the counter-argument, but with such an overwhelming
number of typeof uses having far easier migration with "number", I
can't possibly see how the desire to preserve this property is more
important than better usability for the majority use cases. Do you
think other libraries and JS code are that vastly different than Dojo?
Thanks,
Kris




strawman: Harmonious Classes and Typing

2008-11-26 Thread Kris Zyp
 on the web) and test
cycle that is so slow.

I could also include a JavaScript/pseudocode implementation to provide
a more detailed description of the mechanics of this class system.

Thanks,
Kris



Kris Zyp wrote:
 

 - Original Message -
 *From:* Kris Zyp mailto:[EMAIL PROTECTED]
 *To:* es4-discuss Discuss mailto:[EMAIL PROTECTED] ;
 [EMAIL PROTECTED] mailto:[EMAIL PROTECTED]
 *Sent:* Friday, July 25, 2008 9:01 AM
 *Subject:* Typing with schemas instead of annotations

 I wanted to propose an alternate approach to defining types
 within ES. I don't think it is actually a realistic proposal for
 changing ES4 or ES3.1, but more of an interesting alternate
 language mechanism for utilizing the ES4 VM with ES3 syntax that
 I have been interested in exploring for a while now, and wanted
 to write out in case anyone was interested in the idea. The
 basic premise is to define Classes, records, and parameter types
 with a schema object that can easily be defined with ES3/JSON,
 rather than using ES4 type annotation syntax. Type information
 would be defined with a schema that would act like an interface,
 and this could be used in combination with local type inference
 for local variables. My examples and proposal are based on using
 JSON schema[1] for defining types; my involvement in JSON Schema
 might preclude objectivity to some degree, but I do think it is
 the most reasonable ES3 compatible definition for types.
 Expressed in ES3, it is a simple object structure, with a high
 degree of correspondence to ES4 types. However, though the
 proposal is more about using a external type definition/schema
 even if an alternate format (like Cerny [2]) might be better.
 
 This approach could have application even without actual native
 support. Development could potentially use this approach to have
 a codebase that could easily be adapted (possibly through
 automated transformation, or type omission) to various target ES
 VMs. Also this approach has nothing to do with classes as
 sugar proposal [3]; it could actually be considered the
 inverse. Rather than attempting ES4 class syntax with ES3(.1)
 semantics, it is about using ES3 syntax to drive ES4
 class/record semantics. This proposal is also not complete in
 its ability to define ES4 semantics with ES3 syntax. There are
 definitely plenty of typing forms that this approach can't
 define, but I would suggest that the majority of type
 definitions can be defined or inferred with this approach.
 
 Motivation
 
 1. The first motivation is that the code could run in ES3 or ES4
 VMs. Of course the ES3 VM doesn't have native typing support, so
 it would either have to go without typing, or do code translation.
 
 2. Separation of behavior and typing/structure concerns.
 a. One can look at the schema for the structure and
 interface of a given Class or data structure separately from
 looking at the code for a nice clean, minimal (no annotations)
 look at the implementation and behavior.
 b. One can easily include or exclude typing information for
 code. Certainly the biggest benefits of the ES4 type system are
 at development time, with its integrity, correctness,
 organization, and code completion assistance. Once in
 production, the majority of web applications spend far more time
 downloading JavaScript than they do executing it (especially
 after DOM deductions). Applications may be becoming more
 sophisticated, but sophisticated apps still increase download
 time, and with ES VMs quickly improving, plus hardware
 improving at a faster rate than bandwidth widens,
 I think download times will continue to dominate execution times
 for quite a while. Consequently, it seems most likely that
 performance-savvy developers will strip the majority
 of type annotations off of code during compression (perhaps
 retaining some annotations for performance sensitive hot-spots
 if VMs prove to benefit from typing information, or retaining it
 in situations where correct execution depends on type errors,
 rather than only signalling incorrect execution).
 c. Existing applications could be retrofitted with type
 information without modifying the original code (or minimally
 modifying). Since type information is provided through a
 separate schema, the original code can be kept intact.
 
 3. Class reflection has an obvious reification based on the
 schema. Using JSON Schema, type information is reflected as the
 schema object itself.
 
 4. Language neutral interfaces - JSON Schema has been designed
 to be language agnostic (I realize that might be a little
 wistful, JSON Schema bears influence from the primitives of JSON

Re: strawman:names

2008-11-19 Thread Kris Zyp
 
Arg, I responded to wrong email :/, I meant to reply to the one on
catchalls, that is the one I think is a very valuable feature.
Kris

Kris Zyp wrote:
 +1, I think this was one of the most valuable features from ES4.
 Kris

 Peter Michaux wrote:
 The strawman:names wiki page

 http://wiki.ecmascript.org/doku.php?id=strawman:names

 starts with

 It is not possible to create hidden properties in an object when
  property names can only be strings.

 Why can't a string-named property be private? I understand they
 aren't now but why couldn't the private keyword make a
 string-named property private?

 

 At the bottom of the same page

 Should we allow new Name('blah') for better printed
 representation? (Similar to (gensym 'blah) in Scheme)

 If there is a new Name constructor as mentioned higher in the
 wiki page, then why wouldn't new Name('blah') be allowed?

 Since (gensym) returns a symbol, the above quotation makes it
 seem that the proposed Name objects are like Scheme symbols,
 correct? If this is the case, why not just call them Symbol
 objects? There is precedence for symbol in many languages and
 symbol is far less overloaded in the programming world than
 name.

 Peter



___
Es-discuss mailing list
Es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss


--
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com



Re: In what ways does the following eval regularity break?

2008-10-30 Thread Kris Zyp
 
FWIW, a little empirical data:

 I think we should even consider banning local eval from strict
 mode.

 In light of the examples I gave above, and more that are pretty
 easy to find, I think this would make strict mode unused in
 practice.

In Dojo, eval is used with no need for local scope about 20 times, and
about 3 times needing local scope. Interestingly enough all three
times eval used local scope, it was in code that I have written (and I
have written a pretty small percentage of Dojo)... I could have missed
other cases where eval needed local scope, but it looked like all the
other cases really just wanted global scope. The 3 cases certainly benefited
from local scope, and I could describe them if desired. It would be
possible to rewrite the code to not use local scope in the eval, but
it would be a pain.
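For reference, a small sketch of the local- versus global-scope distinction at issue (using ES5 direct/indirect eval semantics; the function and variable names are illustrative only):

```javascript
// Direct eval sees the calling function's local scope; an indirect
// call (through an alias) evaluates in global scope under ES5 semantics.
globalThis.x = "global";

function usesLocalScope() {
    var x = "local";
    return eval("x");       // direct call: resolves the local x
}

function usesGlobalScope() {
    var x = "local";
    var geval = eval;       // aliasing makes the call indirect
    return geval("x");      // resolves the global x
}

console.log(usesLocalScope());  // "local"
console.log(usesGlobalScope()); // "global"
```

Banning local eval from strict mode would remove the first form; the roughly 20 global-scope uses in Dojo could switch to the indirect form, but the 3 local-scope uses could not.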
Kris



Re: Decimal comparisons

2008-09-19 Thread Kris Zyp
 +1 for typeof 1m === 'number'. As an example of breakage, I believe
 Crockford's current version of his JSON library would not do as I would
 desire with decimals:

 JSON.stringify({foo:1m}) -> '{"foo":undefined}'

 Why is that worse than producing '{"foo":1}'?

The fact that you know what I was expecting without me saying seems like 
good evidence.

 Consider 1.1m instead  of 1m.

Would be:
'{"foo":1.1}'


 JSON does not provide for decimal, and receiver-makes-it-wrong is a  bug. 
 JSON would need to be extended to handle decimal reliably.

No, JSON only provides for decimal, JSON uses radix 10 to encode numbers, it 
has no support for binary. JSON doesn't need to be extended (unless you want 
to see binary added, but I don't think that would be popular). JSON makes no 
assertions about what format a receiver must use to store the numbers it 
receives, it transfers numbers in decimal format, and implementations can 
and should be able to use any format they desire to internally represent it 
based on their needs and expectations. Today implementations do vary in 
which format they choose to use, but there is no restriction preventing 
implementations from using decimal formats for de-serialization of numbers, 
and it seems like it would make sense for JavaScript to do so when decimals 
become available.
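A small sketch of the point (illustrative only; `1m` decimal literals did not exist, so a digit string stands in for a decimal representation, and `JSON.parse` stands in for any receiver):

```javascript
// JSON carries numbers as decimal digit text on the wire; the
// receiver decides the in-memory representation.
var wire = '{"price": 1.1}';

// Default receiver choice: a binary double, which can only approximate 1.1.
var asDouble = JSON.parse(wire).price;

// A receiver wanting exact decimals could keep the digits themselves
// (a stand-in here for a real decimal type).
var asDecimalText = wire.match(/"price":\s*([0-9.]+)/)[1];

console.log(asDouble * 3);   // binary rounding error appears
console.log(asDecimalText);  // "1.1" -- exact decimal digits preserved
```

Nothing in the wire format forced the binary choice; a decimal-aware receiver loses nothing.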

Kris 



Re: Look Ma, no this (was: ECMAScript Harmony)

2008-08-24 Thread Kris Zyp
I don't see why it is better to have classes be sugar for closured object 
literals rather than prototype based structures. It seems the benefits that 
are being sought after are auto-binding on method extraction and private 
method creation. Both of these can be attained with prototype-based class 
desugaring. If you really want auto-binding on method extraction (which 
IMO is of very minimal value), why not do it this way and keep class 
semantics much closer to the prototype techniques used in JavaScript today:

(function(){
// desugared Point class
Point = function(){};
function privateMethod(){
}
Point.prototype = {
get getX(){
return function(){
return this.x;
}.bind(this);
},
get getY(){
...
},
get publicMethod(){
// example public method calling an example private method
return function(){
// sugared version would be this.privateMethod(1,2)
privateMethod.call(this,1,2);
}.bind(this);
}
};
})();

However, I would prefer |this| instanceof checking instead of auto-binding on
extraction. Rather than binding on extraction, class methods would always 
have an implicit check to make sure they were called with an object that was 
an instance of (instanceof) the class:
Point = function(){};
Point.prototype = {
getX : function(){
if (!(this instanceof Point)){
throw new TypeError("Method called with incompatible object
instance");
}
return this.x;
},
...
};

By using type checking instead of binding on extraction, this would allow
functions that receive a function/method as a parameter to apply the function
to an appropriate compatible object instance with apply or call:
function showCoordinateOnSelect(coordinateGetter){
table.onclick = function(event){
var point = getPointForClick(event);
alert(coordinateGetter.call(point));
}
}
...
showCoordinateOnSelect(Point.prototype.getX);

By using prototype based semantics for future class sugar, we can
retain the same semantics used by vast numbers of JavaScript libraries and
programmers themselves. VMs can just as easily apply clever optimizations
for these structures, and existing class constructs will have a smooth
migration to sugared syntax.

Thanks,
Kris

 On Tue, Aug 19, 2008 at 9:15 PM, Kris Zyp [EMAIL PROTECTED] wrote:
 Why do you freeze the functions? Is this just to cater to mutable function
  critics, or is there actually a reason tied to class semantics?

 It is to cater to mutable function critics, since they're right ;).

That may be, but it seems like an orthogonal feature, seems like it should
be discussed separately to avoid confusing the goal of basic class
desugaring.

Thanks,
Kris

- Original Message - 
From: Mark S. Miller [EMAIL PROTECTED]
To: Peter Michaux [EMAIL PROTECTED]
Cc: Brendan Eich [EMAIL PROTECTED]; [EMAIL PROTECTED]; TC39
[EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Tuesday, August 19, 2008 6:41 PM
Subject: Look Ma, no this (was: ECMAScript Harmony)


 On Wed, Aug 13, 2008 at 7:15 PM, Peter Michaux [EMAIL PROTECTED]
 wrote:
 On Wed, Aug 13, 2008 at 2:26 PM, Brendan Eich [EMAIL PROTECTED]
 wrote:

 [snip]

 We talked about desugaring classes in some detail in Oslo. During
 these exchanges, we discussed several separable issues, including
 classes, inheritance, like patterns, and type annotations. I'll avoid
 writing more here,

 Is there more to read elsewhere? I'd like to know concretely what
 desugaring classes means.

 The main difference from the old Classes as Sugar proposal is to
 desugar to the objects-as-closure style pioneered by Crock rather than
 ES3-classical style of prototypical inheritance + this-binding.


 Point as a final root class:

 function Point(x, y) {
const self = Object.create(Point.prototype, {
toString: {value: Object.freeze(function() ('<' + self.getX()
 + ',' + self.getY() + '>')),
enumerable: true},
getX: {value: Object.freeze(function() x),
enumerable: true},
getY: {value: Object.freeze(function() y),
enumerable: true}
}, true);
return self;
 }

 (Assuming that absent attributes default to false, which I don't think
 is currently the case in the ES3.1 draft.)

 If we stick with zero inheritance, which seemed attractive at Oslo, we
 can skip the part about inheritance below. Otherwise, read on.


 inheritance


 Point as a non-final non-abstract root/mixin class where toString is a
 final method:

 function PointMixin(self, x, y) {
Object.defineProperties(self, {
toString: {value: Object.freeze(function() ('<' + self.getX()
 + ',' + self.getY() + '>')),
enumerable: true},
getX: {value: Object.freeze(function() x),
enumerable: true, flexible: true},
getY: {value: Object.freeze(function() y),
enumerable: true