OK, I think I understand how your needs differ from the async/relaxed-await idea. You need a way to:
- extract values from deeply nested yields, which is not allowed by await
  expressions,
- resume execution whenever you choose, which is a decision await
  expressions make for you, i.e. they resume whenever the Promise is
  resolved/rejected, and
- execute everything synchronously (if desired).

Perhaps if there were a way to wrap any arbitrary expression with a
generator that captured any yielded values and allowed resumption by
calling .next(), then you could accomplish this without inventing new
try-catch syntax?

On Tue, Mar 15, 2016 at 8:27 PM Sebastian Markbåge <[email protected]> wrote:

> async functions only address the async use case, and they're all
> scheduled on a simple micro-task queue. We need fine-grained control
> over scheduling. Perhaps Zones can help a bit with that, but that's
> just one of several concepts that need this.
>
> It doesn't solve other, more general generator use cases. You could
> potentially expand it to generators as well.
>
> However, then you'd also need to solve the nested-handlers case
> efficiently. That's my use case. What if you have a layout handler, in
> an iteration handler, in an async scheduler handler?
>
> The async functions could be implemented in terms of this, though,
> since they're just a more specific and locked-down version.
>
> On Tue, Mar 15, 2016 at 5:16 PM, Ben Newman <[email protected]> wrote:
>
>> What if we simply allowed await expressions anywhere in the call stack
>> of an async function, rather than only in the bodies of async
>> functions? That would give us all the power of "yielding deep in a
>> fiber" with a much more familiar syntax, and an easy way to capture
>> effects, since an async function returns a Promise that allows for
>> asynchronous error handling. No new catch effect … syntax needed.
>>
>> I've talked about this before
>> <http://benjamn.github.io/goto2015-talk/#/17>, and it's my
>> understanding that TC39 needs a lot of convincing that coroutines are
>> a good idea.
>> We haven't really discussed the topic in the context of async
>> functions, which are soon to be an official part of the language, so
>> perhaps the time for discussing coroutines/continuations/etc. is
>> drawing near!
>>
>> On Tue, Mar 15, 2016 at 7:44 PM Sebastian Markbåge <[email protected]> wrote:
>>
>>> Has anyone previously proposed algebraic effects (e.g. Eff-like
>>> handlers of continuations) for ECMAScript? #lazyweb I could only find
>>> variations that aren't quite this.
>>>
>>> I'm specifically looking at the OCaml implementation for inspiration:
>>>
>>> http://kcsrk.info/ocaml/multicore/2015/05/20/effects-multicore/
>>> http://kcsrk.info/slides/multicore_fb16.pdf
>>>
>>> I'm not proposing the multicore aspect of this, just the delimited
>>> continuations.
>>>
>>> Basically, the idea is that you can capture effects by wrapping a
>>> call in a "try". That spawns a fiber. Then any function can "perform"
>>> an effect as a new language feature. That works effectively like
>>> "throw", except it also captures a reified continuation in a tuple.
>>> The continuation can be invoked to continue where you left off.
>>>
>>> Imaginary syntax:
>>>
>>> function otherFunction() {
>>>   console.log(1);
>>>   let a = perform { x: 1, y: 2 };
>>>   console.log(a);
>>>   return a;
>>> }
>>>
>>> do try {
>>>   let b = otherFunction();
>>>   b + 1;
>>> } catch effect -> [{ x, y }, continuation] {
>>>   console.log(2);
>>>   let c = continuation(x + y);
>>>   console.log(c);
>>> }
>>>
>>> Prints:
>>> 1
>>> 2
>>> 3
>>> 4
>>>
>>> (`perform` is a contextual keyword analogous to "throw", and `catch
>>> effect` is a keyword for catching it.)
>>>
>>> We've experimented with changing React's implementation to use these
>>> internally to handle concurrency and to solve complex algorithms that
>>> require a lot of back and forth, such as layout calculation. It seems
>>> to make these implementations much easier while remaining efficient.
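For comparison, the imaginary `perform` / `catch effect` example above can be approximated with today's generators, provided every function between the effect and the handler is itself a generator. This is only a sketch, not part of the proposal; `runWithEffects` and `tryBody` are illustrative names I made up.

```javascript
function* otherFunction() {
  console.log(1);
  const a = yield { x: 1, y: 2 }; // plays the role of: perform { x: 1, y: 2 }
  console.log(a);
  return a;
}

function* tryBody() {
  const b = yield* otherFunction(); // yield* lets the effect escape the nested call
  return b + 1;
}

// Drives the generator, with `handler` playing the `catch effect` role.
// The continuation is one-shot: calling it runs the body to completion.
function runWithEffects(genFn, handler) {
  const gen = genFn();
  const step = gen.next();
  if (step.done) return step.value;
  return handler(step.value, (v) => gen.next(v).value);
}

const result = runWithEffects(tryBody, ({ x, y }, continuation) => {
  console.log(2);
  const c = continuation(x + y); // resumes otherFunction; body finishes with b + 1
  console.log(c);
  return c;
});
// Prints 1, 2, 3, 4, like the imaginary syntax; result === 4
```

Note the cost: this only handles a single performed effect per run, and every intermediate function (`tryBody` here) must be rewritten as a generator and delegate with `yield*`, which is exactly the syntactic overhead being discussed.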
>>> It also allows for seamless async I/O handling by yielding deep in a
>>> fiber.
>>>
>>> Effectively, this solves the same problems as generators and
>>> async/await.
>>>
>>> However, the benefit is that you don't have a split between "async"
>>> functions, generator functions, and synchronous functions. You still
>>> have an explicit entry point through the place where you catch
>>> effects.
>>>
>>> With generators and async functions, any time you want to change any
>>> deep effects you have to unwind all potential callers. Any
>>> intermediate library has to be turned into async functions. The
>>> refactoring is painful and leaves you with a lot of syntax overhead.
>>>
>>> If you want to nest different effects such as layout, iteration, and
>>> async functions, that complexity explodes, because now every
>>> intermediate function has to be able to handle all those concepts.
>>>
>>> The performance characteristics demonstrated by KC Sivaramakrishnan
>>> are also much more promising than what JS VMs have been able to do
>>> with async/await and generators so far. It's plausible that VMs can
>>> optimize this in a similar way, in time. I suspect that the leakiness
>>> of the microtask queue might cause problems, though.
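To make the refactoring cost concrete, here is a sketch (mine, not from the thread) of the status quo: when a leaf helper needs to suspend, every intermediate caller must be rewritten as a generator and delegate with `yield*`, even though only the leaf actually pauses. The names `leaf`, `middle`, `outer`, and `drive` are purely illustrative.

```javascript
// Only this line truly needs to pause for an effect...
function* leaf() {
  const value = yield { effect: "read" };
  return value + 1;
}

// ...yet every caller on the path is forced to become a generator.
function* middle() {
  return (yield* leaf()) * 2;
}

function* outer() {
  return (yield* middle()) - 3;
}

// Only the driver at the entry point handles the effect.
function drive(gen, onEffect) {
  let step = gen.next();
  while (!step.done) {
    step = gen.next(onEffect(step.value));
  }
  return step.value;
}

const result = drive(outer(), () => 10); // (10 + 1) * 2 - 3 === 19
```

With nested effect kinds (layout, iteration, async), `drive` and every intermediate generator would have to understand all of them, which is the complexity explosion described above.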
>>> I converted the OCaml example scheduler to this ECMAScript-compatible
>>> syntax:
>>>
>>> // RoundRobinScheduler.js
>>>
>>> class Fork {
>>>   constructor(fn) {
>>>     this.fn = fn;
>>>   }
>>> }
>>> export function fork(f) {
>>>   perform new Fork(f);
>>> }
>>>
>>> class Yield { }
>>> export function yieldToScheduler() {
>>>   perform new Yield();
>>> }
>>>
>>> export function run(main) {
>>>   const run_q = [];
>>>   function enqueue(k) {
>>>     run_q.push(k);
>>>   }
>>>   function dequeue() {
>>>     if (run_q.length) {
>>>       run_q.shift()();
>>>     }
>>>   }
>>>   function spawn(f) {
>>>     try {
>>>       f();
>>>       dequeue();
>>>     } catch (e) {
>>>       console.log(e.toString());
>>>     } catch effect Yield -> [_, k] {
>>>       enqueue(k);
>>>       dequeue();
>>>     } catch effect Fork -> [fork, k] {
>>>       enqueue(k);
>>>       spawn(fork.fn);
>>>     }
>>>   }
>>>   spawn(main);
>>> }
>>>
>>> // Example.js
>>>
>>> import * as Sched from "RoundRobinScheduler";
>>>
>>> function f(id, depth) {
>>>   console.log("Starting number %i", id);
>>>   if (depth > 0) {
>>>     console.log("Forking number %i", id * 2 + 1);
>>>     Sched.fork(() => f(id * 2 + 1, depth - 1));
>>>     console.log("Forking number %i", id * 2 + 2);
>>>     Sched.fork(() => f(id * 2 + 2, depth - 1));
>>>   } else {
>>>     console.log("Yielding in number %i", id);
>>>     Sched.yieldToScheduler();
>>>     console.log("Resumed number %i", id);
>>>   }
>>>   console.log("Finishing number %i", id);
>>> }
>>>
>>> Sched.run(() => f(0, 2));
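The round-robin scheduler above can be approximated with today's generators. This is a sketch of mine, not part of the proposal, and it assumes every task is written as a generator function — the very overhead the proposal avoids. `YIELD`, `fork`, and `run` are illustrative names.

```javascript
const YIELD = Symbol("yield");
const fork = (genFn) => ({ fork: genFn }); // genFn: () => Generator

function run(mainGenFn) {
  const runQ = []; // queue of suspended generators (the "continuations")
  function dequeue() {
    if (runQ.length) spawn(runQ.shift());
  }
  function spawn(gen) {
    const step = gen.next();
    if (step.done) {
      dequeue();                // task finished: run the next one
    } else if (step.value === YIELD) {
      runQ.push(gen);           // reschedule at the back of the queue
      dequeue();
    } else if (step.value && step.value.fork) {
      runQ.push(gen);           // suspend the parent...
      spawn(step.value.fork()); // ...and run the child immediately
    }
  }
  spawn(mainGenFn());
}

// Usage, mirroring Example.js with depth 1 and logging into an array.
const log = [];
function* f(id, depth) {
  log.push(`start ${id}`);
  if (depth > 0) {
    yield fork(() => f(id * 2 + 1, depth - 1));
    yield fork(() => f(id * 2 + 2, depth - 1));
  } else {
    yield YIELD;
  }
  log.push(`finish ${id}`);
}
run(() => f(0, 1));
// log order: start 0, start 1, start 2, finish 1, finish 0, finish 2
```

The scheduling order matches the effect-handler version, but notice that `f` had to become a generator and any helper it called through would have to be one too.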
_______________________________________________
es-discuss mailing list
[email protected]
https://mail.mozilla.org/listinfo/es-discuss

