On Friday, September 1, 2017 at 6:29:50 PM UTC+2, Jakob Kummerow wrote:
>
> As Yang points out, microtasks are only run when the embedder asks us to, 
> and per spec this is only when the execution stack is otherwise empty. (The 
> exception to this is %RunMicroTasks() in mjsunit tests.)
> So we only really care about a C++ entry point.
>
> However, what RunMicrotasks does is loop over the queue and call into JS 
> for every iteration, because the entries in the queue are JS callables. 
> AFAIK going through the JSEntryStub has even more overhead than going 
> through the CEntryStub, so the idea here is to cross the language boundary 
> only once (into a CSA builtin), and then loop there. This is the mirror 
> image of another trick we pull more commonly: replacing a JS builtin that 
> calls the runtime on every loop iteration with a single runtime call that 
> loops in C++.
>
> AFAICT the only existing technology we have for calling CSA-generated code 
> from C++ is by wrapping it into a JSFunction. The JSFunctions where we do 
> this so far, however, are all tied to a native context; whereas the 
> RunMicrotasks CSA builtin would be context-independent (just like its 
> existing C++ equivalent). When Adam and I looked at the source, it seemed 
> to us that creating a JSFunction with a nullptr context should work (at 
> least for getting through the JSEntryStub), and would allow us to reuse the 
> existing mechanism.
>
> If that idea doesn't fly, then do we have any options beyond mucking with 
> the JSEntry stub to make it possible to call CSA builtins directly?
>

Benedikt and I were actually thinking that mucking with the JSEntry stub 
would be the lesser evil. All we should need to change in JSEntry is to 
add a call to a stub that checks whether the microtask queue needs 
processing and, if so, processes it. (I guess this is similar to what 
Caitlin already did when she said "Finally, I tried adding a new variant 
to JSEntryStub, which calls the RunMicrotasks stub rather than the various 
entry trampolines.") Yes, it does sound a bit scary, but the JSFunction 
with a nullptr context seems scary + hacky.
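To make the proposal concrete, here is a rough model in plain C++ of what the modified JSEntry would do (all names here are illustrative, not actual V8 code): on leaving the outermost entry, check whether the microtask queue needs draining and, if so, drain it before returning to the embedder.

```cpp
#include <cassert>
#include <deque>
#include <functional>

// Illustrative model only (not V8 code): stands in for the isolate state
// that the modified JSEntry would consult before returning to the embedder.
struct FakeIsolate {
  std::deque<std::function<void()>> microtask_queue;
  int entry_depth = 0;
};

// Models the proposed JSEntry behavior: run the JS callable, then, if this
// was the outermost entry, drain pending microtasks before leaving.
void JSEntryModel(FakeIsolate* isolate, const std::function<void()>& js_code) {
  ++isolate->entry_depth;
  js_code();
  if (--isolate->entry_depth == 0) {
    // The extra stub call proposed above: check-and-drain on exit.
    while (!isolate->microtask_queue.empty()) {
      auto task = std::move(isolate->microtask_queue.front());
      isolate->microtask_queue.pop_front();
      task();  // May enqueue further microtasks; the loop picks them up.
    }
  }
}
```

The point is that the check lives on the exit path of JSEntry itself, so nested entries (entry_depth > 0) skip it, and only the outermost return to the embedder pays for the drain.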

Eventually, we might consider writing more general machinery for calling 
arbitrary CSA-generated code from C++, but I think we should only do that 
once there are more customers for it.
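For anyone skimming the thread: the win Jakob describes is purely about how often the language boundary is crossed. A toy sketch (illustrative counters only, nothing V8-specific) of today's per-task entry versus the proposed single-entry drain:

```cpp
#include <cassert>
#include <deque>
#include <functional>

using MicrotaskQueue = std::deque<std::function<void()>>;

// Current shape: a C++ loop that performs one JS entry per microtask.
// Returns the number of language-boundary crossings it paid for.
int DrainWithPerTaskEntry(MicrotaskQueue* queue) {
  int crossings = 0;
  while (!queue->empty()) {
    ++crossings;  // one JSEntry round trip per task
    auto task = std::move(queue->front());
    queue->pop_front();
    task();
  }
  return crossings;
}

// Proposed shape: enter once (into a CSA builtin) and loop there.
int DrainWithSingleEntry(MicrotaskQueue* queue) {
  int crossings = 1;  // one crossing for the whole drain
  while (!queue->empty()) {
    auto task = std::move(queue->front());
    queue->pop_front();
    task();
  }
  return crossings;
}
```

With N queued tasks the first strategy pays N crossings and the second pays one; it is the mirror image of replacing a JS loop that makes N runtime calls with a single runtime call that loops in C++.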
 

>
>
> On Fri, Sep 1, 2017 at 6:18 AM, Caitlin Potter <[email protected]> wrote:
>
>>
>> On Sep 1, 2017, at 2:36 AM, Benedikt Meurer <[email protected]> wrote:
>>
>> I think we're trying to solve an actual problem, plus a problem that we 
>> might not have. Let's just keep the C++ version of RunMicrotaskQueue for 
>> now and work on the real issue only. This code doesn't change often and 
>> there's otherwise no advantage but a lot of potential breakage when messing 
>> with strong-rooted JSFunctions, JSEntryStub and friends.
>>
>>
>> Which is the "actual" problem, and which is the one we might not have?
>>
>> -- Benedikt
>>
>> On Fri, Sep 1, 2017 at 8:28 AM Caitlin Potter <[email protected]> wrote:
>>
>>>
>>> On Sep 1, 2017, at 1:57 AM, Yang Guo <[email protected]> wrote:
>>>
>>> I don't like strong-rooting a JSFunction either. You will get some 
>>> issues with the serializer when creating the snapshot.
>>>
>>>
>>> Fair enough, but then we do need a way to enter the stub cleanly.
>>>
>>> I also feel like I don't understand the problem. We run the 
>>> MicrotaskQueue when we are about to return to the embedder with re-entrancy 
>>> level of 0 
>>> <https://cs.chromium.org/chromium/src/v8/src/isolate.cc?type=cs&sq=package:chromium&l=3291>,
>>>  
>>> so we already need to leave JS and enter JS once. With the micro-benchmark, 
>>> only one microtask gets queued per microtask run, so you don't benefit from 
>>> staying in JS. Of course you could run the MicrotaskQueue before leaving 
>>> JS, but that doesn't seem like what you are trying to do.
>>>
>>>
>>> Even though only one task is enqueued at a time, RunMicrotasks() never 
>>> finishes until the entire benchmark is run (since each resume enqueues 
>>> another task). Many tasks are run in a single JS entry, whereas in v8 ToT 
>>> you enter/exit JS thousands of times.
>>>
>>> Cheers,
>>>
>>> Yang
>>>
>>> On Fri, Sep 1, 2017 at 7:29 AM Benedikt Meurer <[email protected]> wrote:
>>>
>>>> I don't think strong-rooting a JSFunction is a good idea. That might 
>>>> solve one problem, but will likely create N+1 different problems in other 
>>>> places.
>>>>
>>>> I've been discussing this with +Jaroslav Sevcik, and we probably don't 
>>>> understand the underlying problem. So let's try to get that first:
>>>> You're porting the microtask queue pumping to the CSA now, and you need 
>>>> to call into Blink C++ functions and JSFunctions from that. But the C++ 
>>>> implementation of RunMicrotaskQueue is also still there. Is that correct?
>>>>
>>>> -- Benedikt
>>>>
>>>> On Fri, Sep 1, 2017 at 7:03 AM Caitlin Potter <[email protected]> wrote:
>>>>
>>>>> I'm unclear on what you mean regarding code duplication.
>>>>>
>>>>> It's about ensuring that A) fixed registers (e.g. the root array 
>>>>> register) are initialized properly, to avoid access violations when heap 
>>>>> constants are used, and B) callee-saved registers are actually saved and 
>>>>> restored.
>>>>>
>>>>> If I can strong-root a full JSFunction and just use the ordinary 
>>>>> JSEntryStub, as Adam suggested, it may be a non-issue.
>>>>>
>>>>> On Aug 31, 2017, at 11:00 PM, Benedikt Meurer <[email protected]> wrote:
>>>>>
>>>>> I like the idea of being able to run microtasks from CSA land. Not 
>>>>> sure about this JSEntryStub business, though; it all sounds dangerous to 
>>>>> me. Is this just to avoid some code duplication between CSA and C++? If 
>>>>> so, then I'd strongly recommend against it and just duplicate that logic 
>>>>> for now. If not, then I have probably misunderstood the problem.
>>>>>
>>>>> -- Benedikt
>>>>>
>>>>> On Fri., Sep 1, 2017, 03:35 Adam Klein <[email protected]> wrote:
>>>>>
>>>>>> On Thu, Aug 31, 2017 at 5:52 PM, Adam Klein <[email protected]> wrote:
>>>>>>
>>>>>>> Hi Caitlin,
>>>>>>>
>>>>>>> Jakob and I just spent some time digging into this, comments inline 
>>>>>>> (though we don't have answers to everything).
>>>>>>>
>>>>>>> On Thu, Aug 31, 2017 at 9:01 AM, <[email protected]> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> Recently I've been trying out some things to get more out of 
>>>>>>>> Promises and async functions.
>>>>>>>>
>>>>>>>> Different people in/around the Node.js project have been writing 
>>>>>>>> various benchmarks which show cases where `await`
>>>>>>>> seems to slow things down significantly. One simple example is
>>>>>>>>
>>>>>>>> https://github.com/tc39/proposal-async-iteration/issues/112#issuecomment-324885954.
>>>>>>>>  
>>>>>>>> While it's invalid to compare the simple synchronous loop to the one 
>>>>>>>> with `await`, it does highlight that in situations like that, the v8 
>>>>>>>> implementation can seem very slow, when really it should be closer to 
>>>>>>>> the synchronous loop (~20 times slower seems like a steeper price to 
>>>>>>>> pay than is necessary).
>>>>>>>>
>>>>>>>> I drafted an informal document to come up with some ideas for 
>>>>>>>> speeding up Await in v8. In general, the solutions were
>>>>>>>> split into 2 categories:
>>>>>>>>
>>>>>>>> 1) reduce heap use and GC overhead (allocate fewer objects for 
>>>>>>>> Await).
>>>>>>>> 2) avoid JS->C and C->JS transitions where possible (mainly 
>>>>>>>> accomplished by translating
>>>>>>>>     Isolate::RunMicrotasksInternal() and 
>>>>>>>> Isolate::EnqueueMicrotask() into code stubs). This generally makes 
>>>>>>>> JS-defined
>>>>>>>>     microtasks (for Promises and Await) much faster, but may cause 
>>>>>>>> DOM-defined microtasks to slow down a bit (unclear
>>>>>>>>     at this time). I expect Promises and Await to be used more 
>>>>>>>> frequently in tight loops, and certainly DOM microtasks don't
>>>>>>>>     affect Node.js at all, so this may be something worth going 
>>>>>>>> after.
>>>>>>>>
>>>>>>>> The first approach did not make much of a dent in any benchmarks. 
>>>>>>>> More useful profiles of actual applications did not
>>>>>>>> show `await` to be a bottleneck at all. Reducing overall memory use 
>>>>>>>> seems like a good thing in general, however.
>>>>>>>>
>>>>>>>> The second approach yielded a significant improvement (~60% over 10 
>>>>>>>> runs) for the simple benchmark (in a very
>>>>>>>> simple prototype implementation with some limitations discussed 
>>>>>>>> below).
>>>>>>>>
>>>>>>>> So there are some constraints WRT implementing RunMicrotasks in 
>>>>>>>> JIT'd code. Particularly, it needs to be possible to
>>>>>>>> perform RunMicrotasks() when no context has been entered. I've 
>>>>>>>> tried a few things to work around this:
>>>>>>>>
>>>>>>>> Initially, I wrote the stub with JS linkage, and used the 
>>>>>>>> typical JSEntryStub to invoke it. This is partly
>>>>>>>> wasteful, and partly problematic. There need not be a separate 
>>>>>>>> JSFunction for RunMicrotasks in each
>>>>>>>> context. More importantly, the function ought not to be associated 
>>>>>>>> with a context at all, given the
>>>>>>>> constraint that it must be possible to invoke it without a context 
>>>>>>>> having been entered.
>>>>>>>>
>>>>>>>
>>>>>>> From looking at the JSEntryStub codepath for JSFunctions, it appears 
>>>>>>> to us that for a function marked as native, strict, or both (which 
>>>>>>> seems 
>>>>>>> appropriate in this case) there shouldn't be any need for a Context. So 
>>>>>>> it 
>>>>>>> seems like you could unblock your prototype by creating a single 
>>>>>>> JSFunction 
>>>>>>> (as a strong root on the Heap) which wraps the builtin, and call 
>>>>>>> through 
>>>>>>> that from the C++ API. If you already tried something like this and ran 
>>>>>>> into trouble it'd be interesting to hear what went wrong.
>>>>>>>  
>>>>>>>
>>>>>>>> A second approach involved creating new TF operators to initialize 
>>>>>>>> the roots register (the main
>>>>>>>> manifestation of problems when not using the JSEntryStub was that 
>>>>>>>> the roots register was not initialized,
>>>>>>>> leading to access violations when using heap constants). I didn't 
>>>>>>>> spend much time with this, because I
>>>>>>>> felt that it was more important to make sure callee-saved registers 
>>>>>>>> were restored properly, even though
>>>>>>>> there wasn't much going on in the sole caller of the function.  I 
>>>>>>>> thought it might be interesting to produce
>>>>>>>> more general operators which would handle entry and exit for stubs 
>>>>>>>> which need to be invoked from C,
>>>>>>>> but it seemed like a lot of work and I haven't gotten around to 
>>>>>>>> doing this yet.
>>>>>>>>
>>>>>>>> Finally, I tried adding a new variant to JSEntryStub, which calls 
>>>>>>>> the RunMicrotasks stub rather than the various entry
>>>>>>>> trampolines. At this moment, it's mostly in working order, but it's 
>>>>>>>> possible there are still problems with
>>>>>>>> StackFrameIteration and exception handling.
>>>>>>>>
>>>>>>>
>>>>>>> These approaches seem too involved just for this one case, I'd 
>>>>>>> prefer the JSFunction approach above if it works.
>>>>>>>  
>>>>>>>
>>>>>>>> Another limitation is, previously SaveContexts (which seem to 
>>>>>>>> matter to the debugger and API in some way, though I
>>>>>>>> haven't really looked at why yet) were not set up when calling 
>>>>>>>> API-defined microtask callbacks. In my prototype, I
>>>>>>>> always set up the SaveContext before entering the RunMicrotasks 
>>>>>>>> stub. It's yet unclear if this breaks anything, or if it
>>>>>>>> would be possible (or even a good idea) to mimic the old behaviour 
>>>>>>>> in the stub rather than always pushing the SaveContext.
>>>>>>>> This is a subtle difference, but as noted it could have some bad 
>>>>>>>> effects.
>>>>>>>>
>>>>>>>
>>>>>>> Still digging into this. It appears I may have inadvertently removed 
>>>>>>> the regression test for this code in 
>>>>>>> https://codereview.chromium.org/1909433003 when I removed support 
>>>>>>> for Object.observe, but the regression test should be able to be 
>>>>>>> adapted 
>>>>>>> for Promises (see https://codereview.chromium.org/332923003 for the 
>>>>>>> test). I'm going to try restoring the test and playing around with this 
>>>>>>> code in the current C++ version to see if I can get a better handle on 
>>>>>>> it. 
>>>>>>> But from an initial reading, it really shouldn't make a difference for 
>>>>>>> the 
>>>>>>> C++ callback case anyway.
>>>>>>>
>>>>>>
>>>>>> Looking more deeply at this (and at 
>>>>>> https://bugs.chromium.org/p/chromium/issues/detail?id=385349, the 
>>>>>> bug for that regression test), I'm not convinced this is actually 
>>>>>> necessary 
>>>>>> anymore. +yangguo in case he has a less fuzzy memory than me (this was 
>>>>>> all 
>>>>>> >3 years ago).
>>>>>>  
>>>>>>
>>>>>>> Finally, a somewhat strange behaviour of the stub is that it enters 
>>>>>>>> contexts by itself when it needs to, inlining
>>>>>>>> HandleScopeImplementer::EnterMicrotaskContext and 
>>>>>>>> LeaveMicrotaskContext(), and overwriting Isolate::context().
>>>>>>>> I believe this is done in a valid way in the prototype, but it's 
>>>>>>>> not something that comes up in other stubs, so there isn't really
>>>>>>>> any other code to model it on.
>>>>>>>>
>>>>>>>
>>>>>>> This seems fine to me, it's rather special behavior even in its C++ 
>>>>>>> form.
>>>>>>>  
>>>>>>>
>>>>>>>> ---
>>>>>>>>
>>>>>>>> I was wondering if anyone thought reducing the C++->JS->C++ 
>>>>>>>> overhead in RunMicrotasks for that 60% boost in certain
>>>>>>>> very simple and unrepresentative-of-real-code benchmarks might be 
>>>>>>>> worth doing properly and upstreaming? While it's
>>>>>>>> unclear what the impact would be on real-world code, it seems like 
>>>>>>>> a reasonable expectation that you'd see some kind of
>>>>>>>> significant benefit (though perhaps not on the order of 60% as in 
>>>>>>>> the very simple benchmark mentioned above).
>>>>>>>>
>>>>>>>> If (in the opinion of the v8 team) it might be worth my time to try 
>>>>>>>> to upstream this, I'd love some feedback on the approaches
>>>>>>>> taken to address the problems listed above, and get an idea of what 
>>>>>>>> sort of approach you'd all be happiest with.
>>>>>>>>
>>>>>>>
>>>>>>> If we can pin down the answers to all the stuff above to our 
>>>>>>> satisfaction, then yes, my inclination is that this is a worthwhile 
>>>>>>> thing 
>>>>>>> to do: the code may be a bit verbose (what with having to deal with the 
>>>>>>> different sorts of things stored in the queue), but it's at least 
>>>>>>> relatively straightforward.
>>>>>>>
>>>>>>> - Adam
>>>>>>>
>>>>>> -- 
>>>>>> -- 
>>>>>> v8-dev mailing list
>>>>>> [email protected] <javascript:>
>>>>>> http://groups.google.com/group/v8-dev
>>>>>> --- 
>>>>>> You received this message because you are subscribed to the Google 
>>>>>> Groups "v8-dev" group.
>>>>>> To unsubscribe from this group and stop receiving emails from it, 
>>>>>> send an email to [email protected].
>>>>>> For more options, visit https://groups.google.com/d/optout.
>>>>>>
>>> -- 
>>> Yang Guo • Google Germany GmbH
>>> Erika-Mann-Str. 33 • 80636 Munich
>>> [email protected]
>>>
>>> Geschäftsführer: Paul Manicle, Halimah DeLaine Prado
>>> Registergericht und -nummer: Hamburg, HRB 86891; Sitz der Gesellschaft: Hamburg
>>>
>>> This e-mail is confidential. If you are not the right addressee please do 
>>> not forward it, please inform the sender, and please erase this e-mail 
>>> including any attachments. Thanks.
>>>
>
>

