Hi,

On Fri, 18 Dec 2020 at 19:15, muke101 <muke...@protonmail.com> wrote:
> Thanks to both of you for getting back to me; these definitely seem
> like problems worth thinking about first. Looking into it, there has
> actually been some research already on implementing polyhedral
> optimisations in a JIT optimiser, specifically in JavaScript. The
> paper
> (http://impact.gforge.inria.fr/impact2018/papers/polyhedral-javascript.pdf)
> seems to point out the same problems you both bring up, like SCoP
> detection and aliasing, and describes how it worked around them.
>
> For now, then, I'll try to work out how ambitious replicating these
> solutions would be and whether they would map cleanly from JS into
> PyPy - please let me know if any other hurdles come to mind in the
> meantime, though.

I assume that by "JavaScript" you mean JavaScript with a method-based
JIT compiler.  At this level, that's the main difference from PyPy,
which contains RPython's tracing JIT compiler instead.  Whether the
language being compiled is JavaScript or Python is not that
important.

Here's another idea about how to do more advanced optimizations in a
tracing JIT a la PyPy.  The idea would be to keep enough metadata for
the various pieces of machine code that the current backend produces,
and add logic to detect when this machine code has been running for
long enough.  At that point, we would invoke a (completely new)
second-level backend, which would consolidate the pieces into a
better-optimized whole.  This idea exists in method JITs but should
also work in tracing JITs: the second-level backend can see all the
common paths at once, instead of one after the other.  The second
level can be slower (within reason), and it can even know how common
each path is, which might give it an edge over ahead-of-time
compilers.
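
To make that bookkeeping concrete, here is a minimal sketch of what
the per-trace metadata and the second-level trigger could look like.
Nothing below exists in PyPy today; the names (TraceMeta,
SecondLevelBackend, HOT_REGION_THRESHOLD) and the counter-based
heuristic are hypothetical illustrations of the idea, not a design.

# Sketch only -- hypothetical classes, not part of PyPy/RPython.

HOT_REGION_THRESHOLD = 10 * 1000 * 1000   # assumed tuning knob

class TraceMeta:
    """Metadata kept alongside one piece of machine code emitted by
    the existing (first-level) backend."""
    def __init__(self, trace_id, operations, guard_targets):
        self.trace_id = trace_id
        self.operations = operations        # IR kept around for later
        self.guard_targets = guard_targets  # trace_ids of attached bridges
        self.run_counter = 0                # bumped from the machine code

    def record_run(self):
        self.run_counter += 1

class SecondLevelBackend:
    """Slower, whole-region compiler invoked only for code that has
    already been running for a long time."""
    def __init__(self, traces):
        self.traces = traces                # {trace_id: TraceMeta}

    def maybe_recompile(self, entry_id):
        region = self.collect_region(entry_id)
        total_runs = sum(t.run_counter for t in region)
        if total_runs < HOT_REGION_THRESHOLD:
            return None
        # The second level sees the loop *and* all its bridges at
        # once, with run_counter telling it how common each path is.
        return self.compile_region(region)

    def collect_region(self, entry_id):
        """Gather the entry trace plus every bridge reachable from it."""
        seen, stack = {}, [entry_id]
        while stack:
            tid = stack.pop()
            if tid in seen or tid not in self.traces:
                continue
            seen[tid] = self.traces[tid]
            stack.extend(seen[tid].guard_targets)
        return list(seen.values())

    def compile_region(self, region):
        # Placeholder: this is where the heavier optimizations (e.g.
        # a polyhedral pass over the merged paths) would run.
        raise NotImplementedError

In a real implementation the counter bump would of course live in the
generated machine code itself, and the region compile would happen
off the critical path; the sketch only shows where the information
would sit.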


A bientôt,

Armin.
