I did take a look at Graal a while back when thinking about how execution
environments could be defined; my concern was that it doesn't support all
of the features of a language.
For example, it's typical for Python code to load and call native libraries,
and Graal can only execute C/C++ code that has been compiled to LLVM
bitcode.
Also, many people interested in using ML libraries will want access to GPUs
to improve performance, which I believe Graal can't support.
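To make the native-library point concrete, here is a minimal sketch using
CPython's standard `ctypes` module to call straight into libc -- exactly the
kind of native call a Graal-hosted Python would have to emulate (assumes a
system where `find_library("c")` resolves, e.g. Linux):

```python
import ctypes
import ctypes.util

# Locate and load the C standard library -- an ordinary native .so,
# not something compiled to LLVM bitcode.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the C signature: size_t strlen(const char *s)
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello"))  # 5
```

Many popular Python packages (NumPy, most ML libraries) rely on this kind of
binding to precompiled native code, which is why LLVM-bitcode-only execution
is a real limitation for them.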

It can be a very useful way to run simple lambda functions written in some
language directly without needing a Docker environment, but you could
probably use something even lighter weight than Graal that is
language-specific, like Jython.

On Thu, May 3, 2018 at 10:05 PM Romain Manni-Bucau <rmannibu...@gmail.com>
wrote:

> Hi guys
>
> For some time there have been efforts to add language-portable support to
> Beam, but I can't really find a case where it "works" when based on Docker,
> except for some vendor-specific infra.
>
> Current solution:
>
> 1. Is runner-intrusive (which is bad for Beam and prevents adoption by big
> data vendors)
> 2. Is based on Docker (which assumes a runtime environment, is very
> ops/infra intrusive, and is often too costly for what it brings)
>
> Has anyone had a look at Graal, which seems like a way to make this feature
> doable in a lighter and more optimized manner compared to default JSR-223
> implementations?
>
>
