On Fri, Mar 9, 2018 at 1:36 AM, Matteo Durighetto
<[email protected]> wrote:
>           I think it's correct that the Livy admin manages the multiple
> versions of Spark, but the user needs to choose which version to use
> when submitting the job.
...
> So a Livy admin could manage the configuration, and a data scientist or a
> dev could submit the job by calling an "alias" (e.g. spark_1.6, spark_2.1,
> or spark_2.2) that points to a different Spark/Java install,
> and test different environments for their applications or machine learning projects.

That sounds closer to what I had in mind originally: the user asks for a
specific version of Spark using a name defined by the admin, instead
of providing an explicit SPARK_HOME environment variable or something
like that.
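
To make the idea concrete, here is a rough sketch of what such admin-defined
aliases might look like. Note that the property names below are purely
illustrative, invented for this discussion; they are not existing Livy
configuration keys.

```
# livy.conf (hypothetical; these property names do not exist in Livy today)
# The admin maps each alias to a Spark install and, optionally, a JAVA_HOME:
livy.server.spark-env.spark_1.6.spark-home = /opt/spark-1.6.3
livy.server.spark-env.spark_1.6.java-home  = /usr/lib/jvm/java-7
livy.server.spark-env.spark_2.2.spark-home = /opt/spark-2.2.0
livy.server.spark-env.spark_2.2.java-home  = /usr/lib/jvm/java-8
```

A user would then pick an alias by name when creating a session or batch,
for example with a hypothetical `sparkEnv` field in the request body:

```
POST /sessions
{
  "kind": "spark",
  "sparkEnv": "spark_2.2"
}
```

The point is that the admin controls which installs exist and where they
live; the user only ever sees the alias names.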


-- 
Marcelo
