2018-03-09 1:39 GMT+01:00 Marcelo Vanzin <[email protected]>:

> On Thu, Mar 8, 2018 at 4:33 PM, Kevin Risden <[email protected]> wrote:
> > I started looking into LIVY-414 [1] with the exact intention of multiple
> > Spark versions with multiple Spark homes. Might see if that would help
> here?
>
> I consider those orthogonal. While there may be uses for setting env
> variables, supporting multiple Spark versions / homes should be
> something that only the Livy admin should be able to do, instead of
> being something that can be easily overridden by the user e.g. by
> setting some malicious SPARK_HOME for Livy to use.
>
> --
> Marcelo
>

Hello,
          I think it's correct that the Livy admin manages the multiple
versions of Spark, but the user needs to choose which version to use
when submitting a job.

So one idea could be to have a JSON structure in the Livy conf to manage
multiple configurations.

For example, instead of having:

livy.server.spark-home = /path/to/spark/home

something like :

livy.server.spark-env = {
    "spark_1.6" : {
        "spark_jdk"         : "/path/to/jdk_home",
        "spark_home"        : "/path/to/spark/home1.6",
        "spark_submit_exec" : "spark-submit"
    },
    "spark_2.1" : {
        "spark_jdk"         : "/path/to/jdk_home_18u144",
        "spark_home"        : "/path/to/spark/home2.1",
        "spark_submit_exec" : "spark2.1-submit"
    },
    "spark_2.2" : {
        "spark_jdk"         : "/path/to/jdk_home_18u161",
        "spark_home"        : "/path/to/spark/home2.2",
        "spark_submit_exec" : "spark2.2-submit"
    }
}

So a Livy admin could manage the configuration, and a data scientist or a
developer could submit a job by referring to an "alias" (i.e. spark_1.6,
spark_2.1, or spark_2.2) that points to a different Spark / Java, and test
different environments for their applications or machine learning projects.
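To make the idea concrete, here is a minimal sketch of how a server could
resolve such an alias into the environment for spark-submit. This is not
Livy code; the function name, the env-var mapping, and the sample aliases
are all illustrative assumptions based on the JSON shape proposed above.

```python
import json
import os

# Hypothetical alias map in the shape proposed above; values are placeholders.
SPARK_ENV_JSON = """
{
    "spark_1.6": {"spark_jdk": "/path/to/jdk_home",
                  "spark_home": "/path/to/spark/home1.6",
                  "spark_submit_exec": "spark-submit"},
    "spark_2.2": {"spark_jdk": "/path/to/jdk_home_18u161",
                  "spark_home": "/path/to/spark/home2.2",
                  "spark_submit_exec": "spark2.2-submit"}
}
"""

def resolve_spark_env(alias, env_json=SPARK_ENV_JSON):
    """Return (env_vars, submit_path) for the requested alias.

    Only aliases the admin configured are accepted, so a user can pick an
    environment but cannot point Livy at an arbitrary SPARK_HOME.
    """
    envs = json.loads(env_json)
    if alias not in envs:
        raise ValueError("unknown Spark alias: %s (known: %s)"
                         % (alias, ", ".join(sorted(envs))))
    cfg = envs[alias]
    env_vars = {"JAVA_HOME": cfg["spark_jdk"],
                "SPARK_HOME": cfg["spark_home"]}
    submit_path = os.path.join(cfg["spark_home"], "bin",
                               cfg["spark_submit_exec"])
    return env_vars, submit_path

env_vars, submit_path = resolve_spark_env("spark_2.2")
```

The key point of the design is that the user only supplies the alias; every
path comes from the admin-managed configuration.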


What do you think about it?


Kind Regards


Matteo
