On Wed, Aug 16, 2017 at 2:58 PM, Meisam Fathi <[email protected]> wrote:
> Livy can directly call org.apache.spark.deploy.SparkSubmit.main() with
> proper arguments, which is what spark-submit ends up doing.
>
> I have at least three problems with this approach:
> 1. It is a hack.
> 2. Now that you pointed it out, I see it restricts Livy to a single
> version of Spark.
> 3. It becomes tricky to separate the output of different applications,
> because all output will go to the stdout/stderr of the Livy process.
That basically describes what SPARK-11035 would do. I can think of two
ways of doing that while still supporting multiple Spark versions, one
hacky and one very hacky:

- Use a different class loader for each Spark version.
- Use a different child process for each Spark version, and have the
  Livy server process communicate with it when launching applications.

Neither really solves #3. Not sure I have any good ideas for how to fix
that one.

-- 
Marcelo
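To make the first (class loader) option concrete: below is a minimal sketch, not Livy's actual code. The class name `IsolatedLauncher` is made up for illustration; the only real identifier assumed from Spark is `org.apache.spark.deploy.SparkSubmit.main()`, which is what the `spark-submit` script itself invokes. The idea is to give each Spark version its own `URLClassLoader` with a `null` parent, so nothing from the server's classpath (or another Spark version) leaks in.

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.lang.reflect.Method;

public class IsolatedLauncher {
    // Sketch: load one Spark version's jars in an isolated class loader
    // and invoke SparkSubmit.main() reflectively, so two versions never
    // share classes.
    public static void launch(URL[] sparkJars, String[] args) throws Exception {
        // parent = null: delegate only to the bootstrap loader, so classes
        // from the server's own classpath are invisible here.
        try (URLClassLoader loader = new URLClassLoader(sparkJars, null)) {
            Class<?> submit = loader.loadClass("org.apache.spark.deploy.SparkSubmit");
            Method main = submit.getMethod("main", String[].class);
            main.invoke(null, (Object) args);
        }
    }

    public static void main(String[] args) throws Exception {
        // No Spark jars here; just demonstrate the isolation property:
        // with a null parent, classes from the application classpath
        // (like this class itself) are invisible to the new loader.
        try (URLClassLoader loader = new URLClassLoader(new URL[0], null)) {
            boolean isolated;
            try {
                loader.loadClass("IsolatedLauncher");
                isolated = false;
            } catch (ClassNotFoundException e) {
                isolated = true;
            }
            System.out.println("isolated=" + isolated);
        }
    }
}
```

Note that this only isolates classes, not statics held elsewhere or native libraries, which is part of why it is "hacky" in practice.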
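And a sketch of the second (child process) option, again hypothetical rather than Livy's implementation (`PerVersionSubmit` and its `submit()` helper are invented names): run the `spark-submit` that ships with each version's SPARK_HOME in its own child process, redirecting that process's output to a per-application log file. This variant also happens to address #3, since each application's stdout/stderr goes to its own file instead of the server's.

```java
import java.io.File;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class PerVersionSubmit {
    // Sketch: run the spark-submit of the given SPARK_HOME in a child
    // process, sending its output to a per-application log file so it
    // never mixes with the server's own stdout/stderr.
    public static Process submit(File sparkHome, File logFile, String... args)
            throws Exception {
        List<String> cmd = new ArrayList<>();
        cmd.add(new File(sparkHome, "bin/spark-submit").getAbsolutePath());
        cmd.addAll(Arrays.asList(args));
        return new ProcessBuilder(cmd)
                .redirectErrorStream(true)   // merge stderr into stdout...
                .redirectOutput(logFile)     // ...and write both to the app's log
                .start();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for a real SPARK_HOME: a stub spark-submit that echoes
        // its arguments, so the sketch is runnable without Spark installed.
        File home = Files.createTempDirectory("fake-spark-home").toFile();
        File bin = new File(home, "bin");
        bin.mkdirs();
        File stub = new File(bin, "spark-submit");
        Files.write(stub.toPath(), "#!/bin/sh\necho submitted \"$@\"\n".getBytes());
        stub.setExecutable(true);

        File log = File.createTempFile("app", ".log");
        Process p = submit(home, log, "--class", "com.example.Foo");
        p.waitFor();
        System.out.println(new String(Files.readAllBytes(log.toPath())).trim());
        // prints: submitted --class com.example.Foo
    }
}
```

The cost is that the server still has to talk to these children somehow (e.g. over a local socket) to launch applications, which is where the "very hacky" part comes in.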
