GitHub user tnachen opened a pull request: https://github.com/apache/spark/pull/10332
[SPARK-12345][MESOS] Filter SPARK_HOME when submitting Spark jobs with Mesos cluster mode.

SPARK_HOME is now causing problems in Mesos cluster mode: the spark-submit script was recently changed so that spark-class takes precedence and looks in SPARK_HOME when it is defined. We should skip passing SPARK_HOME from the Spark client in Mesos cluster mode, since Mesos should not use this setting and should rely on spark.executor.home instead.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/tnachen/spark scheduler_ui

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/10332.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #10332

----

commit baea28f54406a58ae313d1a8428d985e70b3116a
Author: Timothy Chen <tnac...@gmail.com>
Date: 2015-12-16T16:45:34Z

    Filter SPARK_HOME when submitting Spark jobs with Mesos cluster mode.

----
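For readers following along, a minimal sketch of the behavior the description asks for (illustrative only, not the actual patch; the object name and the printing loop are hypothetical): when collecting the SPARK_* environment variables to forward with a Mesos cluster-mode submission, drop SPARK_HOME so the cluster side falls back to spark.executor.home.

    // Illustrative sketch, not the actual patch: forward SPARK_* variables
    // but filter out SPARK_HOME, so the Mesos-side driver resolves its home
    // from spark.executor.home rather than the client's local path.
    object FilterSparkHomeSketch {
      def main(args: Array[String]): Unit = {
        val forwarded: Map[String, String] = sys.env.filter {
          case (key, _) => key.startsWith("SPARK_") && key != "SPARK_HOME"
        }
        // Print what would be forwarded, for demonstration purposes.
        forwarded.foreach { case (k, v) => println(s"$k=$v") }
      }
    }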