[
https://issues.apache.org/jira/browse/SPARK-1985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen updated SPARK-1985:
-----------------------------
Component/s: Mesos (was: Spark Core)
Labels: (was: mesos)
The code in question at that point in time was:
{code}
val sparkHome = sc.getSparkHome().getOrElse(throw new SparkException(
  "Spark home is not set; set it through the spark.home system " +
  "property, the SPARK_HOME environment variable or the SparkContext constructor"))
{code}
and it's now
{code}
val executorSparkHome = sc.conf.getOption("spark.mesos.executor.home")
  .orElse(sc.getSparkHome()) // Fall back to driver Spark home for backward compatibility
  .getOrElse {
    throw new SparkException("Executor Spark home `spark.mesos.executor.home` is not set!")
  }
{code}
So {{SPARK_HOME}} / {{spark.home}} are no longer required, although they have
effectively been replaced by the more specific {{spark.mesos.executor.home}} in
SPARK-3264 /
https://github.com/apache/spark/commit/41dc5987d9abeca6fc0f5935c780d48f517cdf95
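As an illustration, a minimal sketch of how a driver could be configured today so
that no SPARK_HOME is needed on the Mesos slaves; the master URL and paths below
are placeholders, not values taken from this issue:
{code}
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: placeholder Mesos master, assembly URI and executor home.
val conf = new SparkConf()
  .setMaster("mesos://host:5050")                                  // placeholder Mesos master
  .setAppName("mesos-executor-home-example")
  .set("spark.executor.uri", "hdfs:///tmp/spark-assembly.tar.gz")  // assembly fetched by the slave
  .set("spark.mesos.executor.home", "/opt/spark")                  // replaces SPARK_HOME / spark.home
val sc = new SparkContext(conf)
{code}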
The assembly is also automatically added to the classpath by
{{compute-classpath.sh}}, but that may not fully cover what this issue is asking
for, which is to be able to avoid setting a 'home' at all.
My read of SPARK-3264, however, is that we should have an explicit 'home' setting
for Mesos executors; otherwise I'm not clear how you would find
{{bin/spark-class}}, for example (see the relevant change in
https://github.com/apache/spark/commit/4a4f9ccba2b42b64356db7f94ed9019212fc7317
as well).
> SPARK_HOME shouldn't be required when spark.executor.uri is provided
> --------------------------------------------------------------------
>
> Key: SPARK-1985
> URL: https://issues.apache.org/jira/browse/SPARK-1985
> Project: Spark
> Issue Type: Bug
> Components: Mesos
> Affects Versions: 1.0.0
> Environment: MESOS
> Reporter: Gerard Maas
>
> When trying to run the simple example from [1] on a Mesos installation, I get an
> error that "SPARK_HOME" is not set. A local Spark installation should not be
> required to run a job on Mesos. All that's needed is the executor package,
> i.e. the assembly .tar.gz, at a reachable location (HDFS/S3/HTTP).
> I went looking into the code, and indeed there's a check on SPARK_HOME [2]
> regardless of whether the assembly is present, even though SPARK_HOME is only
> used when the assembly is not provided (a kind of best-effort recovery strategy).
> Current flow:
>   if (!SPARK_HOME) fail("No SPARK_HOME")
>   else if (assembly) { use assembly }
>   else { try to use SPARK_HOME to build spark_executor }
> Should be:
>   sparkExecutor = if (assembly) { use assembly }
>                   else if (SPARK_HOME) { try to use SPARK_HOME to build spark_executor }
>                   else { fail("No executor found. Please provide spark.executor.uri (preferred) or spark.home") }
> [1]
> http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFoundException-with-Spark-Mesos-spark-shell-works-fine-td6165.html
> [2]
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala#L89
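For reference, a rough Scala sketch of the lookup order proposed in the quoted
description above; this is illustrative only and does not mirror Spark's actual
Mesos scheduler backend code:
{code}
import org.apache.spark.{SparkConf, SparkException}

// Illustrative only: prefer the executor assembly URI, fall back to spark.home,
// and fail only when neither is available.
def resolveExecutor(conf: SparkConf): String =
  conf.getOption("spark.executor.uri")
    .map(uri => s"use assembly at $uri")                  // no local install required
    .orElse(conf.getOption("spark.home")
      .map(home => s"build spark_executor from $home"))   // best-effort fallback
    .getOrElse(throw new SparkException(
      "No executor found. Please provide spark.executor.uri (preferred) or spark.home"))
{code}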
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)