[ https://issues.apache.org/jira/browse/SPARK-1985?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14127455#comment-14127455 ]

Chip Senkbeil commented on SPARK-1985:
--------------------------------------

Does anyone know what the status of this is?

> SPARK_HOME shouldn't be required when spark.executor.uri is provided
> --------------------------------------------------------------------
>
>                 Key: SPARK-1985
>                 URL: https://issues.apache.org/jira/browse/SPARK-1985
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>         Environment: MESOS
>            Reporter: Gerard Maas
>              Labels: mesos
>
> When trying to run a simple example on a Mesos installation [1], I get an 
> error that "SPARK_HOME" is not set. A local Spark installation should not be 
> required to run a job on Mesos; all that's needed is the executor package, 
> i.e. the assembly .tar.gz at a reachable location (HDFS/S3/HTTP).
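> For reference, a minimal sketch of pointing a job at a prebuilt assembly 
> (the master URL and assembly path below are placeholders, not values from 
> this report):
>
>     import org.apache.spark.{SparkConf, SparkContext}
>
>     val conf = new SparkConf()
>       .setMaster("mesos://master-host:5050")
>       .setAppName("SimpleJob")
>       // Tell Mesos slaves where to fetch the Spark executor package from.
>       .set("spark.executor.uri", "hdfs://namenode/spark/spark-assembly-1.0.0.tar.gz")
>     val sc = new SparkContext(conf)
>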
> Looking into the code, there is indeed a check on SPARK_HOME [2] regardless 
> of the presence of the assembly, even though SPARK_HOME is only used when 
> the assembly is not provided (a kind of best-effort recovery strategy).
> Current flow:
>     if (!SPARK_HOME) { fail("No SPARK_HOME") }
>     else if (assembly) { use assembly }
>     else { try to use SPARK_HOME to build spark_executor }
> Should be:
>     sparkExecutor = if (assembly) { assembly }
>                     else if (SPARK_HOME) { try to use SPARK_HOME to build spark_executor }
>                     else { fail("No executor found. Please provide spark.executor.uri (preferred) or spark.home") }
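> A minimal Scala sketch of that resolution order (illustrative only; 
> resolveExecutorCommand and its body are my assumption of the shape, not the 
> actual MesosSchedulerBackend code):
>
>     import java.io.File
>     import org.apache.spark.{SparkContext, SparkException}
>
>     def resolveExecutorCommand(sc: SparkContext): String = {
>       val uri = sc.getConf.getOption("spark.executor.uri")
>       // spark.home property first, then the SPARK_HOME environment variable.
>       val sparkHome = sc.getConf.getOption("spark.home")
>         .orElse(sys.env.get("SPARK_HOME"))
>       (uri, sparkHome) match {
>         case (Some(u), _) =>
>           // Mesos fetches and extracts the assembly on the slave; run the
>           // executor from the extracted directory, so no local install is needed.
>           val basename = u.split('/').last.split('.').head
>           s"cd $basename*; ./sbin/spark-executor"
>         case (None, Some(home)) =>
>           // Best-effort fallback: a local installation present on every slave.
>           new File(home, "sbin/spark-executor").getCanonicalPath
>         case (None, None) =>
>           throw new SparkException("No executor found. Please provide " +
>             "spark.executor.uri (preferred) or spark.home")
>       }
>     }
>
> With this order, a job that ships an assembly via spark.executor.uri never 
> touches SPARK_HOME, and the failure message only appears when neither 
> option is available.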
> [1] http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFoundException-with-Spark-Mesos-spark-shell-works-fine-td6165.html
> [2] https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala#L89


