Hi CJ,

Looks like I overlooked a few lines in the spark-shell case. It appears that
spark-shell explicitly overwrites "spark.home" with whatever SPARK_HOME is
set to:
https://github.com/apache/spark/blob/f4f46dec5ae1da48738b9b650d3de155b59c4674/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L955
I have filed a JIRA to track this issue:
https://issues.apache.org/jira/browse/SPARK-2454.
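
For context, the overwrite looks roughly like this (paraphrasing from
memory; see the link for the exact source):

    if (System.getenv("SPARK_HOME") != null) {
      // setSparkHome sets "spark.home" directly on the SparkConf,
      // silently overriding any value from conf/spark-defaults.conf
      conf.setSparkHome(System.getenv("SPARK_HOME"))
    }
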
Until that is fixed, you can work around it by ensuring SPARK_HOME is not
set anywhere.

We currently set it in bin/spark-submit and bin/spark-class, so go ahead and
remove these lines:

https://github.com/apache/spark/blob/f4f46dec5ae1da48738b9b650d3de155b59c4674/bin/spark-submit#L20
https://github.com/apache/spark/blob/f4f46dec5ae1da48738b9b650d3de155b59c4674/bin/spark-class#L31
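
In both scripts, the line to delete should be the SPARK_HOME export, which
(if memory serves; the links above point at the exact lines) looks like:

    export SPARK_HOME="$(cd `dirname $0`/..; pwd)"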

In addition, make sure spark-submit no longer uses the SPARK_HOME variable
here:

https://github.com/apache/spark/blob/f4f46dec5ae1da48738b9b650d3de155b59c4674/bin/spark-submit#L44
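
That line uses $SPARK_HOME to locate spark-class, but you can resolve the
path relative to the script itself instead. A sketch, where $ORIG_ARGS
stands in for whatever argument list your copy of the script forwards:

    # resolve the bin/ directory containing this script
    SPARK_BIN="$(cd "$(dirname "$0")" && pwd)"
    exec "$SPARK_BIN/spark-class" org.apache.spark.deploy.SparkSubmit $ORIG_ARGS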

Now, as you have done before, setting "spark.home" to your executors' Spark
home should do the job. I have verified that this solves the problem in my
own cluster.
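
Concretely, that means keeping the line you already added, pointing at the
path where Spark lives on the cluster machines (not on your local machine):

    # conf/spark-defaults.conf on the machine you launch from
    spark.home    /root/spark-1.0.0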

To verify that your configs are in fact set, you can always run
bin/spark-submit (or bin/spark-shell, which calls spark-submit) with the
--verbose flag.
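
For example, using the master URL from your message:

    bin/spark-shell --master spark://sjc1-eng-float01.carrieriq.com:7077 --verbose

With --verbose, spark-submit prints out the properties it loaded, so your
spark.home value should show up in that output.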

Let me know if this fixes it. I will get to fixing the root problem soon.

Andrew



2014-07-10 18:43 GMT-07:00 cjwang <c...@cjwang.us>:

> Andrew,
>
> Thanks for replying.  I did the following and the result was still the
> same.
>
> 1. Added "spark.home /root/spark-1.0.0" to local conf/spark-defaults.conf,
> where "/root...." was the place in the cluster where I put Spark.
>
> 2. Ran "bin/spark-shell --master
> spark://sjc1-eng-float01.carrieriq.com:7077".
>
> 3. Sighed when I still saw the same error:
>
> 14/07/10 18:26:53 INFO AppClient$ClientActor: Executor updated:
> app-20140711012651-0007/5 is now FAILED (class java.io.IOException: Cannot
> run program "/Users/cwang/spark/bin/compute-classpath.sh" (in directory
> "."): error=2, No such file or directory)
>
> /Users/cwang/spark was my local SPARK_HOME, which is wrong.
>
> What did I do wrong?  How do I know if the config file is taken?
>
> I am a novice to Spark, so bear with me.
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/executor-failed-cannot-find-compute-classpath-sh-tp859p9378.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
