I'm not sure that's what I want.  I tried to run Spark Shell on a machine other
than the master and got the same error.  The "192" was supposed to be a
simple shell script change that alters SPARK_HOME before submitting jobs.
Too bad it isn't there anymore.
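For what it's worth, here's a minimal sketch of the kind of wrapper I mean — a script that overrides SPARK_HOME before handing off to spark-submit. The path and script name are my own placeholders, not anything from the original change:

```shell
#!/bin/sh
# submit-with-home.sh (hypothetical name): force a specific Spark build
# by overriding SPARK_HOME before submitting.
# /opt/spark-custom is a placeholder path, not from the original patch.
export SPARK_HOME=/opt/spark-custom
export PATH="$SPARK_HOME/bin:$PATH"
echo "Using SPARK_HOME=$SPARK_HOME"
# Then hand all arguments to that build's spark-submit:
# exec "$SPARK_HOME"/bin/spark-submit "$@"
```

That's just the general shape; the actual change I'm after may have done more than this.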

The build described in the pull request (16440) seems to have failed, so I
can't use it.

I am looking for those shell script changes.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/executor-failed-cannot-find-compute-classpath-sh-tp859p9277.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.