Most likely your JAVA_HOME variable is wrong. Can you configure it in the spark-env.sh file?
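For example, a minimal conf/spark-env.sh entry would look like the sketch below (the JDK path is only an assumption -- check what is actually installed on your machine, e.g. with `ls /usr/java` or `which java`, and use that path instead):

    # conf/spark-env.sh
    # Point Spark at a JDK that actually exists on this node.
    # /usr/lib/jvm/jdk1.7.0_25 is just an example path -- adjust to your install.
    export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_25

Before re-running the job you can confirm the setting with `ls $JAVA_HOME/bin/java`; if that file exists, spark-class should be able to exec it.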
Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>

On Tue, May 6, 2014 at 5:53 PM, Sophia <sln-1...@163.com> wrote:
> Hi all,
> [root@sophia spark-0.9.1]#
> SPARK_JAR=.assembly/target/scala-2.10/spark-assembly_2.10-0.9.1-hadoop2.2.0.jar \
>   ./bin/spark-class org.apache.spark.deploy.yarn.Client \
>   --jar examples/target/scala-2.10/spark-examples_2.10-assembly-0.9.1.jar \
>   --class org.apache.spark.examples.SparkPi \
>   --args yarn-standalone \
>   --num-workers 3 \
>   --master-memory 2g \
>   --worker-memory 2g \
>   --worker-cores 1
> ./bin/spark-class: line 152: /usr/java/jdk1.7.0_25/bin/java: No such file or directory
> ./bin/spark-class: line 152: exec: /usr/java/jdk1.7.0_25/bin/java: cannot execute: No such file or directory
> Is this because my file has been corrupted?
> What can I do about it?
> Best regards,
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/If-it-due-to-my-file-has-been-breakdown-tp5438.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.