Hi All,

We are receiving the following errors when submitting a job to a YARN
cluster. We have HADOOP_CONF_DIR set, as well as the SPARK_HOME env
variable, so the client can find the config for our Hadoop/YARN cluster.

/usr/local/Cellar/spark/spark-1.5.2-bin-without-hadoop/bin/spark-submit
--master yarn-cluster --deploy-mode cluster spark-demo-all.jar
hdfs://node01.test.local/some/data/path/*

First error:
15/12/02 10:03:08 INFO yarn.Client: Source and destination file systems are
the same. Not copying
file:/usr/local/Cellar/spark/spark-1.5.2-bin-hadoop2.6/lib/spark-assembly-1.5.2-hadoop2.6.0.jar
15/12/02 10:03:08 INFO yarn.Client: Source and destination file systems are
the same. Not copying
file:/Users/test/Documents/workspace/messing/spark-demo/build/libs/spark-demo-all.jar
15/12/02 10:03:08 INFO yarn.Client: Source and destination file systems are
the same. Not copying
file:/private/var/folders/vz/98vxzfxx7mn7g5kkmww_tk3jgx53gq/T/spark-f3ecc91c-e4e8-4f31-9f09-46f9ed763f70/__spark_conf__6763843664319520189.zip

Second error:
diagnostics: Application application_1448959809546_0014 failed 2 times due
to AM Container for appattempt_1448959809546_0014_000002 exited with
 exitCode: -1000
For more detailed output, check the application tracking page:
http://node02.test.local:8088/cluster/app/application_1448959809546_0014
Then, click on links to logs of each attempt.
Diagnostics: java.io.FileNotFoundException: File
file:/usr/local/Cellar/spark/spark-1.5.2-bin-hadoop2.6/lib/spark-assembly-1.5.2-hadoop2.6.0.jar
does not exist
Failing this attempt. Failing the application.

If we submit the same job from one of the Ambari/Hadoop/Spark nodes, it
runs fine. But submitting it from a client external to the cluster fails
with the error messages above.

Is there a workaround for this?
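In case it helps frame the question, the kind of workaround we were imagining (untested sketch; the /spark/lib HDFS directory is just an example path) would be to pre-stage the assembly jar on HDFS and point spark.yarn.jar at it, so YARN fetches it from HDFS instead of trying to localize a file: path that only exists on the client:

```shell
# Untested sketch -- /spark/lib is an example HDFS location, not something
# we actually have set up yet.

# Copy the Spark assembly jar to HDFS so the NodeManagers can fetch it:
hdfs dfs -mkdir -p /spark/lib
hdfs dfs -put \
  /usr/local/Cellar/spark/spark-1.5.2-bin-without-hadoop/lib/spark-assembly-*.jar \
  /spark/lib/

# Then tell spark-submit to use the HDFS copy instead of the local file: path:
/usr/local/Cellar/spark/spark-1.5.2-bin-without-hadoop/bin/spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.jar=hdfs://node01.test.local/spark/lib/spark-assembly-1.5.2-hadoop2.6.0.jar \
  spark-demo-all.jar hdfs://node01.test.local/some/data/path/*
```

Is something along those lines the intended approach, or is there a better way?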

Thanks!
