This means the Spark workers exited with code 15; that is most likely nothing
YARN-related itself (unless there are classpath-related problems).

Have a look at the logs of the application/container via the ResourceManager.
You can also increase how long logs are kept on the nodes themselves, e.g. to
10 minutes or longer, via yarn-site.xml:

<property>
  <name>yarn.nodemanager.delete.debug-delay-sec</name>
  <value>600</value>
</property>
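
Once the application has finished, the container logs can also be pulled with
the `yarn logs` CLI, a sketch assuming log aggregation is enabled on the
cluster (the application ID below is the one from the report in the quoted
message):

```shell
# Fetch aggregated logs for the failed application
# (requires yarn.log-aggregation-enable=true in yarn-site.xml).
yarn logs -applicationId application_1427895906171_0087

# Without aggregation, look on the NodeManager host under its local
# log directory, whose exact path is installation-dependent, e.g.:
#   $HADOOP_HOME/logs/userlogs/application_1427895906171_0087/
```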



> On 8 Apr 2015, at 07:24, sachin Singh <sachin.sha...@gmail.com> wrote:
> 
> Hi,
> I noticed that we have only a single-node installation. When submitting the
> job as yarn-cluster I get the error below. Is the single-node installation
> the cause? If not, please correct me: why am I not able to run in cluster
> mode?
> The spark-submit command is:
> spark-submit --jars ....some dependent jars... --master yarn --class
> com.java.jobs.sparkAggregation mytest-1.0.0.jar
> 
> 2015-04-08 19:16:50 INFO  Client - Application report for application_1427895906171_0087 (state: FAILED)
> 2015-04-08 19:16:50 DEBUG Client - 
>        client token: N/A
>        diagnostics: Application application_1427895906171_0087 failed 2 times due to AM Container for appattempt_1427895906171_0087_000002 exited with exitCode: 15 due to: Exception from container-launch.
> Container id: container_1427895906171_0087_02_000001
> Exit code: 15
> Stack trace: ExitCodeException exitCode=15: 
>       at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
>       at org.apache.hadoop.util.Shell.run(Shell.java:455)
>       at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
>       at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:197)
>       at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:299)
>       at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> 
> 
> Container exited with a non-zero exit code 15.
> Failing this attempt. Failing the application.
>        ApplicationMaster host: N/A
>        ApplicationMaster RPC port: -1
>        queue: root.hdfs
>        start time: 1428500770818
>        final status: FAILED
> 
> 
> Exception in thread "main" org.apache.spark.SparkException: Application finished with failed status
>        at org.apache.spark.deploy.yarn.ClientBase$class.run(ClientBase.scala:509)
>        at org.apache.spark.deploy.yarn.Client.run(Client.scala:35)
>        at org.apache.spark.deploy.yarn.Client$.main(Client.scala:139)
>        at org.apache.spark.deploy.yarn.Client.main(Client.scala)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:606)
>        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/need-info-on-Spark-submit-on-yarn-cluster-mode-tp22420.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
