Have you checked the link
http://192.168.56.101:9046/proxy/application_1432817967879_0003/ ?

You should get some clue from the logs of the 2 attempts.
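If the tracking page is not reachable, the container logs can usually be pulled with the YARN CLI instead (a sketch, assuming log aggregation is enabled on your cluster; the application id is the one from your log):

```shell
# Fetch the aggregated logs for the failed application, including the
# AM container's stdout/stderr where the real exit-code-1 cause shows up.
# Requires yarn.log-aggregation-enable=true in yarn-site.xml.
yarn logs -applicationId application_1432817967879_0003

# Without log aggregation, look in the NodeManager's local log directory
# on the node that ran the container (path depends on yarn.nodemanager.log-dirs;
# $HADOOP_HOME/logs/userlogs is a common default):
ls $HADOOP_HOME/logs/userlogs/application_1432817967879_0003/
```

Exit code 1 from the AM container is generic; the stderr of attempt 000001 and 000002 is what will actually say why it died.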

On Thu, May 28, 2015 at 6:42 AM, xeonmailinglist-gmail <
xeonmailingl...@gmail.com> wrote:

>  Hi,
>
> I am trying to launch a job that I have configured in Java, but I get
> an error related to the containers [1]. I don’t understand why I can’t
> submit the job. Why do I get this error? What can I do to fix it?
>
> Thanks,
>
> [1] Log of logs/yarn-xubuntu-nodemanager-hadoop-coc-1.log
>
> 15/05/28 09:28:21 INFO client.RMProxy: Connecting to ResourceManager at 
> /192.168.56.101:8032
> 15/05/28 09:28:24 INFO input.FileInputFormat: Total input paths to process : 5
> 15/05/28 09:28:24 INFO mapreduce.JobSubmitter: number of splits:5
> 15/05/28 09:28:24 INFO mapreduce.JobSubmitter: Submitting tokens for job: 
> job_1432817967879_0003
> 15/05/28 09:28:25 INFO impl.YarnClientImpl: Submitted application 
> application_1432817967879_0003
> 15/05/28 09:28:25 INFO mapreduce.Job: The url to track the job: 
> http://192.168.56.101:9046/proxy/application_1432817967879_0003/
> 15/05/28 09:28:25 INFO mapreduce.Job: Running job: job_1432817967879_0003
> 15/05/28 09:28:29 INFO mapreduce.Job: Job job_1432817967879_0003 running in 
> uber mode : false
> 15/05/28 09:28:29 INFO mapreduce.Job:  map 0% reduce 0%
> 15/05/28 09:28:29 INFO mapreduce.Job: Job job_1432817967879_0003 failed with 
> state FAILED due to: Application application_1432817967879_0003 failed 2 
> times due to AM Container for appattempt_1432817967879_0003_000002 
> exited with exitCode: 1
> For more detailed output, check application tracking page: 
> http://192.168.56.101:9046/proxy/application_1432817967879_0003/ 
> Then, click on links to logs of each attempt.
> Diagnostics: Exception from container-launch.
> Container id: container_1432817967879_0003_02_000001
> Exit code: 1
> Stack trace: ExitCodeException exitCode=1:
>         at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
>         at org.apache.hadoop.util.Shell.run(Shell.java:455)
>         at 
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
>         at 
> org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
>         at 
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
>         at 
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>         at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>         at java.lang.Thread.run(Thread.java:745)
>
