Can you share the following information about your environment so that we can better help you with this issue? Simply looking at the Java exception may not be enough:
- Hadoop version
- Java version
- Memory allocations and heap size for the node manager and the containers
- How the job was run (Hive, Spark, or Java MR code)
- Whether you have restarted the node manager
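A quick way to gather most of that is sketched below. This is only a rough sketch: the exact script names and property values depend on your Hadoop version and distribution, so treat the paths and defaults as assumptions.

    hadoop version
    java -version
    # Memory settings usually live in yarn-site.xml and mapred-site.xml, for example:
    #   yarn.nodemanager.resource.memory-mb
    #   yarn.scheduler.maximum-allocation-mb
    #   mapreduce.map.memory.mb / mapreduce.reduce.memory.mb
    #   mapreduce.map.java.opts / mapreduce.reduce.java.opts
    # Restart the node manager (Hadoop 2.x script; Hadoop 3.x uses "yarn --daemon" instead):
    yarn-daemon.sh stop nodemanager
    yarn-daemon.sh start nodemanager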
Thanks,
Musty

 
  On Sat, Apr 2, 2016 at 1:15 PM, 169517388 <[email protected]> wrote:
    Hello hadoop.org. I'm a new guy who is learning Hadoop right now. I've built a 5-node experimental Hadoop environment, and when I ran my MR program, the error below came out. I have searched a lot; maybe it is the Java runtime environment or something else, but I didn't figure it out.
    16/04/02 16:03:37 INFO mapreduce.Job: Task Id : attempt_1459528700872_0003_m_000000_1, Status : FAILED
Exception from container-launch: ExitCodeException exitCode=65:
ExitCodeException exitCode=65:
 at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
 at org.apache.hadoop.util.Shell.run(Shell.java:455)
 at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
 at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
 at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
 at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
 at java.util.concurrent.FutureTask.run(FutureTask.java:262)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 at java.lang.Thread.run(Thread.java:745)
    I just don't know how to resolve it.
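    One place to dig further is the full container logs, which usually show the real cause behind the exitCode=65 container-launch failure. A rough sketch is below; the application id is derived from the attempt id above, and it assumes log aggregation is enabled (the local log path is a typical default, not guaranteed).

    # Aggregated container logs (if log aggregation is enabled):
    yarn logs -applicationId application_1459528700872_0003
    # Otherwise, check the NodeManager's local container logs on the node that ran the task,
    # typically under $HADOOP_HOME/logs/userlogs/application_1459528700872_0003/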
    By the way, is there any forum about Hadoop? I want to talk more about Hadoop with some friends. So please, I'm already going crazy, because I have been troubled by this problem for a week. Please help me, thank you very much.
    
    
HY Frank
From Shanghai, China
