First of all, thanks for your help. I would just like to ask: why is it important to start the Hadoop Job History server for Oozie jobs to run successfully?
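(As far as I understand, the MapReduce client falls back to the Job History server to fetch the status of completed jobs, so Oozie cannot confirm that an action finished while it is down. For reference, on a typical Hadoop 2.x single-node install the history server is started and checked roughly as sketched below; the exact path depends on the installation, and 10020 is only the stock default port for mapreduce.jobhistory.address. A couple of port checks against the quoted jobTracker setting follow at the very end of the thread.)

# start the MapReduce Job History server (standard Hadoop 2.x layout assumed)
$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver

# confirm the daemon is up; the list should now include JobHistoryServer
jps

# optionally check that something is listening on the default history server port
netstat -tln | grep 10020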
On Mon, Sep 7, 2015 at 11:22 AM, Oussama Chougna <[email protected]> wrote:

> You can get this error if your Hadoop Job History server is not running.
> Start the Job History server on hadoop and try again.
>
> Cheers,
> Oussama Chougna
>
> > Date: Sun, 6 Sep 2015 20:47:42 +0200
> > Subject: java.net.ConnectException appears after running java-main example in Oozie
> > From: [email protected]
> > To: [email protected]
> >
> > Dears,
> >
> > I have just built and set up Oozie 4.1.0. I'm trying to run the examples bundled with the source code.
> >
> > Checking the status of Oozie returns normal.
> >
> > I uploaded the examples folder to HDFS.
> >
> > I tried to submit two of the bundled examples (map reduce, java main), but they both failed for different reasons.
> >
> > The java main example was run through this command:
> >
> > oozie job -config apps/java-main/job.properties -run
> >
> > Job properties:
> >
> > nameNode=hdfs://localhost:9000
> > jobTracker=localhost:9001
> > queueName=default
> > examplesRoot=examples
> >
> > oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/java-main
> >
> > Workflow:
> >
> > <workflow-app xmlns="uri:oozie:workflow:0.2" name="java-main-wf">
> >     <start to="java-node"/>
> >     <action name="java-node">
> >         <java>
> >             <job-tracker>${jobTracker}</job-tracker>
> >             <name-node>${nameNode}</name-node>
> >             <configuration>
> >                 <property>
> >                     <name>mapred.job.queue.name</name>
> >                     <value>${queueName}</value>
> >                 </property>
> >             </configuration>
> >             <main-class>org.apache.oozie.example.DemoJavaMain</main-class>
> >             <arg>Hello</arg>
> >             <arg>Oozie!</arg>
> >         </java>
> >         <ok to="end"/>
> >         <error to="fail"/>
> >     </action>
> >     <kill name="fail">
> >         <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
> >     </kill>
> >     <end name="end"/>
> > </workflow-app>
> >
> > but I got this exception in the logs after waiting for some time:
> >
> > 2015-09-06 20:16:27,965 WARN ActionStartXCommand:544 - SERVER[akamal-laptop] USER[hduser] GROUP[-] TOKEN[] APP[java-main-wf] JOB[0000004-150906115439360-oozie-hdus-W] ACTION[0000004-150906115439360-oozie-hdus-W@java-node] Error starting action [java-node].
> > ErrorType [TRANSIENT], ErrorCode [JA006], Message [JA006: Call From akamal-laptop/127.0.1.1 to localhost:9001 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused]
> > org.apache.oozie.action.ActionExecutorException: JA006: Call From akamal-laptop/127.0.1.1 to localhost:9001 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >     at org.apache.oozie.action.ActionExecutor.convertExceptionHelper(ActionExecutor.java:412)
> >     at org.apache.oozie.action.ActionExecutor.convertException(ActionExecutor.java:392)
> >     at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:979)
> >     at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1134)
> >     at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:228)
> >     at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:63)
> >     at org.apache.oozie.command.XCommand.call(XCommand.java:281)
> >     at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >     at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.net.ConnectException: Call From akamal-laptop/127.0.1.1 to localhost:9001 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >     at sun.reflect.GeneratedConstructorAccessor73.newInstance(Unknown Source)
> >     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> >     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
> >     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
> >     at org.apache.hadoop.ipc.Client.call(Client.java:1410)
> >     at org.apache.hadoop.ipc.Client.call(Client.java:1359)
> >     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> >     at com.sun.proxy.$Proxy27.getDelegationToken(Unknown Source)
> >     at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getDelegationToken(ApplicationClientProtocolPBClientImpl.java:256)
> >     at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >     at java.lang.reflect.Method.invoke(Method.java:606)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> >     at com.sun.proxy.$Proxy28.getDelegationToken(Unknown Source)
> >     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getRMDelegationToken(YarnClientImpl.java:319)
> >     at org.apache.hadoop.mapred.ResourceMgrDelegate.getDelegationToken(ResourceMgrDelegate.java:162)
> >     at org.apache.hadoop.mapred.YARNRunner.getDelegationToken(YARNRunner.java:219)
> >     at org.apache.hadoop.mapreduce.Cluster.getDelegationToken(Cluster.java:400)
> >     at org.apache.hadoop.mapred.JobClient$16.run(JobClient.java:1203)
> >     at org.apache.hadoop.mapred.JobClient$16.run(JobClient.java:1200)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:415)
> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> >     at org.apache.hadoop.mapred.JobClient.getDelegationToken(JobClient.java:1199)
> >     at org.apache.oozie.service.HadoopAccessorService.createJobClient(HadoopAccessorService.java:375)
> >     at org.apache.oozie.action.hadoop.JavaActionExecutor.createJobClient(JavaActionExecutor.java:1177)
> >     at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:927)
> >     ... 8 more
> > Caused by: java.net.ConnectException: Connection refused
> >     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> >     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:740)
> >     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
> >     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
> >     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
> >     at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:601)
> >     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:696)
> >     at org.apache.hadoop.ipc.Client$Connection.access$2700(Client.java:367)
> >     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1458)
> >     at org.apache.hadoop.ipc.Client.call(Client.java:1377)
> >     ... 31 more
> > 2015-09-06 20:16:27,970 INFO ActionStartXCommand:541 - SERVER[akamal-laptop] USER[hduser] GROUP[-] TOKEN[] APP[java-main-wf] JOB[0000004-150906115439360-oozie-hdus-W] ACTION[0000004-150906115439360-oozie-hdus-W@java-node] Next Retry, Attempt Number [2] in [60,000] milliseconds
> > 2015-09-06 20:16:28,022 INFO ActionStartXCommand:541 - SERVER[akamal-laptop] USER[hduser] GROUP[-] TOKEN[] APP[java-main-wf] JOB[0000004-150906115439360-oozie-hdus-W] ACTION[0000004-150906115439360-oozie-hdus-W@java-node] Start action [0000004-150906115439360-oozie-hdus-W@java-node] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
> >
> > --
> > JAK ,
> >
> > Ahmed Kamal
> > Junior Big Data Engineer , BadrIT

--
JAK ,
Ahmed Kamal
Junior Big Data Engineer , BadrIT
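(A quick sanity check for the refused connection in the quoted trace. This is only a sketch: it assumes a Linux single-node setup, and 8032 is just the stock default for yarn.resourcemanager.address, not a value confirmed anywhere in this thread.)

# which Hadoop daemons are actually up? look for ResourceManager and JobHistoryServer
jps

# is anything listening on the address Oozie is told to use (jobTracker=localhost:9001)?
netstat -tln | grep 9001

# on a YARN cluster the jobTracker property in job.properties usually has to match
# yarn.resourcemanager.address (default port 8032), e.g. something like:
#   jobTracker=localhost:8032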
