From the error log, I think it was caused by missing jars such as hive-hcatalog. I think it is because you did not deploy your Hive environment on every node of your Hadoop cluster, as was the case for us. We solved this problem by modifying the source code to upload/add those dependency jars and files to tmpjars before every MapReduce job is submitted.
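
In case it helps, here is a minimal sketch of that idea in Java. It is not the actual KYLIN-1021 patch; the class name and the jar paths are placeholders you would have to adjust to your own environment:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;

import java.io.File;
import java.io.IOException;

// Hypothetical helper: ships locally available Hive/HCatalog jars with the job
// so map tasks can load classes like HCatInputFormat even on nodes that have
// no Hive installation. The jar locations below are assumptions.
public class DependencyJarShipper {

    private static final String[] LOCAL_JARS = {
        "/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar", // assumed path
        "/usr/lib/hive/lib/hive-exec.jar",                              // assumed path
        "/usr/lib/hive/lib/hive-metastore.jar"                          // assumed path
    };

    // Copy each jar to HDFS (if needed) and add it to the task classpath
    // before the job is submitted.
    public static void shipJars(Job job, Path hdfsTmpDir) throws IOException {
        Configuration conf = job.getConfiguration();
        FileSystem fs = FileSystem.get(conf);
        for (String localJar : LOCAL_JARS) {
            File local = new File(localJar);
            if (!local.exists()) {
                continue; // skip jars that are not present on the client machine
            }
            Path target = new Path(hdfsTmpDir, local.getName());
            fs.copyFromLocalFile(false, true, new Path(local.getAbsolutePath()), target);
            // Puts the jar in the distributed cache and on the task classpath,
            // similar in effect to passing -libjars on the command line.
            job.addFileToClassPath(target);
        }
    }
}

The effect is similar to passing -libjars to a Hadoop job: the jars end up in the distributed cache and on the task classpath, so the worker nodes no longer need a local Hive install.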
The JIRA ticket is here: https://issues.apache.org/jira/browse/KYLIN-1021
Hope it can help you ~

2015-10-16 15:52 GMT+08:00 LIU Ze (刘则) <[email protected]>:
> hi all:
> it makes an error in step 2
>
> kylin.log:
> ________________________________
> [pool-5-thread-9]:[2015-10-16 15:47:06,398][ERROR][org.apache.kylin.job.common.HadoopCmdOutput.updateJobCounter(HadoopCmdOutput.java:100)]
> - java.io.IOException: Unknown Job job_1444723293631_9288
>         at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.verifyAndGetJob(HistoryClientService.java:218)
>         at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.getCounters(HistoryClientService.java:232)
>         at org.apache.hadoop.mapreduce.v2.api.impl.pb.service.MRClientProtocolPBServiceImpl.getCounters(MRClientProtocolPBServiceImpl.java:159)
>         at org.apache.hadoop.yarn.proto.MRClientProtocol$MRClientProtocolService$2.callBlockingMethod(MRClientProtocol.java:281)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
>
>
> yarn logs -applicationId application_1444723293631_9286 :
>
> 2015-10-16 15:43:07,543 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
> 2015-10-16 15:43:07,826 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output Committer Algorithm version is 1
> 2015-10-16 15:43:07,834 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
> 2015-10-16 15:43:07,901 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatInputFormat not found
>         at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
>         at org.apache.hadoop.mapreduce.task.JobContextImpl.getInputFormatClass(JobContextImpl.java:174)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:749)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatInputFormat not found
>         at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
>         at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
>         ... 8 more
