Hi, all

In our deployment, Hive uses HDFS cluster A while HBase uses HDFS cluster B.
In the "Create HTable" step, Kylin references the coprocessor jar at
"/tmp/kylin/kylin_metadata/coprocessor/kylin-coprocessor-1.0-incubating-${num}.jar".
That jar exists on HDFS A but not on HDFS B, so the step fails with the error below:


pool-5-thread-10]:[2015-10-28 17:36:24,371][ERROR][org.apache.kylin.job.hadoop.hbase.CreateHTableJob.run(CreateHTableJob.java:126)] - org.apache.hadoop.hbase.DoNotRetryIOException: java.io.FileNotFoundException: File does not exist: hdfs://wanda/tmp/kylin/kylin_metadata/coprocessor/kylin-coprocessor-1.0-incubating-2.jar
        at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1910)
        at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1850)
        at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:2007)
        at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:41479)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2093)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
        at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.FileNotFoundException: File does not exist: hdfs://wanda/tmp/kylin/kylin_metadata/coprocessor/kylin-coprocessor-1.0-incubating-2.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1110)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
        at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2086)
        at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2055)
        at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2031)
        at org.apache.hadoop.hbase.util.CoprocessorClassLoader.init(CoprocessorClassLoader.java:168)
        at org.apache.hadoop.hbase.util.CoprocessorClassLoader.getClassLoader(CoprocessorClassLoader.java:250)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.testTableCoprocessorAttrs(RegionCoprocessorHost.java:305)
        at org.apache.hadoop.hbase.master.HMaster.checkClassLoading(HMaster.java:1998)
        at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1908)
        ... 11 more
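
As a possible workaround, assuming the HMaster sanity check only needs the jar to be readable on HBase's HDFS (cluster B), the jar could be copied from cluster A to the same path on cluster B before the "Create HTable" step runs. A minimal sketch using the Hadoop FileSystem API; the "hdfs://clusterA" and "hdfs://clusterB" URIs are placeholders for the two namenode addresses:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;

public class CopyCoprocessorJar {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // HDFS A: the cluster Kylin/Hive writes the coprocessor jar to (placeholder URI)
        FileSystem srcFs = FileSystem.get(URI.create("hdfs://clusterA"), conf);
        // HDFS B: the cluster HBase reads from (placeholder URI)
        FileSystem dstFs = FileSystem.get(URI.create("hdfs://clusterB"), conf);

        // Same path on both clusters, matching what the table descriptor references
        Path jar = new Path("/tmp/kylin/kylin_metadata/coprocessor/"
                + "kylin-coprocessor-1.0-incubating-2.jar");

        // Copy without deleting the source, overwriting any stale copy on cluster B
        FileUtil.copy(srcFs, jar, dstFs, jar, false, true, conf);
    }
}

This is only a sketch of the idea; the same copy could be done with distcp or a plain hdfs put, as long as the jar ends up at the exact path the coprocessor attribute points to.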