I just created the /tmp directory based on the earlier error (not sure whether
that was actually required), but the job is now failing with a "Permission
denied" error. I'm logged in as root.

usage: CreateHTableJob
 -cubename <name>            Cube name. For example, flat_item_cube
 -htablename <htable name>   HTable name
 -input <path>               Partition file path.
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: /tmp/hbase-hbase/local/jars/tmp/.cff0f893-39e8-427c-b7c1-9d37dbdaf206.kylin-coprocessor-1.1-incubating-SNAPSHOT-1.jar.1444843461862.jar *(Permission denied)* Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks
        at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
        at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
        at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
        at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
        at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
        at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
        at java.lang.Thread.run(Thread.java:745)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:226)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:240)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:140)
        at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
        at org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsyncV2(HBaseAdmin.java:636)
        at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:557)
        at org.apache.kylin.job.hadoop.hbase.CreateHTableJob.run(CreateHTableJob.java:150)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:62)
        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
        at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:51)
        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
        at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:130)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
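
For what it's worth, the sanity-check message above says the check itself can
be bypassed via hbase.table.sanity.checks. If someone wants that workaround
while the permission problem on /tmp/hbase-hbase/local/jars/tmp is being
sorted out, a minimal sketch would be to add the property to hbase-site.xml on
the HBase master and restart it (that placement is my assumption based only on
the error text, not something I have verified here):

        <property>
          <name>hbase.table.sanity.checks</name>
          <value>false</value>
        </property>

That only skips the check that surfaces the error, though; the coprocessor jar
under the HBase local jars directory would presumably still need to be
readable by the hbase user.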



