I modified the 1.x-HBase1.x branch so that "hadoop.tmp.dir" and
"hbase.fs.tmp.dir" default to "/tmp" when missing. This showstopper should
not happen again.
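
Roughly, the change boils down to defaulting the two properties before the job
reads them. A minimal sketch of the idea in Java (class and method names are
made up for illustration; this is not the actual commit):

    import org.apache.hadoop.conf.Configuration;

    // Sketch only: fall back to /tmp when the site XMLs set neither property.
    public final class TmpDirDefaults {
        static void applyTmpDirDefaults(Configuration conf) {
            if (conf.get("hbase.fs.tmp.dir") == null) {
                conf.set("hbase.fs.tmp.dir", "/tmp");   // read by HBase 1.1.1 and later
            }
            if (conf.get("hadoop.tmp.dir") == null) {
                conf.set("hadoop.tmp.dir", "/tmp");     // read by HBase 1.1.0 and earlier
            }
        }
    }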

On Fri, Oct 23, 2015 at 1:18 PM, Li Yang <[email protected]> wrote:

> To sum up what's in https://issues.apache.org/jira/browse/KYLIN-953: to
> resolve the issue, you need one of the two Hadoop configs below in your
> site XMLs.
>
> - hadoop.tmp.dir    (for HBase 1.1.0 and before)
> - hbase.fs.tmp.dir  (for HBase 1.1.1 and after)
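>
> For example, a property block like the one below should be enough, placed in
> the site XML your jobs read (typically hbase-site.xml for hbase.fs.tmp.dir,
> core-site.xml for hadoop.tmp.dir). The /tmp value is only an illustration;
> any writable scratch directory works:
>
>   <property>
>     <name>hbase.fs.tmp.dir</name>
>     <value>/tmp</value>
>   </property>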
>
> On Wed, Oct 21, 2015 at 1:07 AM, Shailesh Dangi <[email protected]>
> wrote:
>
>> I'm now hitting a documented issue and trying to apply the fix suggested in
>> the JIRA
>>
>> https://issues.apache.org/jira/browse/KYLIN-953
>>
>> When the cube job runs the "Convert Cuboid Data to HFile" step, it throws
>> an error like the one below:
>> [pool-5-thread-8]:[2015-08-18 09:43:15,854][ERROR]
>> [org.apache.kylin.job.hadoop.cube.CubeHFileJob.run(CubeHFileJob.java:98)]
>> - error in CubeHFileJob
>> java.lang.IllegalArgumentException: Can not create a Path from a null string
>> at org.apache.hadoop.fs.Path.checkPathArg(Path.java:123)
>>
>> On Tue, Oct 20, 2015 at 10:49 AM, sdangi <[email protected]> wrote:
>>
>> > Hi Luke -- I will be looking into this later today, but here is the
>> > progress (or lack thereof) so far:
>> >
>> > 1)   cd /home/worker1/kylin/1.x-HBase1.x
>> >
>> > 2)   [root@worker1 1.x-HBase1.x]# git clone -b 1.x-HBase1.x
>> > https://github.com/apache/incubator-kylin.git .
>> >
>> > [root@worker1 1.x-HBase1.x]# ls -ltr
>> >
>> > total 88
>> >
>> > -rw-r--r--  1 root root   849 Oct 20 10:13 README.md
>> >
>> > -rw-r--r--  1 root root   180 Oct 20 10:13 NOTICE
>> >
>> > -rw-r--r--  1 root root 12401 Oct 20 10:13 LICENSE
>> >
>> > -rw-r--r--  1 root root  7290 Oct 20 10:13 KEYS
>> >
>> > -rw-r--r--  1 root root   539 Oct 20 10:13 DISCLAIMER
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 atopcalcite
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 common
>> >
>> > drwxr-xr-x  2 root root  4096 Oct 20 10:13 bin
>> >
>> > drwxr-xr-x  2 root root    54 Oct 20 10:13 conf
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 cube
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 dictionary
>> >
>> > drwxr-xr-x  2 root root    23 Oct 20 10:13 deploy
>> >
>> > drwxr-xr-x  2 root root    22 Oct 20 10:13 docs
>> >
>> > drwxr-xr-x  4 root root    62 Oct 20 10:13 examples
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 invertedindex
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 jdbc
>> >
>> > drwxr-xr-x  4 root root    96 Oct 20 10:13 job
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 metadata
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 monitor
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 query
>> >
>> > -rw-r--r--  1 root root 39837 Oct 20 10:13 pom.xml
>> >
>> > drwxr-xr-x  2 root root    98 Oct 20 10:13 script
>> >
>> > drwxr-xr-x  4 root root    69 Oct 20 10:13 server
>> >
>> > drwxr-xr-x  3 root root    17 Oct 20 10:13 src
>> >
>> > drwxr-xr-x  4 root root    46 Oct 20 10:13 storage
>> >
>> > drwxr-xr-x  3 root root  4096 Oct 20 10:13 webapp
>> >
>> > drwxr-xr-x 16 root root  4096 Oct 20 10:13 website
>> >
>> >
>> > Built using Maven:
>> >
>> > [INFO] --- maven-assembly-plugin:2.5.5:single (make-assembly) @ kylin-monitor ---
>> >
>> > [WARNING] Artifact: org.apache.kylin:kylin-monitor:jar:1.1-incubating-SNAPSHOT
>> > references the same file as the assembly destination file. Moving it to a
>> > temporary location for inclusion.
>> >
>> > [INFO] Building jar:
>> > /home/worker1/kylin/1.x-HBase1.x/monitor/target/kylin-monitor-1.1-incubating-SNAPSHOT.jar
>> >
>> > [WARNING] Configuration options: 'appendAssemblyId' is set to false, and
>> > 'classifier' is missing. Instead of attaching the assembly file:
>> > /home/worker1/kylin/1.x-HBase1.x/monitor/target/kylin-monitor-1.1-incubating-SNAPSHOT.jar,
>> > it will become the file for main project artifact.
>> > NOTE: If multiple descriptors or descriptor-formats are provided for this
>> > project, the value of this file will be non-deterministic!
>> >
>> > [WARNING] Replacing pre-existing project main-artifact file:
>> > /home/worker1/kylin/1.x-HBase1.x/monitor/target/archive-tmp/kylin-monitor-1.1-incubating-SNAPSHOT.jar
>> > with assembly file:
>> > /home/worker1/kylin/1.x-HBase1.x/monitor/target/kylin-monitor-1.1-incubating-SNAPSHOT.jar
>> >
>> > [INFO]
>> >
>> > [INFO] --- maven-jar-plugin:2.4:test-jar (default) @ kylin-monitor ---
>> >
>> > [INFO] Building jar:
>> > /home/worker1/kylin/1.x-HBase1.x/monitor/target/kylin-monitor-1.1-incubating-SNAPSHOT-tests.jar
>> >
>> > [INFO] ------------------------------------------------------------------------
>> > [INFO] Reactor Summary:
>> > [INFO]
>> > [INFO] Kylin:HadoopOLAPEngine ............................. SUCCESS [ 0.864 s]
>> > [INFO] Kylin:AtopCalcite .................................. SUCCESS [ 5.439 s]
>> > [INFO] Kylin:Common ....................................... SUCCESS [ 7.231 s]
>> > [INFO] Kylin:Metadata ..................................... SUCCESS [ 1.428 s]
>> > [INFO] Kylin:Dictionary ................................... SUCCESS [ 1.559 s]
>> > [INFO] Kylin:Cube ......................................... SUCCESS [ 2.344 s]
>> > [INFO] Kylin:InvertedIndex ................................ SUCCESS [ 0.523 s]
>> > [INFO] Kylin:Job .......................................... SUCCESS [ 3.889 s]
>> > [INFO] Kylin:Storage ...................................... SUCCESS [ 2.018 s]
>> > [INFO] Kylin:Query ........................................ SUCCESS [ 1.278 s]
>> > [INFO] Kylin:JDBC ......................................... SUCCESS [ 1.901 s]
>> > [INFO] Kylin:RESTServer ................................... SUCCESS [ 8.819 s]
>> > [INFO] Kylin:Monitor ...................................... SUCCESS [ 1.038 s]
>> > [INFO] ------------------------------------------------------------------------
>> > [INFO] BUILD SUCCESS
>> > [INFO] ------------------------------------------------------------------------
>> > [INFO] Total time: 38.658 s
>> > [INFO] Finished at: 2015-10-20T10:17:54-04:00
>> > [INFO] Final Memory: 132M/2053M
>> > [INFO] ------------------------------------------------------------------------
>> >
>> > Imported the sample cube and ran the job. It gets to the 13th step and
>> > fails while creating the HBase tables - seems like a permission issue.
>> >
>> >
>> > ==> kylin.log <==
>> >
>> > [pool-7-thread-10]:[2015-10-20 10:44:58,078][INFO][org.apache.kylin.job.tools.DeployCoprocessorCLI.deployCoprocessor(DeployCoprocessorCLI.java:99)]
>> > - hbase table [B@a092af6 deployed with coprocessor.
>> >
>> > usage: CreateHTableJob
>> >
>> >  -cubename <name>            Cube name. For exmaple, flat_item_cube
>> >
>> >  -htablename <htable name>   HTable name
>> >
>> >  -input <path>               Partition file path.
>> >
>> > org.apache.hadoop.hbase.DoNotRetryIOException:
>> > org.apache.hadoop.hbase.DoNotRetryIOException: Mkdirs failed to create
>> > /tmp/hbase-hbase/local/jars/tmp (exists=false, cwd=file:/home/hbase) Set
>> > hbase.table.sanity.checks to false at conf or table descriptor if you want
>> > to bypass sanity checks
>> > at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
>> > at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
>> > at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
>> > at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
>> > at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
>> > at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>> > at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>> > at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>> > at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>> > at java.lang.Thread.run(Thread.java:745)
>> >
>> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> > at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>> > at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>> > at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>> > at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:226)
>> > at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:240)
>> > at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:140)
>> > at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
>> > at org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsyncV2(HBaseAdmin.java:636)
>> > at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:557)
>> > at org.apache.kylin.job.hadoop.hbase.CreateHTableJob.run(CreateHTableJob.java:150)
>> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>> > at org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:62)
>> > at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>> > at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:51)
>> > at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>> > at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:130)
>> > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> > at java.lang.Thread.run(Thread.java:745)
>> > Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException):
>> > org.apache.hadoop.hbase.DoNotRetryIOException: Mkdirs failed to create
>> > /tmp/hbase-hbase/local/jars/tmp (exists=false, cwd=file:/home/hbase) Set
>> > hbase.table.sanity.checks to false at conf or table descriptor if you want
>> > to bypass sanity checks
>> > at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
>> > at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
>> > at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
>> > at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
>> > at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
>> > at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>> > at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>> > at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>> > at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>> > at java.lang.Thread.run(Thread.java:745)
>> >
>> > at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1226)
>> > at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
>> > at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
>> > at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.createTable(MasterProtos.java:51086)
>> > at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$4.createTable(ConnectionManager.java:1802)
>> > at org.apache.hadoop.hbase.client.HBaseAdmin$4.call(HBaseAdmin.java:641)
>> > at org.apache.hadoop.hbase.client.HBaseAdmin$4.call(HBaseAdmin.java:637)
>> > at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>> > ... 14 more
>> >
>> >
>> > ==> kylin_job.log <==
>> >
>> > [pool-7-thread-10]:[2015-10-20 10:44:58,113][ERROR][org.apache.kylin.job.hadoop.hbase.CreateHTableJob.run(CreateHTableJob.java:157)]
>> > - org.apache.hadoop.hbase.DoNotRetryIOException: Mkdirs failed to create
>> > /tmp/hbase-hbase/local/jars/tmp (exists=false, cwd=file:/home/hbase) Set
>> > hbase.table.sanity.checks to false at conf or table descriptor if you want
>> > to bypass sanity checks
>> > at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
>> > at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
>> > at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
>> > at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
>> > at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
>> > at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>> > at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>> > at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>> > at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>> > at java.lang.Thread.run(Thread.java:745)
>> >
>> > [... full DoNotRetryIOException stack trace snipped; identical to the
>> > trace shown above from kylin.log ...]
>> >
>> >
>> > ==> kylin.log <==
>> >
>> > [pool-7-thread-10]:[2015-10-20 10:44:58,113][ERROR][org.apache.kylin.job.hadoop.hbase.CreateHTableJob.run(CreateHTableJob.java:157)]
>> > - org.apache.hadoop.hbase.DoNotRetryIOException: Mkdirs failed to create
>> > /tmp/hbase-hbase/local/jars/tmp (exists=false, cwd=file:/home/hbase) Set
>> > hbase.table.sanity.checks to false at conf or table descriptor if you want
>> > to bypass sanity checks
>> > [... same stack traces as in kylin_job.log above, snipped ...]
>> >
>> > 2015-10-20 10:44:58,115 INFO  [pool-7-thread-10] client.ConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
>> >
>> > 2015-10-20 10:44:58,115 INFO  [pool-7-thread-10] client.ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x15067463b2f001b
>> >
>> > 2015-10-20 10:44:58,145 INFO  [pool-7-thread-10] zookeeper.ZooKeeper: Session: 0x15067463b2f001b closed
>> >
>> > 2015-10-20 10:44:58,145 INFO  [pool-7-thread-10-EventThread] zookeeper.ClientCnxn: EventThread shut down
>> >
>> > [pool-7-thread-10]:[2015-10-20 10:44:58,161][DEBUG][org.apache.kylin.common.persistence.ResourceStore.putResource(ResourceStore.java:195)]
>> > - Saving resource /execute_output/476fa1ea-25ea-4858-b780-51028c298274-12 (Store kylin_metadata@hbase)
>> >
>> > [pool-7-thread-10]:[2015-10-20 10:44:58,172][DEBUG][org.apache.kylin.common.persistence.ResourceStore.putResource(ResourceStore.java:195)]
>> > - Saving resource /execute_output/476fa1ea-25ea-4858-b780-51028c298274-12 (Store kylin_metadata@hbase)
>> >
>> > On Sat, Oct 17, 2015 at 11:13 AM, Luke Han [via Apache Kylin (Incubating)]
>> > <[email protected]> wrote:
>> >
>> > > Hi Shailesh,
>> > >     If timing is a concern, we strongly suggest downgrading your HBase
>> > > to 0.98 with Kylin. The 1.x branch is not fully tested yet.
>> > >
>> > >     If you still would like to try with HBase 1.x, please clone this
>> > > branch:
>> > >     https://github.com/apache/incubator-kylin/tree/1.x-HBase1.x
>> > >
>> > >     Then run ./script/package.sh to generate the binary package, copy
>> > > the package from the dist folder, and install it on your Hadoop
>> > > cluster.
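>> > >
>> > >     A condensed sketch of those steps (the checkout directory name is
>> > > just an example; adjust paths to your environment):
>> > >
>> > >         git clone -b 1.x-HBase1.x https://github.com/apache/incubator-kylin.git
>> > >         cd incubator-kylin
>> > >         ./script/package.sh
>> > >         ls dist/    # the generated binary package; copy it to the cluster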
>> > >
>> > >      BTW, which distribution are you using now? CDH or HDP?
>> > >
>> > >     Thanks.
>> > >
>> > > Luke
>> > >
>> > >
>> > > Best Regards!
>> > > ---------------------
>> > >
>> > > Luke Han
>> > >
>> > > On Sat, Oct 17, 2015 at 8:29 AM, sdangi <[hidden email]> wrote:
>> > >
>> > > > Luke/Kylin Team -- Any further updates/guidance you could offer? The
>> > > > latest clone does not work with HBase 1.1.
>> > > >
>> > > > We are working on a time-sensitive POC for a financial client and
>> > > > appreciate your responses.
>> > > >
>> > > > Thanks,
>> > > > Regards,
>> > > >
>> > > >
>> > > >