Yes, HBase changed this between 1.1.0 and 1.1.1; please help update the
JIRA with your finding. Thanks, Benjamin!

On 10/20/15, 1:56 PM, "Benjamin Leonhardi" <[email protected]>
wrote:

>So hadoop.tmp.dir did not work because HBase added a new parameter in the
>newest version:
>
>hbase.fs.tmp.dir
>
>
>I added this to the kylin.properties file and it now runs through. I will
>add this comment to the JIRA if I can.
>
>
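>For anyone else hitting this, a rough sketch of what I added (the value here
>is just an example path; use a staging directory that works on your cluster):
>
>    # example value only; hbase.fs.tmp.dir is the directory HBase 1.1+ reads
>    # when HFileOutputFormat2 writes its partition file
>    hbase.fs.tmp.dir=/tmp/hbase-staging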
>
>On 10/20/15, 2:44 PM, "Benjamin Leonhardi" <[email protected]>
>wrote:
>
>>Hmm, adding hadoop.tmp.dir does not seem to work for me though. I added it
>>everywhere in Hadoop (core-site, mapred-site and hbase-site) and in the
>>Kylin job config as well. Still the same error. I will have a deeper look.
>>
>>
>>
>>On 10/20/15, 1:58 PM, "Benjamin Leonhardi" <[email protected]>
>>wrote:
>>
>>>Thanks a lot. I swear I googled it :-)
>>>
>>>
>>>
>>>
>>>On 10/20/15, 1:56 PM, "Shi, Shaofeng" <[email protected]> wrote:
>>>
>>>>Hi Benjamin, please check if this JIRA matches your problem:
>>>>https://issues.apache.org/jira/browse/KYLIN-1083
>>>>
>>>>On 10/20/15, 12:53 PM, "Benjamin Leonhardi"
>>>><[email protected]>
>>>>wrote:
>>>>
>>>>>Hello Everybody,
>>>>>
>>>>>Thanks for all the help.
>>>>>
>>>>>I successfully built Kylin on HDP 2.3 with the new updates to the branch
>>>>>and by changing some dependencies in the pom file (I got an HCat mapred
>>>>>version conflict before). When I changed the entries in the pom file to
>>>>>the exact versions of HDP 2.3, it worked.
>>>>>
>>>>>
>>>>>
>>>>>
>>>>><!-- Hadoop versions -->
>>>>>        <hadoop2.version>2.7.1</hadoop2.version>
>>>>>        <yarn.version>2.7.1</yarn.version>
>>>>>        <zookeeper.version>3.4.6</zookeeper.version>
>>>>>        <hive.version>1.2.1</hive.version>
>>>>>        <hive-hcatalog.version>1.2.1</hive-hcatalog.version>
>>>>>        <hbase-hadoop2.version>1.1.1</hbase-hadoop2.version>
>>>>>        <curator.version>2.7.1</curator.version>
>>>>>
>>>>>
>>>>>However, now I get errors in the tests, and also during the HFile
>>>>>creation when I create a new cube from scratch.
>>>>>
>>>>>Any idea what this could be? During the test run I thought it might be a
>>>>>bad test case, but it also happens when I build a new cube from scratch.
>>>>>
>>>>>Parameters:
>>>>>
>>>>> -conf /kylin/kylin-1.1-incubating-SNAPSHOT/conf/kylin_job_conf.xml
>>>>>-cubename aggtest -input
>>>>>/kylin/kylin_metadata/kylin-0669bcc0-2a42-4189-a7c8-6abbb533da8c/aggtest/cuboid/* -output
>>>>>hdfs://sandbox.hortonworks.com:8020/kylin/kylin_metadata/kylin-0669bcc0-2a42-4189-a7c8-6abbb533da8c/aggtest/hfile -htablename KYLIN_0PV292NH5B
>>>>>-jobname Kylin_HFile_Generator_aggtest_Step
>>>>>
>>>>>
>>>>>Error message :
>>>>>
>>>>>
>>>>>java.lang.IllegalArgumentException: Can not create a Path from a null string
>>>>>   at org.apache.hadoop.fs.Path.checkPathArg(Path.java:122)
>>>>>   at org.apache.hadoop.fs.Path.<init>(Path.java:134)
>>>>>   at org.apache.hadoop.fs.Path.<init>(Path.java:88)
>>>>>   at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:591)
>>>>>   at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:440)
>>>>>   at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:405)
>>>>>   at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat.configureIncrementalLoad(HFileOutputFormat.java:91)
>>>>>   at org.apache.kylin.job.hadoop.cube.CubeHFileJob.run(CubeHFileJob.java:86)
>>>>>   at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>>>>   at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>>>>   at org.apache.kylin.job.common.MapReduceExecutable.doWork(MapReduceExecutable.java:113)
>>>>>   at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>>>>>   at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:51)
>>>>>   at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>>>>>   at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:130)
>>>>>   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>   at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>>result code:2
>>>>>
>>>>>
>>>>>
>>>>>On 10/9/15, 1:11 PM, "Benjamin Leonhardi" <[email protected]>
>>>>>wrote:
>>>>>
>>>>>>Hello Everybody,
>>>>>>
>>>>>>We figured out the 1.2 codec comes from the first dependency. We will
>>>>>>try and update.
>>>>>>
>>>>>>Thanks for your help.
>>>>>>
>>>>>>Ben
>>>>>>
>>>>>>
>>>>>>
>>>>>>On 10/6/15, 12:47 AM, "周千昊" <[email protected]> wrote:
>>>>>>
>>>>>>>Hi, Benjamin,
>>>>>>>      It is a bug; a JIRA ticket has been created as follows:
>>>>>>>https://issues.apache.org/jira/browse/KYLIN-1059
>>>>>>>      and it should be fixed now. Since I don't have an HDP 2.3 env
>>>>>>>currently, it would be appreciated if you could help verify with the
>>>>>>>latest code (branch 1.x-HBase1.x).
>>>>>>>
>>>>>>>Luke Han <[email protected]> wrote on Mon, Oct 5, 2015 at 8:35 PM:
>>>>>>>
>>>>>>>> Hi Benjamin,
>>>>>>>>     Did you generate the binary package yourself with the instructions
>>>>>>>> below, or did you just compile the war/jar from Maven?
>>>>>>>>
>>>>>>>>     http://kylin.incubator.apache.org/development/howto_package.html
>>>>>>>>
>>>>>>>>     Thanks.
>>>>>>>>
>>>>>>>>
>>>>>>>> Best Regards!
>>>>>>>> ---------------------
>>>>>>>>
>>>>>>>> Luke Han
>>>>>>>>
>>>>>>>> On Thu, Oct 1, 2015 at 1:30 PM, Benjamin Leonhardi <
>>>>>>>> [email protected]> wrote:
>>>>>>>>
>>>>>>>> > Hello All,
>>>>>>>> >
>>>>>>>> > I am trying to get Kylin to run with the HDP 2.3 sandbox image, using
>>>>>>>> > the HBase1.x branch.
>>>>>>>> >
>>>>>>>> > https://github.com/apache/incubator-kylin/tree/1.x-HBase1.x
>>>>>>>> >
>>>>>>>> > It compiles fine and the tests run through until they reach the job
>>>>>>>> > creation. I followed the steps below.
>>>>>>>> >
>>>>>>>> > http://kylin.incubator.apache.org/development/dev_env.html
>>>>>>>> >
>>>>>>>> > When I run the tests it reaches the job part and then fails because it
>>>>>>>> > is missing a method in commons-codec.jar. The method is
>>>>>>>> > encodeBase64(byte[], boolean isChunked, boolean urlSafe) and it is in
>>>>>>>> > commons-codec 1.4+ but not in older versions. I checked, but my
>>>>>>>> > environment only has codec jars that are newer.
>>>>>>>> >
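>>>>>>>> > For reference, a minimal sketch of the two overloads I mean (the class
>>>>>>>> > and method names are commons-codec's; the Base64Check wrapper and the
>>>>>>>> > sample bytes are just made up). It only compiles against commons-codec
>>>>>>>> > 1.4+ because of the three-argument call:
>>>>>>>> >
>>>>>>>> > import org.apache.commons.codec.binary.Base64;
>>>>>>>> >
>>>>>>>> > public class Base64Check {
>>>>>>>> >     public static void main(String[] args) {
>>>>>>>> >         byte[] data = "kylin".getBytes();
>>>>>>>> >         // overload that the older codec versions also have
>>>>>>>> >         byte[] chunked = Base64.encodeBase64(data, true);
>>>>>>>> >         // overload added in commons-codec 1.4, the one HCatUtil ends up calling
>>>>>>>> >         byte[] urlSafe = Base64.encodeBase64(data, false, true);
>>>>>>>> >         System.out.println(chunked.length + " / " + urlSafe.length);
>>>>>>>> >     }
>>>>>>>> > }
>>>>>>>> >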
>>>>>>>> > However, when I build Kylin, for some reason an older codec gets
>>>>>>>> > compiled into the kylin-job-1.1-incubating-SNAPSHOT-job.jar. If I unzip
>>>>>>>> > this I see a Base64 class that only has the method encodeBase64(byte[],
>>>>>>>> > boolean isChunked) but not the urlSafe overload (so I suppose codec 1.3
>>>>>>>> > or older). I tried to check the pom.xml because I thought I might find
>>>>>>>> > an older codec dependency inside, but did not find anything.
>>>>>>>> >
>>>>>>>> > I also have to say I am not the best Maven person, so I might be
>>>>>>>> > overlooking something easy. One question I would have is why it
>>>>>>>> > compiles the codec into the -job.jar at all.
>>>>>>>> >
>>>>>>>> >
>>>>>>>> > Any help welcome
>>>>>>>> >
>>>>>>>> > Best regards,
>>>>>>>> >
>>>>>>>> > Benjamin
>>>>>>>> >
>>>>>>>> >
>>>>>>>> > Starting: Kylin_Fact_Distinct_Columns_test_kylin_cube_without_slr_empty_Step
>>>>>>>> > L4J [2015-10-01 03:51:24,166][INFO][org.apache.kylin.job.hadoop.AbstractHadoopJob] - append job jar: /root/kylin_build/incubator-kylin/job/../job/target/kylin-job-1.1-incubating-SNAPSHOT-job.jar
>>>>>>>> > L4J [2015-10-01 03:51:24,169][INFO][org.apache.kylin.job.hadoop.AbstractHadoopJob] - append kylin.hive.dependency: null and kylin.hbase.dependency: null to mapreduce.application.classpath
>>>>>>>> > L4J [2015-10-01 03:51:24,167][INFO][org.apache.kylin.job.hadoop.AbstractHadoopJob] - append job jar: /root/kylin_build/incubator-kylin/job/../job/target/kylin-job-1.1-incubating-SNAPSHOT-job.jar
>>>>>>>> > L4J [2015-10-01 03:51:24,170][INFO][org.apache.kylin.job.hadoop.AbstractHadoopJob] - append kylin.hive.dependency: null and kylin.hbase.dependency: null to mapreduce.application.classpath
>>>>>>>> > L4J [2015-10-01 03:51:24,170][INFO][org.apache.kylin.job.hadoop.AbstractHadoopJob] - Hadoop job classpath is:
>>>>>>>> > /tmp/kylin/*,$HADOOP_CONF_DIR,/usr/hdp/2.3.0.0-2530/hbase/lib/hbase-common.jar,/usr/hdp/current/hive-client/conf/,/usr/hdp/2.3.0.0-2530/hive/lib/hive-metastore.jar,/usr/hdp/2.3.0.0-2530/hive/lib/hive-exec.jar,/usr/hdp/2.3.0.0-2530/hive-hcatalog/share/hcatalog/*,$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/2.3.0.0-2530/hadoop/lib/hadoop-lzo-0.6.0.2.3.0.0-2530.jar:/etc/hadoop/conf/secure
>>>>>>>> > L4J [2015-10-01 03:51:24,169][INFO][org.apache.kylin.job.hadoop.AbstractHadoopJob] - Hadoop job classpath is:
>>>>>>>> > /tmp/kylin/*,$HADOOP_CONF_DIR,/usr/hdp/2.3.0.0-2530/hbase/lib/hbase-common.jar,/usr/hdp/current/hive-client/conf/,/usr/hdp/2.3.0.0-2530/hive/lib/hive-metastore.jar,/usr/hdp/2.3.0.0-2530/hive/lib/hive-exec.jar,/usr/hdp/2.3.0.0-2530/hive-hcatalog/share/hcatalog/*,$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/2.3.0.0-2530/hadoop/lib/hadoop-lzo-0.6.0.2.3.0.0-2530.jar:/etc/hadoop/conf/secure
>>>>>>>> > L4J [2015-10-01 03:51:24,229][WARN][org.apache.hadoop.hive.conf.HiveConf] - HiveConf of name hive.heapsize does not exist
>>>>>>>> > L4J [2015-10-01 03:51:24,230][WARN][org.apache.hadoop.hive.conf.HiveConf] - HiveConf of name hive.server2.enable.impersonation does not exist
>>>>>>>> > L4J [2015-10-01 03:51:24,236][WARN][org.apache.hadoop.hive.conf.HiveConf] - HiveConf of name hive.heapsize does not exist
>>>>>>>> > L4J [2015-10-01 03:51:24,237][WARN][org.apache.hadoop.hive.conf.HiveConf] - HiveConf of name hive.server2.enable.impersonation does not exist
>>>>>>>> > L4J [2015-10-01 03:51:24,265][INFO][org.apache.kylin.job.hadoop.AbstractHadoopJob] - tempMetaFileString is : null
>>>>>>>> > L4J [2015-10-01 03:51:24,266][INFO][org.apache.kylin.job.hadoop.AbstractHadoopJob] - tempMetaFileString is : null
>>>>>>>> > L4J [2015-10-01 03:51:24,267][ERROR][org.apache.kylin.job.execution.AbstractExecutable] - error running Executable
>>>>>>>> > java.lang.NoSuchMethodError: org.apache.commons.codec.binary.Base64.encodeBase64([BZZ)[B
>>>>>>>> >         at org.apache.hive.hcatalog.common.HCatUtil.encodeBytes(HCatUtil.java:125)
>>>>>>>> >         at org.apache.hive.hcatalog.common.HCatUtil.serialize(HCatUtil.java:104)
>>>>>>>> >         at org.apache.hive.hcatalog.common.HCatUtil.getHiveConf(HCatUtil.java:585)
>>>>>>>> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:100)
>>>>>>>> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
>>>>>>>> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
>>>>>>>> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
>>>>>>>> >         at org.apache.kylin.job.hadoop.cube.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:101)
>>>>>>>> >         at org.apache.kylin.job.hadoop.cube.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:77)
>>>>>>>> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>>>>>>> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>>>>>>> >         at org.apache.kylin.job.common.MapReduceExecutable.doWork(MapReduceExecutable.java:113)
>>>>>>>> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>>>>>>>> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:51)
>>>>>>>> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>>>>>>>> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:130)
>>>>>>>> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>> >         at java.lang.Thread.run(Thread.java:745)
>>>>>>>> > L4J [2015-10-01 03:51:24,267][ERROR][org.apache.kylin.job.execution.AbstractExecutable] - error running Executable
>>>>>>>> > java.lang.NoSuchMethodError: org.apache.commons.codec.binary.Base64.encodeBase64([BZZ)[B
>>>>>>>> >         at org.apache.hive.hcatalog.common.HCatUtil.encodeBytes(HCatUtil.java:125)
>>>>>>>> >         at org.apache.hive.hcatalog.common.HCatUtil.serialize(HCatUtil.java:104)
>>>>>>>> >         at org.apache.hive.hcatalog.common.HCatUtil.getHiveConf(HCatUtil.java:585)
>>>>>>>> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:100)
>>>>>>>> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
>>>>>>>> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
>>>>>>>> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
>>>>>>>> >         at org.apache.kylin.job.hadoop.cube.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:101)
>>>>>>>> >         at org.apache.kylin.job.hadoop.cube.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:77)
>>>>>>>> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>>>>>>> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>>>>>>> >         at org.apache.kylin.job.common.MapReduceExecutable.doWork(MapReduceExecutable.java:113)
>>>>>>>> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>>>>>>>> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:51)
>>>>>>>> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>>>>>>>> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:130)
>>>>>>>> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>>> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>>> >         at java.lang.Thread.run(Thread.java:745)
>>>>>>>> >
>>>>>>>>
>>>>
>>>>
