Hello Alex,

Interesting; we didn't observe such an issue. Can you confirm that your Hive table has the data, to rule out an input error? Is the problem solved after setting "mapreduce.job.outputformat.class"?
Thanks for the feedback!

Best regards,

Shaofeng Shi 史少锋
Apache Kylin PMC
Work email: [email protected]
Kyligence Inc: https://kyligence.io/

Apache Kylin FAQ: https://kylin.apache.org/docs/gettingstarted/faq.html
Join Kylin user mail group: [email protected]
Join Kylin dev mail group: [email protected]


mailpig <[email protected]> wrote on Wednesday, February 20, 2019 at 11:18 AM:

> In Kylin 2.5.2, the resulting HBase table is always empty when I build a
> cube with Spark.
> I found that the step "Load HFile to HBase Table" has some warning logs:
>
> 2019-01-27 00:49:30,067 WARN [Scheduler 448149092 Job
> 89a25959-e12d-7a5e-0ecb-80c978533eab-6419]
> mapreduce.LoadIncrementalHFiles:204 : Skipping non-directory
> hdfs://test/kylin/kylin_metadata/kylin-89a25959-e12d-7a5e-0ecb-80c978533eab/test_UUID_spark/hfile/_SUCCESS
> 2019-01-27 00:49:30,068 WARN [Scheduler 448149092 Job
> 89a25959-e12d-7a5e-0ecb-80c978533eab-6419]
> mapreduce.LoadIncrementalHFiles:204 : Skipping non-directory
> hdfs://test/kylin/kylin_metadata/kylin-89a25959-e12d-7a5e-0ecb-80c978533eab/test_UUID_spark/hfile/part-r-00000
> 2019-01-27 00:49:30,068 WARN [Scheduler 448149092 Job
> 89a25959-e12d-7a5e-0ecb-80c978533eab-6419]
> mapreduce.LoadIncrementalHFiles:204 : Skipping non-directory
> hdfs://test/kylin/kylin_metadata/kylin-89a25959-e12d-7a5e-0ecb-80c978533eab/test_UUID_spark/hfile/part-r-00001/
>
> After reading the source code, I found that the Spark version of the step
> "Convert Cuboid Data to HFile" has a bug: that step's output directory
> should contain one subdirectory per column family. Specifically,
> SparkCubeHFile must set mapreduce.job.outputformat.class to
> HFileOutputFormat2.class.
>
> Please check if I am correct!
>
> --
> Sent from: http://apache-kylin.74782.x6.nabble.com/
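For context on the "Skipping non-directory" warnings: HBase's bulk-load step expects the HFile output directory to contain one subdirectory per column family, with the HFiles inside those subdirectories; flat `part-r-*` files (and `_SUCCESS` markers) at the top level are skipped, which leaves nothing to load and an empty table. The sketch below uses only the JDK (no Hadoop/HBase dependencies; paths and the family name `F1` are illustrative) to contrast the flat layout reported in the logs with the per-family layout the loader expects:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class HFileLayoutDemo {

    // Mimics the bulk loader's top-level scan: only subdirectories
    // (column families) are collected; plain files are skipped with
    // a warning, as in the "Skipping non-directory" log lines.
    static List<String> familyDirs(Path hfileDir) throws IOException {
        List<String> families = new ArrayList<>();
        try (DirectoryStream<Path> entries = Files.newDirectoryStream(hfileDir)) {
            for (Path entry : entries) {
                if (Files.isDirectory(entry)) {
                    families.add(entry.getFileName().toString());
                } else {
                    System.out.println("Skipping non-directory " + entry);
                }
            }
        }
        return families;
    }

    public static void main(String[] args) throws IOException {
        Path root = Files.createTempDirectory("hfile-demo");

        // Flat layout (what the buggy Spark step produced):
        // part files directly under hfile/ -- nothing is loadable.
        Path flat = Files.createDirectory(root.resolve("flat"));
        Files.createFile(flat.resolve("_SUCCESS"));
        Files.createFile(flat.resolve("part-r-00000"));
        System.out.println("flat layout families: " + familyDirs(flat));

        // Expected layout: one subdirectory per column family,
        // with the HFiles inside it.
        Path good = Files.createDirectory(root.resolve("good"));
        Path family = Files.createDirectory(good.resolve("F1"));
        Files.createFile(family.resolve("part-r-00000"));
        System.out.println("good layout families: " + familyDirs(good));
    }
}
```

Writing with HFileOutputFormat2 (as the reporter suggests) produces exactly the second layout, which is why setting it as the job's output format class makes the subsequent load step find the files.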
