Re: Hbase table is always empty when build with spark

2019-03-19 Thread ShaoFeng Shi
Hi Alex, could you please file a JIRA for Kylin, or send a pull request if you already have a hot-fix? Thank you! Best regards, Shaofeng Shi 史少锋, Apache Kylin PMC. Email: shaofeng...@apache.org. Apache Kylin FAQ: https://kylin.apache.org/docs/gettingstarted/faq.html. Join Kylin user mail group:

Re: Hbase table is always empty when build with spark

2019-02-25 Thread mailpig
Sure, the Hive table is not empty, and the HFile output directory also has data. After setting mapreduce.job.outputformat.class in the job config, loading the HFiles into HBase succeeds. Besides that, I found the source code has
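
For illustration, here is a minimal sketch (not the poster's actual code, and not Kylin's) of what explicitly setting that key on a Hadoop job configuration can look like, assuming HBase's HFileOutputFormat2 is the intended output format and that the resulting configuration is later handed to a Spark write such as saveAsNewAPIHadoopDataset; the class name and output path are illustrative only:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class HFileOutputConfigSketch {
        // Build a job configuration whose output format is explicitly set to
        // HFileOutputFormat2; this is what the "mapreduce.job.outputformat.class"
        // key controls. The thread reports that adding this setting made the
        // later "Load HFile to HBase Table" step succeed.
        public static Job newHFileJob(String hfileOutputDir) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            Job job = Job.getInstance(conf, "kylin-cube-hfiles");

            // Equivalent to:
            // conf.set("mapreduce.job.outputformat.class", HFileOutputFormat2.class.getName());
            job.setOutputFormatClass(HFileOutputFormat2.class);

            // Directory where the build writes its HFiles (illustrative path).
            FileOutputFormat.setOutputPath(job, new Path(hfileOutputDir));
            return job;
        }
    }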

Re: Hbase table is always empty when build with spark

2019-02-24 Thread ShaoFeng Shi
Hello Alex, interesting; we didn't observe such an issue. Can you confirm your Hive table has the data, rather than this being an input error? Does the problem get solved after setting "mapreduce.job.outputformat.class"? Thanks for the feedback! Best regards, Shaofeng Shi 史少锋, Apache Kylin PMC. Work email:

Hbase table is always empty when build with spark

2019-02-19 Thread mailpig
In Kylin 2.5.2, the resulting HBase table is always empty when I build a cube with Spark. I found that the step "Load HFile to HBase Table" has a WARN log: 2019-01-27 00:49:30,067 WARN [Scheduler 448149092 Job 89a25959-e12d-7a5e-0ecb-80c978533eab-6419]
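
For context, a minimal sketch of the kind of bulk load the "Load HFile to HBase Table" step performs, assuming the HBase 1.x LoadIncrementalHFiles tool (this is not Kylin's actual BulkLoadJob); if the Spark build step produced no usable HFiles, a step like this completes without ingesting anything and the target table stays empty, matching the symptom above:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;
    import org.apache.hadoop.util.ToolRunner;

    public class BulkLoadSketch {
        // Bulk-loads pre-written HFiles into an existing HBase table.
        // Usage: BulkLoadSketch <hfile-dir-on-hdfs> <table-name>
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            int exit = ToolRunner.run(conf, new LoadIncrementalHFiles(conf), args);
            System.exit(exit);
        }
    }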