Hi all,
I am encountering a problem with multiple Hadoop clusters.
Kylin submits jobs to YARN on one HDFS cluster, but my fact table is on another HDFS cluster.
The two Hadoop clusters use the same MySQL database to store metadata.
So when I build a cube, the first step creates the intermediate table and inserts
data from the fact table,
but Kylin's Hive cannot access the fact table.
For example, the first step is as below:
"kylin_intermediate_cube8_20160301000000_20160413000000 SELECT
PARTNER_USR_DOC_BASIC_INFO_FT0_S.PHONE_PROVINCE_IND
FROM WLT_PARTNER.PARTNER_USR_DOC_BASIC_INFO_FT0_S as
PARTNER_USR_DOC_BASIC_INFO_FT0_S
WHERE (PARTNER_USR_DOC_BASIC_INFO_FT0_S.PT_LOG_D >= '2016-03-01' AND
PARTNER_USR_DOC_BASIC_INFO_FT0_S.PT_LOG_D < '2016-04-13')"
Table "PARTNER_USR_DOC_BASIC_INFO_FT0_S" is located at
"hdfs://hadoop2NameNode/wlt_partner/PARTNER_USR_DOC_BASIC_INFO_FT0_S",
but "kylin_intermediate_cube8_20160301000000_20160413000000" is located at
"hdfs://bihbasemaster/".
They are on different clusters.
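(The locations above can be checked with Hive's DESCRIBE FORMATTED; the sketch below assumes both tables are registered in the same Hive metastore that Kylin uses.)

```sql
-- Show the HDFS location of the fact table (look at the "Location:" row).
DESCRIBE FORMATTED WLT_PARTNER.PARTNER_USR_DOC_BASIC_INFO_FT0_S;

-- Show the HDFS location of the intermediate table that Kylin created.
DESCRIBE FORMATTED kylin_intermediate_cube8_20160301000000_20160413000000;
```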
The current situation is that no error appears in the Web UI at step 1,
but when the cube build finishes, the HTable is empty. What can I do?