Thanks.
I deleted 'export HADOOP_CONF_DIR=/software/conf/$USER/hbase_hadoop_conf' from
hbase-env.sh
and added 'export HBASE_CLASSPATH=${HBASE_CLASSPATH}:/software/conf/$USER/hbase_hadoop_conf'.
Step 1 now runs successfully, but step 2, 'Extract Fact Table Distinct
Columns', still uses HBase's HDFS. Why?
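For reference, the hbase-env.sh change described above amounts to roughly the following sketch (the paths come from the message above and are specific to that cluster):

```shell
# hbase-env.sh -- sketch of the edit described above.

# Removed (was pointing HBase at its own Hadoop conf dir):
# export HADOOP_CONF_DIR=/software/conf/$USER/hbase_hadoop_conf

# Added instead (puts the conf dir on HBase's classpath):
export HBASE_CLASSPATH=${HBASE_CLASSPATH}:/software/conf/$USER/hbase_hadoop_conf
```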
________________________________
Yes. Liu Ze, please check your Hive or HDFS client settings.

On 10/13/15, 2:11 PM, "DroopyHoo" wrote:

>Kylin creates the intermediate table by a shell command; you could check
>your local Hive client configuration.
>
>On 15/10/13, 8:42 AM, LIU Ze (刘则) wrote:
>> hi all,
>>
>> Hive uses HDFS A; HBase uses HDFS B.
>> Why does the table's LOCATION use HDFS B when it is created?
>>
>> CREATE EXTERNAL TABLE IF NOT EXISTS
>> kylin_intermediate_cubletest_20150101000000_20151031000000_e7f72ad9_77ca_4b1f_a8f8_fb54c1475885
>> (
>> FFAN_WEB_VISIT_DAY_DATEKEY string
>> ,FFAN_DIM_DAY_MONTHKEY string
>> ,FFAN_WEB_VISIT_DAY_VISIT_TYPE string
>> ,FFAN_WEB_VISIT_DAY_MEMBER_TYPE string
>> ,FFAN_WEB_VISIT_DAY_SOURCE string
>> ,FFAN_WEB_VISIT_DAY_PV int
>> ,FFAN_WEB_VISIT_DAY_UV int
>> ,FFAN_WEB_VISIT_DAY_VISIT_CNT int
>> ,FFAN_WEB_VISIT_DAY_IP_CNT int
>> )
>> ROW FORMAT DELIMITED FIELDS TERMINATED BY '\177'
>> STORED AS SEQUENCEFILE
>> LOCATION '/tmp/kylin/kylin_metadata//kylin-e7f72ad9-77ca-4b1f-a8f8-fb54c1475885/kylin_intermediate_cubletest_20150101000000_20151031000000_e7f72ad9_77ca_4b1f_a8f8_fb54c1475885';
>>
>> ________________________________
>
>--
>-------
>Wei Hu
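As a quick diagnostic for "which HDFS does my client actually use": the filesystem Hive writes to is whatever fs.defaultFS resolves to in the core-site.xml that the client loads. A minimal sketch of that check, using a made-up sample file and a hypothetical cluster address (hdfs://clusterA:8020):

```shell
# Sketch: the client's effective HDFS is the fs.defaultFS value in the
# core-site.xml it loads. We create a sample file to show the check;
# the path /tmp/core-site.xml and the value hdfs://clusterA:8020 are
# illustrative, not from any real cluster.
cat > /tmp/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://clusterA:8020</value>
  </property>
</configuration>
EOF

# Extract the value the way a quick grep-based check might:
fs_default=$(grep -A1 '<name>fs.defaultFS</name>' /tmp/core-site.xml \
  | sed -n 's:.*<value>\(.*\)</value>.*:\1:p')
echo "$fs_default"
```

On a live client, the same idea applies to the core-site.xml on the Hive client's classpath (or `hive -e "set fs.defaultFS;"` from the same shell environment that Kylin uses to run its commands): if that value points at HBase's HDFS, the intermediate table's LOCATION will land there.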