Re: intermediate table not found when using the Hive view

2017-07-18 Thread jianhui.yi
tion would be using a normal Hive table as lookup; that doesn't have this issue. 2017-07-18 15:22 GMT+08:00 jianhui.yi <jianhui...@zhiyoubao.com>: > Hi all, > > I used the Hive view to build the cube, and multiple cubes all use the > same view. When the build task is sub

Re: Re: Re: Can't clean up expired data

2017-06-22 Thread jianhui.yi
null,"exec_start_time":1495701151692,"exec_end_time":1495701189412,"exec_wait_time":14,"step_status":"FINISHED","cmd_type":"SHELL_CMD_HADOOP","info":{"byteSizeBytes":"8264","endTime":"1495

Re: Can't clean up expired data

2017-06-21 Thread jianhui.yi
Hi Shaofeng, Here are the logs. I also found that when I purged a cube, its Hive table could not be cleaned up, for example cube "c_all". 2017-06-21 19:34:52,430 INFO [main StorageCleanupJob:283]: Checking table SLF4J: Class path contains multiple SLF4J bindings. 2017-06-21 19:34:52,430 INFO [main

Can't clean up expired data

2017-06-20 Thread jianhui.yi
Hi all, Since the upgrade to v2.0 I have noticed that HDFS usage increases every time we rebuild the cube, and the space is not cleared up, even though we run both the StorageCleanupJob and the metastore clean command. When looking into HDFS to see where the increase is, I see that the
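For context, the cleanup steps the thread refers to are typically run as shown below. This is a hedged sketch: paths assume a standard `$KYLIN_HOME` installation, and the fully qualified class name of `StorageCleanupJob` has moved between Kylin releases, so check the class path shipped with your version before running it.

```shell
# Sketch of the Kylin cleanup procedure discussed above (assumes $KYLIN_HOME
# points at the Kylin install; class name may differ by release).

# Dry run: list orphaned HDFS dirs, HBase tables, and intermediate Hive tables
# without deleting anything.
${KYLIN_HOME}/bin/kylin.sh org.apache.kylin.tool.StorageCleanupJob --delete false

# Once the dry-run output looks right, actually delete the garbage.
${KYLIN_HOME}/bin/kylin.sh org.apache.kylin.tool.StorageCleanupJob --delete true

# Separately, purge dangling metadata entries from the Kylin metastore.
${KYLIN_HOME}/bin/metastore.sh clean --delete true
```

Running the dry-run form first is the usual practice, since the delete pass is irreversible on HDFS.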