Re: Cleaning up hdfs working directory

2019-01-24 Thread PENG Zhengshuai
Hi, Ketan AFAIK, the Kylin working directory currently only supports HDFS, or other file systems compatible with the HDFS API. You can try cleaning up garbage files with the Kylin storage cleanup tool; it should reduce your data size. Thanks Zhengshuai > On Jan 24, 2019, at 4:03 PM,
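The storage cleanup tool mentioned here is normally run from the Kylin installation. A minimal sketch, assuming `KYLIN_HOME` points at your install and using the tool class name from the Kylin cleanup how-to (verify it against your Kylin version); a dry run first lists candidate garbage, then `--delete true` removes it:

```shell
# Sketch: invoking Kylin's storage cleanup tool.
# KYLIN_HOME default below is an assumption; set it to your real install path.
KYLIN_HOME=${KYLIN_HOME:-/usr/local/kylin}
CLEANUP="org.apache.kylin.tool.StorageCleanupJob"

if [ -x "${KYLIN_HOME}/bin/kylin.sh" ]; then
    # Dry run: report unreferenced HDFS files without deleting anything.
    "${KYLIN_HOME}/bin/kylin.sh" "${CLEANUP}" --delete false

    # Once the report looks right, delete for real.
    "${KYLIN_HOME}/bin/kylin.sh" "${CLEANUP}" --delete true
fi
```

Running the dry run first is worthwhile on a 10 TB directory, since the report shows exactly which job directories the tool considers garbage.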

Re: Cleaning up hdfs working directory

2019-01-24 Thread JiaTao Tao
Hi Take a look at this: http://kylin.apache.org/docs/howto/howto_cleanup_storage.html kdcool6932 wrote on Thu, Jan 24, 2019 at 8:04 AM: > Hi Kylin, we have a lot of data in our HDFS working directory, > around 10 TB, for the last year or so; this is actually more than Kylin's HBase > usage (around 9

Cleaning up hdfs working directory

2019-01-24 Thread kdcool6932
Hi Kylin, we have a lot of data in our HDFS working directory, around 10 TB, accumulated over the last year or so; this is actually more than Kylin's HBase usage (around 9 TB) on one of our Kylin clusters. We are using Kylin 2.3.1 on this cluster. 1. Are all these files required for Kylin functionali
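Before cleaning, it can help to see where the 10 TB actually sits. A sketch using standard `hdfs dfs` commands; `/kylin` is the default `kylin.env.hdfs-working-dir`, so adjust `WORKDIR` to your cluster's configured path:

```shell
# Sketch: sizing the Kylin HDFS working directory.
# /kylin is the default working-dir path; override via KYLIN_WORKDIR if yours differs.
WORKDIR=${KYLIN_WORKDIR:-/kylin}

if command -v hdfs >/dev/null 2>&1; then
    # Total size of the working directory, human-readable.
    hdfs dfs -du -s -h "$WORKDIR"

    # Per-entry breakdown; stale per-job output directories are the usual
    # source of growth and are what the storage cleanup tool targets.
    hdfs dfs -du -h "$WORKDIR"
fi
```

Comparing the per-entry breakdown before and after running the cleanup tool shows how much of the 10 TB was reclaimable garbage versus data Kylin still references.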