Hi Ketan,

AFAIK, the Kylin working directory currently only supports HDFS or other file 
systems that are compatible with the HDFS API.

You can try cleaning up garbage files with the Kylin storage cleanup tool. 
It should reduce your data size.
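For reference, the cleanup tool is typically run from the command line like this (a sketch; the exact class path has moved between Kylin releases, so please verify it against the docs for your 2.3.1 install):

```shell
# Dry run first: list the orphaned HDFS files, HBase tables, and Hive
# intermediate tables that would be removed, without deleting anything.
${KYLIN_HOME}/bin/kylin.sh org.apache.kylin.tool.StorageCleanupJob --delete false

# After reviewing the output, run it again with --delete true to actually
# remove the unreferenced data.
${KYLIN_HOME}/bin/kylin.sh org.apache.kylin.tool.StorageCleanupJob --delete true
```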

Thanks
Zhengshuai

> On Jan 24, 2019, at 4:03 PM, kdcool6932 <[email protected]> wrote:
> 
> Hi Kylin,
> 
> We have a lot of data in our HDFS working directory, around 10 TB 
> accumulated over the last year or so. This is actually more than the HBase 
> usage of Kylin (around 9 TB) on one of our Kylin clusters. We are using 
> Kylin 2.3.1 on this cluster.
> 
> 1. Are all these files required for Kylin functionality?
> 2. Is there a way to clean them up (the Kylin cleanup job is not helping 
> here) and keep only the required data on HDFS?
> 3. Also, does this storage need to be on HDFS only, or can we point it to 
> some non-DFS storage, like a local FS or an S3 bucket?
> 
> This might help us reduce our HDFS storage and use it more judiciously.
> 
> Thanks,
> [email protected]
