Sorry, that configuration no longer exists.
You can modify Spark's java.io.tmpdir setting to keep KE's temporary
files from being generated in the /tmp directory.
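For reference, a rough sketch of what that could look like. This assumes a
Kylin 4.x-style kylin.properties, where Spark options are passed through the
kylin.query.spark-conf. / kylin.engine.spark-conf. prefixes; the target
directory /data/kylin/tmp is only a placeholder:

    # kylin.properties -- point the Spark JVM temp dir away from /tmp
    # (prefixes follow Kylin 4.x conventions; /data/kylin/tmp is illustrative)
    kylin.query.spark-conf.spark.driver.extraJavaOptions=-Djava.io.tmpdir=/data/kylin/tmp
    kylin.query.spark-conf.spark.executor.extraJavaOptions=-Djava.io.tmpdir=/data/kylin/tmp
    kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Djava.io.tmpdir=/data/kylin/tmp
    kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Djava.io.tmpdir=/data/kylin/tmp

Make sure the directory exists and is writable by the Kylin/Spark process on
every node before applying the change.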

On Fri, Jan 5, 2024 at 3:23 PM MINGMING GE <7mmi...@gmail.com> wrote:

> This directory cannot be deleted manually.
> This folder will be cleared by the system's temporary-file cleanup crontab
> job. On CentOS, that is the /etc/cron.daily/tmpwatch scheduled task, which
> retains temporary files for up to 10 days.
>
> You can change its directory through the following configuration:
> kylin.query.schema.tmp-directory.
>
> On Fri, Jan 5, 2024 at 3:06 PM Li, Can <c...@ebay.com.invalid> wrote:
>
>> When loading a Hive table, Spark generates some temporary files and
>> directories to persist the table's metadata, and this part is cached in
>> the service. While cleaning up these temporary files, we found that
>> deleting them causes the load table operation to fail,
>>
>> but everything works again after restarting the service. I would like to
>> ask whether these generated temporary files can be deleted, how long it
>> takes before they are cleaned up, and whether there is an automatic
>> cleanup mechanism. If they cannot be deleted manually, can the directory
>> where they are generated be changed, and how should that be configured?
>>
>
