Hi Xiaoxiang,

Thanks for the quick response and the workarounds.
I tried the 2nd workaround (kylin.source.hive.quote-enabled=false), and the cube
now builds successfully.

Thanks,
Qiuyu

Sent from Mail for Windows 10

From: Xiaoxiang Yu
Sent: Thursday, June 20, 2019 2:12 PM
To: [email protected]; [email protected]
Subject: Re: kylin build sample cube error when hive cleanup step

Hi, Qiuyu
  I think this is a bug, and I am trying to work out how to reproduce it.
  In the meantime, here are two workarounds that may help; choose one (a minimal
example follows the list):
1. Set kylin.source.hive.keep-flat-table=true in kylin.properties, restart Kylin,
and submit a new build job. This skips dropping the intermediate Hive table; you
can drop it manually after the build succeeds.
2. Set kylin.source.hive.quote-enabled=false in kylin.properties, restart Kylin,
and submit a new build job. This removes the quote character (the backtick) from
the generated statement, so the DROP TABLE succeeds.
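
For reference, a minimal sketch of workaround 2 (the kylin.properties location
and the kylin.sh commands assume a standard binary-package installation with
KYLIN_HOME set; adjust paths to your setup):

# In $KYLIN_HOME/conf/kylin.properties, add one of the two settings
# (workaround 2 shown active, workaround 1 commented out):
kylin.source.hive.quote-enabled=false
# kylin.source.hive.keep-flat-table=true

# Restart Kylin so the change takes effect, then resubmit the build job:
$KYLIN_HOME/bin/kylin.sh stop
$KYLIN_HOME/bin/kylin.sh start

With quoting disabled, the Hive Cleanup step should generate the DROP TABLE
statement without the surrounding backticks, so Hive parses the intermediate
table name as database.table instead of rejecting it as a single name
containing a dot.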

----------------
Best wishes,
Xiaoxiang Yu 


From: Qiuyu <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Thursday, June 20, 2019 13:24
To: "[email protected]" <[email protected]>
Subject: kylin build sample cube error when hive cleanup step

 
Hi All,
 
My environment is: HDP 2.6.5, Kylin 2.6.2.
When I build the sample cube, the following error appears at #22 Step Name: Hive
Cleanup.
 
 
java.io.IOException: OS command error exit with return code: 64, error message: 
log4j:WARN No such property [maxFileSize] in 
org.apache.log4j.DailyRollingFileAppender.
 
Logging initialized using configuration in 
file:/etc/hive/2.6.5.1150-31/0/hive-log4j.properties
OK
Time taken: 1.641 seconds
FAILED: SemanticException Line 2:21 Table or database name may not contain 
dot(.) character 
'default.kylin_intermediate_kylin_sales_cube_1a8ba28a_7fbd_20e8_b2c1_328eca50648e'
The command is: 
hive -e "USE default;
DROP TABLE IF EXISTS 
\`default.kylin_intermediate_kylin_sales_cube_1a8ba28a_7fbd_20e8_b2c1_328eca50648e\`;
" --hiveconf hive.merge.mapredfiles=false --hiveconf 
hive.auto.convert.join=true --hiveconf dfs.replication=2 --hiveconf 
hive.exec.compress.output=true --hiveconf 
hive.auto.convert.join.noconditionaltask=true --hiveconf 
mapreduce.job.split.metainfo.maxsize=-1 --hiveconf hive.merge.mapfiles=false 
--hiveconf hive.auto.convert.join.noconditionaltask.size=100000000 --hiveconf 
hive.stats.autogather=true
         at 
org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.java:96)
         at 
org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.java:83)
         at 
org.apache.kylin.source.hive.GarbageCollectionStep.cleanUpIntermediateFlatTable(GarbageCollectionStep.java:77)
         at 
org.apache.kylin.source.hive.GarbageCollectionStep.doWork(GarbageCollectionStep.java:49)
         at 
org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:167)
         at 
org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:71)
         at 
org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:167)
         at 
org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:114)
         at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
         at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
         at java.lang.Thread.run(Thread.java:745)
 
Can anyone tell me whether this is a bug, or how to resolve it?
 
Thanks,
Qiuyu.
 
Sent from Mail for Windows 10
 
