[ https://issues.apache.org/jira/browse/KYLIN-824?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Luke Han resolved KYLIN-824.
----------------------------
    Resolution: Fixed

> Cube Build fails if lookup table doesn't have any files under HDFS location
> ---------------------------------------------------------------------------
>
>                 Key: KYLIN-824
>                 URL: https://issues.apache.org/jira/browse/KYLIN-824
>             Project: Kylin
>          Issue Type: Bug
>          Components: Job Engine
>    Affects Versions: v0.7.1
>            Reporter: Srinivasan
>            Assignee: Shaofeng SHI
>             Fix For: v0.8.1, v0.7.2
>
>         Attachments: FileTable.java, HiveClient.java, HiveTable.java, 
> KYLIN-824.patch, KYLIN-824.patch, KYLIN-824.patch
>
>
> I have an external dimension table in Hive that is created using the HBase 
> storage handler. After creating the cube from this Hive table, the cube build 
> job failed at the "Build Dimension Dictionary" step with the error below:
> java.lang.IllegalStateException: Expect 1 and only 1 non-zero file under hdfs://host:8020/user/hive/warehouse/hbase.db/department/, but find 0
> at org.apache.kylin.dict.lookup.HiveTable.findOnlyFile(HiveTable.java:123)
> at org.apache.kylin.dict.lookup.HiveTable.computeHDFSLocation(HiveTable.java:107)
> at org.apache.kylin.dict.lookup.HiveTable.getHDFSLocation(HiveTable.java:83)
> at org.apache.kylin.dict.lookup.HiveTable.getFileTable(HiveTable.java:76)
> at org.apache.kylin.dict.lookup.HiveTable.getSignature(HiveTable.java:71)
> at org.apache.kylin.dict.DictionaryManager.buildDictionary(DictionaryManager.java:164)
> at org.apache.kylin.cube.CubeManager.buildDictionary(CubeManager.java:154)
> at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:53)
> at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:42)
> at org.apache.kylin.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:53)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> at org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:63)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
> at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
> at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:132)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:744)
> Since an external table backed by another source such as HBase doesn't store 
> any data in the Hive warehouse directory, Kylin should not check for files 
> under the warehouse dir for external tables. Please help.
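
The failure path above suggests one possible mitigation: treat an empty file listing as a valid, empty lookup table rather than an error. The following is a hypothetical sketch of that idea; the class name, method signature, and the -1 "no data" convention are illustrative assumptions, not the actual HiveTable.findOnlyFile implementation:

```java
import java.util.List;

// Hypothetical sketch: the reported check throws unless exactly one
// non-zero-length file exists under the table's HDFS location. A tolerant
// variant could treat an empty listing (typical for an HBase-backed external
// table) as "no data" instead of failing the whole dictionary build.
public class FindOnlyFile {
    // Returns the index of the single non-empty file, or -1 when the
    // location holds no data files (e.g. an external table).
    static int findOnlyFile(List<Long> fileSizes) {
        int found = -1;
        for (int i = 0; i < fileSizes.size(); i++) {
            if (fileSizes.get(i) > 0) {
                if (found >= 0)
                    throw new IllegalStateException(
                        "Expect 1 and only 1 non-zero file, but found more");
                found = i;
            }
        }
        return found; // -1 signals "no data" rather than throwing
    }

    public static void main(String[] args) {
        System.out.println(findOnlyFile(List.of(0L, 42L, 0L))); // prints 1
        System.out.println(findOnlyFile(List.of()));            // prints -1
    }
}
```

Callers that receive -1 could then map the empty location to an empty table signature or dictionary, letting HBase-backed external tables build even though nothing lives under the warehouse directory.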



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
