[
https://issues.apache.org/jira/browse/KYLIN-824?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14605285#comment-14605285
]
Shaofeng SHI commented on KYLIN-824:
------------------------------------
Hi Srinivasan, how was this patch generated? I tried to apply the patch but
got errors:
{code}
git am -s -3 ~/Downloads/KYLIN-824.patch
Applying: Added support for external tables
Using index info to reconstruct a base tree...
error: patch failed: dictionary/src/main/java/org/apache/kylin/dict/lookup/HiveTable.java:48
error: dictionary/src/main/java/org/apache/kylin/dict/lookup/HiveTable.java: patch does not apply
Did you hand edit your patch?
It does not apply to blobs recorded in its index.
Cannot fall back to three-way merge.
Patch failed at 0001 Added support for external tables
The copy of the patch that failed is found in:
/Users/shaoshi/Documents/workspace/Kylin/.git/rebase-apply/patch
When you have resolved this problem, run "git am --continue".
If you prefer to skip this patch, run "git am --skip" instead.
To restore the original branch and stop patching, run "git am --abort".
LM-SHC-00950687:Kylin shaoshi$ git status
On branch 0.7-staging
Your branch is up-to-date with 'apache/0.7-staging'.
You are in the middle of an am session.
(fix conflicts and then run "git am --continue")
(use "git am --skip" to skip this patch)
(use "git am --abort" to restore the original branch)
nothing to commit, working directory clean
{code}
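For comparison, a patch produced from a real commit with git format-patch normally
applies with git am -s -3, because it keeps the mail headers and blob index info
intact. A rough sketch only (the branch and commit message below are placeholders,
not your actual change):
{code}
# Contributor side (placeholders, not the actual commit):
git checkout 0.7-staging
git commit -am "Added support for external tables"
git format-patch -1 --stdout > KYLIN-824.patch

# Reviewer side:
git am -s -3 KYLIN-824.patch
{code}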
> Cube Build fails if lookup table doesn't have any files under HDFS location
> ---------------------------------------------------------------------------
>
> Key: KYLIN-824
> URL: https://issues.apache.org/jira/browse/KYLIN-824
> Project: Kylin
> Issue Type: Bug
> Components: Job Engine
> Affects Versions: v0.7.1
> Reporter: Srinivasan
> Assignee: Shaofeng SHI
> Fix For: v0.8.1, v0.7.2
>
> Attachments: KYLIN-824.patch, KYLIN-824.patch
>
>
> I have a dimension table in Hive that is an external table created using the HBase
> storage handler. After creating the cube on this Hive table, the cube build job
> failed at the "Build Dimension Dictionary" step with the error below:
> java.lang.IllegalStateException: Expect 1 and only 1 non-zero file under hdfs://host:8020/user/hive/warehouse/hbase.db/department/, but find 0
> at org.apache.kylin.dict.lookup.HiveTable.findOnlyFile(HiveTable.java:123)
> at org.apache.kylin.dict.lookup.HiveTable.computeHDFSLocation(HiveTable.java:107)
> at org.apache.kylin.dict.lookup.HiveTable.getHDFSLocation(HiveTable.java:83)
> at org.apache.kylin.dict.lookup.HiveTable.getFileTable(HiveTable.java:76)
> at org.apache.kylin.dict.lookup.HiveTable.getSignature(HiveTable.java:71)
> at org.apache.kylin.dict.DictionaryManager.buildDictionary(DictionaryManager.java:164)
> at org.apache.kylin.cube.CubeManager.buildDictionary(CubeManager.java:154)
> at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:53)
> at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:42)
> at org.apache.kylin.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:53)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> at org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:63)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
> at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
> at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:132)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:744)
> Since an external table created from another source such as HBase keeps no data in
> the Hive warehouse directory, Kylin should not check for files under the warehouse
> directory for external tables (a rough sketch follows). Please help.
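> For illustration, a metastore check like the one below could tell an external table
> apart before the warehouse-directory scan; this is only a sketch under that assumption,
> and the class and method names are hypothetical, not the actual Kylin HiveTable code:
> {code}
> import org.apache.hadoop.hive.conf.HiveConf;
> import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
> import org.apache.hadoop.hive.metastore.TableType;
> import org.apache.hadoop.hive.metastore.api.Table;
>
> // Hypothetical helper, not the actual Kylin HiveTable code.
> public class ExternalTableCheck {
>     public static boolean isExternal(String database, String table) throws Exception {
>         HiveMetaStoreClient client = new HiveMetaStoreClient(new HiveConf());
>         try {
>             Table t = client.getTable(database, table);
>             // External tables (e.g. backed by the HBase storage handler) keep no
>             // data files under the Hive warehouse directory, so the "exactly one
>             // non-zero file" signature check should be skipped for them.
>             return TableType.EXTERNAL_TABLE.toString().equals(t.getTableType());
>         } finally {
>             client.close();
>         }
>     }
> }
> {code}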
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)