It seems that your lookup table doesn't define the join
relationship. Could you paste the full JSON of this cube definition?
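For reference, each entry in the cube descriptor's "lookups" array should carry a "join" block naming the join type plus the primary/foreign key columns; if that block is missing, CubeManager.getLookupTable can hit an NPE. A minimal sketch of what it might look like (field names per the 0.6.x cube descriptor; FACT_SALES and STORE_ID are placeholders, only STORE_DIM comes from your log):

```json
{
  "name": "NDim",
  "fact_table": "FACT_SALES",
  "lookups": [
    {
      "table": "STORE_DIM",
      "join": {
        "type": "inner",
        "primary_key": ["STORE_ID"],
        "foreign_key": ["STORE_ID"]
      }
    }
  ]
}
```

Comparing your cube's JSON against this shape (in particular, whether each lookup has a non-empty "join") would be the first thing to check.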

On 3/4/15, 3:28 PM, "Santoshakhilesh" <[email protected]> wrote:

>Dear All,
>
>         I am using the 0.6.5 branch of Kylin. I was able to build a cube
>defining normal and derived measures and play with it.
>
>         I have defined a new cube to test hierarchical dimensions, and
>the cube build fails at Step 3 with the following log in kylin.log.
>
>         I have run the query which Kylin shows on the cube's web UI
>against Hive, and it works.
>
>         Please let me know what's going wrong. If you need any more
>info from me, please let me know.
>
>
>
>java.lang.NullPointerException
> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>
>
>
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,025][INFO][com.kylinolap.dict.lookup.SnapshotManager.load(SnapshotManager.java:156)] - Loading snapshotTable from /table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot, with loadData: false
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,031][INFO][com.kylinolap.dict.lookup.SnapshotManager.buildSnapshot(SnapshotManager.java:90)] - Identical input FileSignature [path=file:/hive/warehouse/store_dim/stores.txt, size=60, lastModifiedTime=1425039202000], reuse existing snapshot at /table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,031][DEBUG][com.kylinolap.common.persistence.ResourceStore.putResource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store kylin_metadata_qa@hbase)
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,035][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourceTable(MetadataManager.java:258)] - Reloading SourceTable from folder kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourceTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourceTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,104][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourceTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,104][INFO][com.kylinolap.metadata.MetadataManager.reloadAllCubeDesc(MetadataManager.java:308)] - Reloading Cube Metadata from folder kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,143][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllCubeDesc(MetadataManager.java:333)] - Loaded 4 Cube(s)
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,143][INFO][com.kylinolap.metadata.MetadataManager.reloadAllInvertedIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc from folder kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,147][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllInvertedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index Desc(s)
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,158][INFO][com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of STORE_DIM
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:55)] -
>java.lang.NullPointerException
> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
> at com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:60)
> at com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:39)
> at com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:51)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,159][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOutput.appendOutput(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,166][DEBUG][com.kylinolap.common.persistence.ResourceStore.putResource(ResourceStore.java:166)] - Saving resource /job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store kylin_metadata_qa@hbase)
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,174][DEBUG][com.kylinolap.common.persistence.ResourceStore.putResource(ResourceStore.java:166)] - Saving resource /job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store kylin_metadata_qa@hbase)
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:87)] - Job status for cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been updated.
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input /tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_columns
>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:89)] - output:Start to execute command:
> -cubename NDim -segmentname FULL_BUILD -input /tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_columns
>Command execute return code 2
>
>
>
>Regards,
>Santosh Akhilesh
>Bangalore R&D
>HUAWEI TECHNOLOGIES CO.,LTD.
>
>www.huawei.com
>---------------------------------------------------------------------------
>This e-mail and its attachments contain confidential information from
>HUAWEI, which is intended only for the person or entity whose address is
>listed above. Any use of the information contained herein in any way
>(including, but not limited to, total or partial disclosure, reproduction,
>or dissemination) by persons other than the intended recipient(s) is
>prohibited. If you receive this e-mail in error, please notify the sender
>by phone or email immediately and delete it!
