Hi Santosh, 0.6.5 should be stable, but there is an obvious error in this
cube JSON: dimension 1 comes from the lookup table “STORE_DIM”, so its
“join” must be specified, while now it is null. This causes Kylin to fail
to join the fact table with the lookup table. Please edit the cube to add
the join condition so it looks like the other dimensions. If the wizard
doesn’t work, try adding the join manually and then use the “JSON Editor”
function to update the cube.
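For illustration, the fixed dimension could mirror the derived dimensions in
your JSON, with a join block filled in (I am assuming STOREID is the key on
both sides, since your fact table exposes sales_fact.storeid; adjust to your
actual key columns):

```json
{
  "id": 1,
  "name": "AREA",
  "join": {
    "type": "left",
    "primary_key": ["STOREID"],
    "foreign_key": ["STOREID"]
  },
  "hierarchy": [
    { "level": "1", "column": "STATE" },
    { "level": "2", "column": "CITY" }
  ],
  "table": "STORE_DIM",
  "column": null,
  "datatype": null,
  "derived": null
}
```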

From now on we would suggest users move to 0.7.1; the binary package is
much easier to install, and there are many enhancements and bug fixes in
0.7.1.

To get a fresh metadata store, just use a different HTable; this is
configurable in kylin.properties:

kylin.metadata.url=kylin_metadata_qa@hbase

The default name is kylin_metadata_qa; if you change the name, Kylin will
start with a fresh metadata store.
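For example, pointing the metadata URL at a not-yet-existing HTable (the
name “kylin_metadata_fresh” below is just an illustration; any unused table
name works) makes Kylin create an empty metadata store on the next start:

```properties
# kylin.properties — switch to an unused HTable to start with clean metadata
kylin.metadata.url=kylin_metadata_fresh@hbase
```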


On 3/4/15, 8:52 PM, "Santoshakhilesh" <[email protected]> wrote:

>Hi Shaofeng ,
>      
>      I had deleted the cube and tried to build again for hierarchy; now
>it fails at the first step itself.
>      I have three dimension tables:
>      a) customer_dim b) store_dim c) item_dim
>      I chose left join with the fact table while creating dimensions, but
>after cube creation the left join on store_dim is automatically deleted by
>Kylin.
>      The store dimension has fields storeid, city, state. I had tried to
>add the hierarchy dimension as (1) State and (2) City.
>
>      I have started facing many issues; even with normal dimensions the
>Kylin query result does not match the Hive query.
>      I had also tried to delete all the metadata in Hive and HBase (I
>had deleted all the entries) and started from the beginning by creating a
>new project, but the problem persists.
>
>      Do you suggest I install the binary distribution? Is it stable
>enough now? If yes, how do I make sure all the previous data is deleted?
>Is deleting the Hive and HBase data enough, or should I delete something
>else too?
>
>      Logs are as below. Sorry for the long mail.
>
>JSON:
>{
>  "uuid": "5a5adb86-202a-4e2b-9be7-de421d3bdf2f",
>  "name": "Hierarchy",
>  "description": "",
>  "dimensions": [
>    {
>      "id": 1,
>      "name": "AREA",
>      "join": null,
>      "hierarchy": [
>        {
>          "level": "1",
>          "column": "STATE"
>        },
>        {
>          "level": "2",
>          "column": "CITY"
>        }
>      ],
>      "table": "STORE_DIM",
>      "column": null,
>      "datatype": null,
>      "derived": null
>    },
>    {
>      "id": 2,
>      "name": "CUSTOMER_DIM_DERIVED",
>      "join": {
>        "type": "left",
>        "primary_key": [
>          "CUSTOMERID"
>        ],
>        "foreign_key": [
>          "CUSTOMERID"
>        ]
>      },
>      "hierarchy": null,
>      "table": "CUSTOMER_DIM",
>      "column": "{FK}",
>      "datatype": null,
>      "derived": [
>        "NAME"
>      ]
>    },
>    {
>      "id": 3,
>      "name": "ITEM_DIM_DERIVED",
>      "join": {
>        "type": "left",
>        "primary_key": [
>          "ITEMID"
>        ],
>        "foreign_key": [
>          "ITEMID"
>        ]
>      },
>      "hierarchy": null,
>      "table": "ITEM_DIM",
>      "column": "{FK}",
>      "datatype": null,
>      "derived": [
>        "TYPE",
>        "BRAND",
>        "COLOR"
>      ]
>    }
>  ],
>  "measures": [
>    {
>      "id": 1,
>      "name": "_COUNT_",
>      "function": {
>        "expression": "COUNT",
>        "parameter": {
>          "type": "constant",
>          "value": "1"
>        },
>        "returntype": "bigint"
>      },
>      "dependent_measure_ref": null
>    },
>    {
>      "id": 2,
>      "name": "TOTALAMOUNT",
>      "function": {
>        "expression": "SUM",
>        "parameter": {
>          "type": "column",
>          "value": "AMOUNT"
>        },
>        "returntype": "double"
>      },
>      "dependent_measure_ref": null
>    },
>    {
>      "id": 3,
>      "name": "TOTALQTY",
>      "function": {
>        "expression": "SUM",
>        "parameter": {
>          "type": "column",
>          "value": "QTY"
>        },
>        "returntype": "int"
>      },
>      "dependent_measure_ref": null
>    }
>  ],
>  "rowkey": {
>    "rowkey_columns": [
>      {
>        "column": "STATE",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "CITY",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "CUSTOMERID",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "ITEMID",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      }
>    ],
>    "aggregation_groups": [
>      [
>        "CUSTOMERID",
>        "ITEMID"
>      ],
>      [
>        "STATE",
>        "CITY"
>      ]
>    ]
>  },
>  "signature": "X6NQ6wZ9ZgvhBLqw0YAKhQ==",
>  "capacity": "MEDIUM",
>  "last_modified": 1425492494239,
>  "fact_table": "SALES_FACT",
>  "null_string": null,
>  "filter_condition": null,
>  "cube_partition_desc": {
>    "partition_date_column": null,
>    "partition_date_start": 0,
>    "cube_partition_type": "APPEND"
>  },
>  "hbase_mapping": {
>    "column_family": [
>      {
>        "name": "F1",
>        "columns": [
>          {
>            "qualifier": "M",
>            "measure_refs": [
>              "_COUNT_",
>              "TOTALAMOUNT",
>              "TOTALQTY"
>            ]
>          }
>        ]
>      }
>    ]
>  },
>  "notify_list": []
>}
>
>Logs:
>
>15/03/05 02:09:18 WARN conf.HiveConf: DEPRECATED: Configuration property
>hive.metastore.local no longer has any effect. Make sure to provide a
>valid value for hive.metastore.uris if you are connecting to a remote
>metastore.
>15/03/05 02:09:18 WARN conf.HiveConf: HiveConf of name
>hive.metastore.local does not exist
>Logging initialized using configuration in
>jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-common-0.14.0.jar!/hive
>-log4j.properties
>SLF4J: Class path contains multiple SLF4J bindings.
>SLF4J: Found binding in
>[jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1
>.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>SLF4J: Found binding in
>[jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-jdbc-0.14.0-standalone
>.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>explanation.
>SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>OK
>Time taken: 0.578 seconds
>OK
>Time taken: 0.444 seconds
>FAILED: SemanticException [Error 10004]: Line 4:0 Invalid table alias or
>column reference 'STORE_DIM': (possible column names are:
>sales_fact.storeid, sales_fact.itemid, sales_fact.customerid,
>sales_fact.qty, sales_fact.amount, customer_dim.customerid,
>customer_dim.name, item_dim.itemid, item_dim.type, item_dim.brand,
>item_dim.color)
>
>
>
>Regards,
>Santosh Akhilesh
>Bangalore R&D
>HUAWEI TECHNOLOGIES CO.,LTD.
>
>www.huawei.com
>---------------------------------------------------------------------------
>This e-mail and its attachments contain confidential information from
>HUAWEI, which
>is intended only for the person or entity whose address is listed above.
>Any use of the
>information contained herein in any way (including, but not limited to,
>total or partial
>disclosure, reproduction, or dissemination) by persons other than the
>intended
>recipient(s) is prohibited. If you receive this e-mail in error, please
>notify the sender by
>phone or email immediately and delete it!
>
>________________________________________
>From: Shi, Shaofeng [[email protected]]
>Sent: Wednesday, March 04, 2015 5:36 PM
>To: [email protected]
>Subject: Re: Cube Build failed at Step 3, When I choose Hierarchical
>dimension
>
>It seems that you have a lookup table which doesn't define the join
>relationship; could you paste the full JSON of this cube definition?
>
>On 3/4/15, 3:28 PM, "Santoshakhilesh" <[email protected]> wrote:
>
>>Dear All ,
>>
>>         I am using the 0.6.5 branch of Kylin. I was able to build a cube
>>defining normal and derived measures and play with it.
>>
>>         I have defined a new cube to test hierarchical dimensions, and
>>the cube build failed at Step 3 with the following log in kylin.log.
>>
>>         I have run the query which Kylin provides on the cube's web UI
>>against Hive, and it works.
>>
>>         Please let me know what's going wrong. If any more info is
>>required from me, please let me know.
>>
>>
>>
>>java.lang.NullPointerException
>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>>
>>
>>
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,025][INFO][com.kylinolap.dict.lookup.SnapshotManager.load(Snapsh
>>o
>>tManager.java:156)] - Loading snapshotTable from
>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot,
>>with loadData: false
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,031][INFO][com.kylinolap.dict.lookup.SnapshotManager.buildSnapsh
>>o
>>t(SnapshotManager.java:90)] - Identical input FileSignature
>>[path=file:/hive/warehouse/store_dim/stores.txt, size=60,
>>lastModifiedTime=1425039202000], reuse existing snapshot at
>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,031][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRe
>>s
>>ource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store
>>kylin_metadata_qa@hbase<mailto:kylin_metadata_qa@hbase>)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,035][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTable(MetadataManager.java:258)] - Reloading SourceTable from folder
>>kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info
>>from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,104][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,104][INFO][com.kylinolap.metadata.MetadataManager.reloadAllCubeD
>>e
>>sc(MetadataManager.java:308)] - Reloading Cube Metadata from folder
>>kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,143][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllCube
>>D
>>esc(MetadataManager.java:333)] - Loaded 4 Cube(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,143][INFO][com.kylinolap.metadata.MetadataManager.reloadAllInver
>>t
>>edIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc
>>from folder
>>kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,147][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllInve
>>r
>>tedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index Desc(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,158][INFO][com.kylinolap.cube.cli.DictionaryGeneratorCLI.process
>>S
>>egment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of STORE_DIM
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateDictionaryJob.ru
>>n
>>(CreateDictionaryJob.java:55)] -
>>java.lang.NullPointerException
>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>> at
>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGe
>>n
>>eratorCLI.java:60)
>> at
>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGe
>>n
>>eratorCLI.java:39)
>> at
>>com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob
>>.
>>java:51)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>> at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
>> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>> at
>>org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:
>>5
>>73)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,159][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOutput.appendOutp
>>u
>>t(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,166][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRe
>>s
>>ource(ResourceStore.java:166)] - Saving resource
>>/job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store
>>kylin_metadata_qa@hbase<mailto:kylin_metadata_qa@hbase>)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,174][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRe
>>s
>>ource(ResourceStore.java:166)] - Saving resource
>>/job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store
>>kylin_metadata_qa@hbase<mailto:kylin_metadata_qa@hbase>)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNod
>>e
>>.java:87)] - Job status for
>>cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been
>>updated.
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNod
>>e
>>.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input
>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_column
>>s
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNod
>>e
>>.java:89)] - output:Start to execute command:
>> -cubename NDim -segmentname FULL_BUILD -input
>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_column
>>s
>>Command execute return code 2
>>
>>
>>
>>Regards,
>>Santosh Akhilesh
>>Bangalore R&D
>>HUAWEI TECHNOLOGIES CO.,LTD.
>>
>>www.huawei.com
