Any inputs on this? It's very important to have a large number of columns in the Tableau
worksheet. Please advise how I can achieve it.
Regards,
Manoj
From: Kumar, Manoj H
Sent: Monday, February 05, 2018 9:58 AM
To: 'user@kylin.apache.org'
Subject: RE: optimal parameters
Or is it
Hello,
I am sorry to come back to you again, but I am really stuck. I restarted all
the services and agents on my Cloudera cluster, so it is OK now.
I noticed that some HDFS directories were empty, especially /tmp/kylin. When I
look at the code of sample.sh, I can see that this script
Hi Jean-Luc,
Kylin's metadata is persisted in HBase. The metadata table is
"kylin_metadata" by default, which is configurable in
conf/kylin.properties with the key kylin.metadata.url.
The HDFS path '/kylin/kylin_metadata' is for cube data and other files.
Before you build the first cube, the
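The metadata location described above can be changed in conf/kylin.properties. A minimal sketch, assuming the Kylin 2.x property names (kylin.metadata.url, where the @hbase suffix selects the storage type, and kylin.env.hdfs-working-dir for the HDFS working directory; values shown are the defaults):

```properties
# conf/kylin.properties -- metadata store settings (Kylin 2.x style)
# Default: metadata lives in the HBase table "kylin_metadata"
kylin.metadata.url=kylin_metadata@hbase

# HDFS working directory holding cube data and intermediate files;
# the default yields paths like /kylin/kylin_metadata/...
kylin.env.hdfs-working-dir=/kylin
```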
Hi Manoj,
In this case, splitting the dimensions into two cubes might not work: if the
user selects a dimension in cube1 and another in cube2, neither cube1 nor
cube2 can answer the query.
Adding them all to one cube is doable, but please note the max physical
dimension # (excluding derived columns in lookup
While running the Spark cube build process, I noticed that it takes other cubes'
tables into consideration, rather than only the cube it is building. I am not
sure why it reads the data models of other cubes. We have also noticed that
Spark takes almost the same time as MapReduce.
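When Spark build times match MapReduce, executor sizing is often the first thing to check. A hedged sketch of the relevant overrides, assuming the Kylin 2.x kylin.engine.spark-conf.* pass-through keys (each is handed to Spark as a conf entry; the values below are illustrative, not recommendations):

```properties
# conf/kylin.properties -- Spark engine overrides (illustrative values)
kylin.engine.spark-conf.spark.master=yarn
kylin.engine.spark-conf.spark.submit.deployMode=cluster
kylin.engine.spark-conf.spark.executor.instances=4
kylin.engine.spark-conf.spark.executor.memory=4G
kylin.engine.spark-conf.spark.executor.cores=2
```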
Hi All,
we just set up Kylin (apache-kylin-2.2.0-bin-cdh57.tar.gz) and were
able to run some examples, like kylin_sales_cube, but when we tried to
build a cube on our production data in Hive, we hit an issue that Google
can't help us much with. It seems like a Spark-related issue, but
Thanks Shaofeng. So the max limit of physical dimensions in one cube is 64 (by using
mandatory/hierarchy/joint dimensions).
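The reason the physical dimension count matters is cuboid explosion: each independent dimension doubles the number of cuboids, while mandatory, hierarchy, and joint dimensions prune the combinations. A rough sketch of the arithmetic, assuming the standard Kylin aggregation-group counting rules (a mandatory dimension contributes a factor of 1, a hierarchy of depth k contributes k+1, a joint group contributes 2; the function name is just for illustration):

```python
# Sketch of cuboid counting under Kylin-style aggregation-group rules.
# Assumed rules: each independent (normal) dimension doubles the cuboid
# count; a mandatory dimension adds nothing (factor 1); a hierarchy of
# depth k contributes k+1 choices (each prefix, or absent); a joint
# group is all-or-nothing, contributing a factor of 2.

def cuboid_count(normal=0, hierarchies=(), joint_groups=0):
    total = 2 ** normal
    for depth in hierarchies:     # one factor per hierarchy
        total *= depth + 1
    total *= 2 ** joint_groups    # one factor of 2 per joint group
    return total

# 20 independent dimensions: over a million cuboids.
print(cuboid_count(normal=20))                                  # 1048576

# Same 20 dimensions, reorganized as 6 normal dims, two 5-level
# hierarchies, and one joint group of the remaining 4 dims:
print(cuboid_count(normal=6, hierarchies=(5, 5), joint_groups=1))  # 4608
```

This is why grouping dimensions is usually preferable to splitting them across two cubes: the combination count drops sharply while every query can still be answered from one cube.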
Regards,
Manoj
From: ShaoFeng Shi [mailto:shaofeng...@apache.org]
Sent: Tuesday, February 06, 2018 6:21 AM
To: user
Subject: Re: optimal parameters