To be honest, we barely had the bandwidth to test on CDH;
I would recommend the latest CDH.
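
(For reference, a minimal sketch of how to confirm that a candidate sandbox
ships the component versions Kylin expects before installing; these are the
standard Hadoop/HBase/Hive CLIs, and the output format varies by
distribution:)

  hadoop version       # bundled Hadoop release
  hbase version        # bundled HBase release
  hive --version       # bundled Hive release; Kylin needs at least 0.14, as noted below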

On Fri, May 8, 2015 at 3:54 PM, Abhijit Deka <[email protected]>
wrote:

> Please let me know which version of CDH I should download (tested with all
> the Kylin components).
>
> Regards
>
> Abhijit Deka
> Computer Scientist
> Adobe Systems
> Bangalore
>
> Ph-+91 80884 39067
>
>
>
>   On Friday, 8 May 2015 12:33 PM, hongbin ma <[email protected]> wrote:
>
>
> Hi,
> CDH 5.1 ships Hive 0.12, but we require at least Hive 0.14.
> To save yourself all the trouble, you can switch to HDP 2.2 if possible.
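>
> (A quick way to check what is actually on the box, assuming the stock CDH
> package layout under /usr/lib; the path may differ on your image. The
> "Found class ... but interface was expected" error below comes from the
> HCatalog ReaderContext type, so an older hive-hcatalog jar on the
> classpath is the likely culprit:)
>
>   hive --version
>   ls /usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core*.jar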
>
> On Fri, May 8, 2015 at 2:52 PM, Abhijit Deka <[email protected]>
> wrote:
>
> Hi,
>
> That issue is resolved now that I moved away from the external table,
>
> but I am getting a new error now. In the same build step I get this:
>
> java.lang.IncompatibleClassChangeError: Found class org.apache.hive.hcatalog.data.transfer.ReaderContext, but interface was expected
>       at org.apache.kylin.dict.lookup.HiveTableReader.initialize(HiveTableReader.java:85)
>       at org.apache.kylin.dict.lookup.HiveTableReader.<init>(HiveTableReader.java:74)
>       at org.apache.kylin.dict.lookup.HiveTableReader.<init>(HiveTableReader.java:60)
>       at org.apache.kylin.dict.lookup.HiveTable.getReader(HiveTable.java:66)
>       at org.apache.kylin.dict.DictionaryGenerator.loadColumnValues(DictionaryGenerator.java:173)
>       at org.apache.kylin.dict.DictionaryGenerator.buildDictionary(DictionaryGenerator.java:112)
>       at org.apache.kylin.dict.DictionaryManager.buildDictionary(DictionaryManager.java:172)
>       at org.apache.kylin.cube.CubeManager.buildDictionary(CubeManager.java:154)
>       at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:53)
>       at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:42)
>       at org.apache.kylin.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:53)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>       at org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:63)
>       at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>       at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
>       at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>       at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:132)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
>
> Do you know what is causing this error?
>
> Regards
>
> Abhijit Deka
> Computer Scientist
> Adobe Systems
> Bangalore
>
> Ph-+91 80884 39067
>
>
>
>   On Friday, 8 May 2015 11:09 AM, hongbin ma <[email protected]> wrote:
>
>
> What do you get if you run
> hadoop fs -cat /user/hive/warehouse/glaastest_dim_vendor ?
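>
> (If that fails, a small sketch for cross-checking the location Hive reports
> for the table against what actually exists on HDFS; the table name is the
> one from this thread:)
>
>   hive -e "DESCRIBE FORMATTED glaastest_dim_vendor;" | grep -i location
>   hadoop fs -ls /user/hive/warehouse/glaastest_dim_vendor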
>
> On Fri, May 8, 2015 at 1:37 PM, Abhijit Deka <[email protected]>
> wrote:
>
> Hi Shi,
>
> It's Hive.
>
> Regards
>
> Abhijit Deka
> Computer Scientist
> Adobe Systems
> Bangalore
>
> Ph-+91 80884 39067
>
>
>
>   On Friday, 8 May 2015 10:09 AM, "Shi, Shaofeng" <[email protected]>
> wrote:
>
>
> What’s the owner info for
> hdfs://quickstart.cloudera:8020/user/hive/warehouse/glaastest_dim_vendor ?
> Didn’t see it here.
>
> On 5/8/15, 12:35 PM, "Abhijit Deka" <[email protected]> wrote:
>
> >Hi Bin,
> >Thanks for the reply. I already mentioned that I have verified this and
> >the file exists in HDFS, but the owner is hive, not cloudera.
> >drwxrwxrwx  - hive    hive                0 2015-05-07 04:22
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/glaastest_dim_product
> >drwxrwxrwx  - hive    hive                0 2015-05-07 04:20
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/glaastest_fact
> >drwxrwxrwx  - cloudera supergroup          0 2015-05-04 07:18
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/kylin_cal_dt
> >drwxrwxrwx  - cloudera supergroup          0 2015-05-04 07:18
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/kylin_category_groupings
> >drwxrwxrwx  - cloudera supergroup          0 2015-05-04 07:18
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/kylin_sales
> >
> >For the example tables bundled with Kylin the owning user is cloudera. Is it some
> >kind of permission issue I am running into? Meanwhile I'll try to run it with a
> >regular Hive table (not external).
> >
> > Regards
> >Abhijit Deka
> >Computer Scientist
> >Adobe Systems
> >Bangalore
> >Ph-+91 80884 39067
> >
> >
> >    On Friday, 8 May 2015 6:59 AM, hongbin ma <[email protected]>
> >wrote:
> >
> >
> > For all the lookup tables, for example glaastest_dim_vendor in your
> >case, Kylin asks Hive for the table's HDFS location and then tries to read
> >the HDFS files there directly.
> >It seems Hive returned the location
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/glaastest_dim_vendor,
> >but that file does not exist. Can you verify why Hive returned such a
> >location (as you described, it is an external table over HBase)?
> >An alternative is to convert all the lookup tables into
> >non-external tables.
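> >
> >(A minimal sketch of that conversion, assuming the lookup table is the
> >HBase-backed external table described in this thread; glaastest_dim_vendor_hive
> >is a made-up name for the materialized copy, and the original table is left
> >untouched:)
> >
> >  hive -e "CREATE TABLE glaastest_dim_vendor_hive STORED AS TEXTFILE AS SELECT * FROM glaastest_dim_vendor;"
> >
> >(The new table then has real files under /user/hive/warehouse, which is what
> >Kylin reads; the cube's lookup table would need to point at the new name.)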
> >On Thu, May 7, 2015 at 4:40 PM, Abhijit Deka
> ><[email protected]> wrote:
> >
> >Forgot to add the machine info. I am running it in the CDH 5.1.0
> >VirtualBox VM, not a cluster.
> > Regards
> >Abhijit Deka
> >Computer Scientist
> >Adobe Systems
> >Bangalore
> >Ph-+91 80884 39067
> >
> >
> >    On Thursday, 7 May 2015 1:36 AM, Abhijit Deka
> ><[email protected]> wrote:
> >
> >
> > Hi,
> >I am new to Kylin and I was trying out a simple cube. I did the steps
> >below. Please let me know if I have done anything wrong.
> >1. Did the ETL in Pig.
> >2. Stored the fact and dim tables in HBase.
> >3. Created external tables in Hive to query the data and also to use in Kylin.
> >4. Created the cube as instructed in the documentation.
> >5. Built the cube.
> >But the cube build failed at the 3rd step.
> >Below are the logs:
> >java.io.FileNotFoundException: File hdfs://quickstart.cloudera:8020/user/hive/warehouse/glaastest_dim_vendor does not exist.
> >    at org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal(DistributedFileSystem.java:654)
> >    at org.apache.hadoop.hdfs.DistributedFileSystem.access$600(DistributedFileSystem.java:102)
> >    at org.apache.hadoop.hdfs.DistributedFileSystem$14.doCall(DistributedFileSystem.java:712)
> >    at org.apache.hadoop.hdfs.DistributedFileSystem$14.doCall(DistributedFileSystem.java:708)
> >    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> >    at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:708)
> >    at org.apache.kylin.dict.lookup.HiveTable.findOnlyFile(HiveTable.java:116)
> >    at org.apache.kylin.dict.lookup.HiveTable.computeHDFSLocation(HiveTable.java:107)
> >    at org.apache.kylin.dict.lookup.HiveTable.getHDFSLocation(HiveTable.java:83)
> >    at org.apache.kylin.dict.lookup.HiveTable.getFileTable(HiveTable.java:76)
> >    at org.apache.kylin.dict.lookup.HiveTable.getSignature(HiveTable.java:71)
> >    at org.apache.kylin.dict.DictionaryManager.buildDictionary(DictionaryManager.java:164)
> >    at org.apache.kylin.cube.CubeManager.buildDictionary(CubeManager.java:154)
> >    at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:53)
> >    at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:42)
> >    at org.apache.kylin.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:53)
> >    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> >    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> >    at org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:63)
> >    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
> >    at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
> >    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
> >    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:132)
> >    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >    at java.lang.Thread.run(Thread.java:745)
> >Now when I check for the file in HDFS, the file is there, but it looks like
> >a permission issue (??):
> >drwxrwxrwx  - hive    hive                0 2015-05-07 04:22
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/glaastest_dim_product
> >drwxrwxrwx  - hive    hive                0 2015-05-07 04:20
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/glaastest_fact
> >drwxrwxrwx  - cloudera supergroup          0 2015-05-04 07:18
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/kylin_cal_dt
> >drwxrwxrwx  - cloudera supergroup          0 2015-05-04 07:18
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/kylin_category_groupings
> >drwxrwxrwx  - cloudera supergroup          0 2015-05-04 07:18
> >hdfs://quickstart.cloudera:8020/user/hive/warehouse/kylin_sales
> >How do I solve this issue? Thanks in advance.
> > Regards
> >Abhijit Deka
> >Computer Scientist
> >Adobe Systems
> >Bangalore
> >
> >
> >
> >
> >
> >
> >--
> >Regards,
> >Bin Mahone | 马洪宾
> >Apache Kylin: http://kylin.io
> >Github: https://github.com/binmahone
> >
> >
>
>
>
>
>
>
> --
> Regards,
>
> *Bin Mahone | 马洪宾*
> Apache Kylin: http://kylin.io
> Github: https://github.com/binmahone
>
>
>
>
>
> --
> Regards,
>
> *Bin Mahone | 马洪宾*
> Apache Kylin: http://kylin.io
> Github: https://github.com/binmahone
>
>
>


-- 
Regards,

*Bin Mahone | 马洪宾*
Apache Kylin: http://kylin.io
Github: https://github.com/binmahone
