Yes, you can install the Hadoop cluster by yourself; would you please run $KYLIN_HOME/bin/find-hive-dependency.sh and copy the output here? BTW, where did you see this error, in kylin.log or in a map-reduce job's log?
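
For reference, here is roughly what I mean (a sketch only; the output file and the grep pattern are just assumptions, your classpath layout may differ):

    # run Kylin's Hive dependency probe and keep the output
    $KYLIN_HOME/bin/find-hive-dependency.sh > /tmp/hive-dependency.out 2>&1

    # check which hive-hcatalog-core jar ends up on the classpath
    grep -o '[^:,[:space:]]*hive-hcatalog-core[^:,[:space:]]*\.jar' /tmp/hive-dependency.out

If the jar it picks up is an old hcatalog-core built against Hadoop 1.x, that would explain the error, because org.apache.hadoop.mapreduce.JobContext was a class in Hadoop 1 and became an interface in Hadoop 2.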
On 5/18/15, 2:48 PM, "190184501" <[email protected]> wrote:
>hi, i use apache hive 0.14 and i did not install hcatalog separately; hcatalog has been
>integrated in hive since hive 0.11.
>
>since i use apache hive 0.14, i think hcatalog is installed already,
>and the hcatalog jar location is:
>apache-hive-0.14.0-bin\hcatalog\share\hcatalog\hive-hcatalog-core-0.14.0.jar,
>so i think the hcatalog version is 0.14.
>
>here are my questions:
>1. if i use a plain apache environment, e.g. apache hive 0.14, not Hortonworks,
>   not Cloudera, should i do any extra configuration for hcatalog?
>
>2. how do i know the version of HCatalog?
>
>------------------ Original ------------------
>From: "Shi, Shaofeng";<[email protected]>;
>Date: Mon, May 18, 2015 02:34 PM
>To: "[email protected]"<[email protected]>;
>
>Subject: Re: build kylin with a sample Cube error for help
>
>what's the version of HCatalog that you installed? This error indicates the
>hcatalog-core jar is old; I suggest upgrading to v0.14.
>
>On 5/18/15, 2:21 PM, "190184501" <[email protected]> wrote:
>
>>can someone help me? this problem has troubled me for several days.
>>
>>------------------ Original ------------------
>>From: "190184501";<[email protected]>;
>>Date: Mon, May 18, 2015 01:56 PM
>>To: "dev"<[email protected]>;
>>
>>Subject: build kylin with a sample Cube error for help
>>
>>hi all,
>>
>>I built the sample Cube and the error is:
>>
>>java.lang.IncompatibleClassChangeError: Found interface
>>org.apache.hadoop.mapreduce.JobContext, but class was expected
>>  at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:102)
>>
>>is my environment wrong?
>>
>>the environment is:
>>
>>apache hive 0.14
>>apache hadoop 2.4
>>apache hbase 0.98
>>apache zookeeper 3.4.5
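
Regarding question 2 above: since Hive 0.11 HCatalog ships inside the Hive distribution, so its version is simply the version of the bundled jar. For example (assuming HIVE_HOME points at your apache-hive-0.14.0-bin directory):

    # list the bundled HCatalog jar; the version is part of the file name
    ls $HIVE_HOME/hcatalog/share/hcatalog/hive-hcatalog-core-*.jar
    # e.g. hive-hcatalog-core-0.14.0.jar means HCatalog 0.14.0

So the HCatalog you ship with Hive is 0.14; the thing to check is whether an older hcatalog-core jar from somewhere else is also on the job's classpath.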
