About Q2: I agree with you, I think this is an *issue*. To start Kylin, the check should only require that at least one Source, one Engine, and one Storage system exist (for example, it should not be necessary to have both Hive and Kafka).
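One way the startup check could be relaxed is to treat components like Spark and Kafka as soft dependencies. A minimal sketch, not Kylin's actual code; the helper name and the example paths in the comment are illustrative:

```shell
# Hypothetical helper for kylin.sh: source a dependency-finder script only
# when the component's home directory exists, and warn instead of aborting
# when it is absent.
optional_dependency() {
    local finder=$1 home_dir=$2 name=$3
    if [ -d "$home_dir" ]; then
        # Component present: source its finder script as kylin.sh does today.
        source "$finder"
    else
        echo "WARN: $name not found at $home_dir; continuing without it"
    fi
}

# In retrieveDependency(), the mandatory checks would stay as-is, while the
# optional ones could become (paths are illustrative):
#   optional_dependency "${dir}/find-spark-dependency.sh" "${SPARK_HOME:-$KYLIN_HOME/spark}" "Spark"
#   optional_dependency "${dir}/find-kafka-dependency.sh" "${KAFKA_HOME}" "Kafka"
```

This would give the same effect as commenting out the `source` line, but without editing kylin.sh by hand every time.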
Example Spark: <https://github.com/apache/kylin/blob/c38def7b53dae81f9fde0520b1fb270804dde728/build/bin/find-spark-dependency.sh#L38>

On 16 June 2017 at 13:17, skyyws <[email protected]> wrote:
> For Q3, you can try to make soft links for both hdfs-site.xml and
> mapred-site.xml.
>
> 2017-06-16
>
> skyyws
>
>
> From: "lxw" <[email protected]>
> Sent: 2017-06-13 11:41
> Subject: Some questions about Kylin 2.0
> To: "dev" <[email protected]>, "user" <[email protected]>
> Cc:
>
> Hi, all:
> I have some questions about Kylin 2.0. My environment:
> hadoop-2.6.0-cdh5.8.3
> hbase-1.2.0-cdh5.8.3
> apache-kylin-2.0.0-bin-cdh57
> spark-2.1.0-bin-hadoop2.6
>
> Q1: Does Kylin 2.0 not support Spark 2.0?
>
> find-spark-dependency.sh:
> spark_dependency=`find -L $spark_home -name
> 'spark-assembly-[a-z0-9A-Z\.-]*.jar' ....
>
> Q2: I want to use Kylin 2.0 without Spark cubing, but failed.
>
> kylin.sh:
> function retrieveDependency() {
>     #retrive $hive_dependency and $hbase_dependency
>     source ${dir}/find-hive-dependency.sh
>     source ${dir}/find-hbase-dependency.sh
>     source ${dir}/find-hadoop-conf-dir.sh
>     source ${dir}/find-kafka-dependency.sh
>     source ${dir}/find-spark-dependency.sh
>
> If the Spark dependency is not found, Kylin cannot start:
>
> [hadoop@hadoop10 bin]$ ./kylin.sh start
> Retrieving hadoop conf dir...
> KYLIN_HOME is set to /home/hadoop/bigdata/kylin/current
> Retrieving hive dependency...
> Retrieving hbase dependency...
> Retrieving hadoop conf dir...
> Retrieving kafka dependency...
> Retrieving Spark dependency...
> spark assembly lib not found.
>
> After modifying kylin.sh to comment out the line
> ("####source ${dir}/find-spark-dependency.sh"), Kylin starts successfully.
>
> Q3: About kylin_hadoop_conf_dir?
> I made some soft links under $KYLIN_HOME/hadoop-conf
> (core-site.xml, yarn-site.xml, hbase-site.xml, hive-site.xml),
> and set
> kylin.env.hadoop-conf-dir=/home/bigdata/kylin/current/hadoop-conf.
> When I execute ./check-env.sh:
>
> [hadoop@hadoop10 bin]$ ./check-env.sh
> Retrieving hadoop conf dir...
> /home/bigdata/kylin/current/hadoop-conf is override as the
> kylin_hadoop_conf_dir
> KYLIN_HOME is set to /home/hadoop/bigdata/kylin/current
> -mkdir: java.net.UnknownHostException: cdh5
> Usage: hadoop fs [generic options] -mkdir [-p] <path> ...
> Failed to create /kylin20. Please make sure the user has right to
> access /kylin20
>
> My HDFS uses HA, and fs.defaultFS is "cdh5". When I don't set
> kylin.env.hadoop-conf-dir and instead rely on the HADOOP_CONF_DIR,
> HIVE_CONF, and HBASE_CONF_DIR environment variables (from /etc/profile),
> it works correctly.
>
> Best Regards!
> lxw
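Following skyyws's suggestion above, the key for Q3 is that `java.net.UnknownHostException: cdh5` means the HA nameservice name could not be resolved, and the nameservice is defined in hdfs-site.xml, which was missing from the hand-made hadoop-conf directory. A minimal sketch of linking in the full set of client configs; the helper name and the example paths are illustrative, not part of Kylin:

```shell
# Hypothetical helper: populate a kylin.env.hadoop-conf-dir directory with
# soft links to the cluster's client configs. Including hdfs-site.xml is
# what lets the HA nameservice name (e.g. "cdh5") resolve.
link_hadoop_conf() {
    local conf_dir=$1 hadoop_conf=$2 hive_conf=$3 hbase_conf=$4
    mkdir -p "$conf_dir"
    for f in core-site.xml hdfs-site.xml yarn-site.xml mapred-site.xml; do
        ln -sfn "$hadoop_conf/$f" "$conf_dir/$f"
    done
    ln -sfn "$hive_conf/hive-site.xml"   "$conf_dir/hive-site.xml"
    ln -sfn "$hbase_conf/hbase-site.xml" "$conf_dir/hbase-site.xml"
}

# Example invocation (paths are illustrative; adjust for your cluster):
# link_hadoop_conf /home/bigdata/kylin/current/hadoop-conf \
#     /etc/hadoop/conf /etc/hive/conf /etc/hbase/conf
```

With hdfs-site.xml (and mapred-site.xml) linked in, the same check-env.sh run should be able to resolve "cdh5" and create /kylin20.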
