Kylin respects the SPARK_HOME environment variable. Set it to point at the Spark installation you want Kylin to use.
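As a sketch, the exports could look like the following in /etc/profile (or the shell profile of the user that starts Kylin); all paths here are example locations, adjust them to your cluster layout:

```shell
# Example paths only -- point these at your actual installations.
export SPARK_HOME=/opt/spark2         # the cluster's Spark 2, instead of Kylin's bundled Spark
export HIVE_HOME=/opt/hive
export HBASE_HOME=/opt/hbase
export HCAT_HOME=$HIVE_HOME/hcatalog  # HCatalog usually ships inside the Hive install
```

After sourcing the profile, restart Kylin so it picks up the new environment.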

On Wed, Nov 8, 2017 at 2:25 PM, lk_kylin <lk_ky...@163.com> wrote:

> yes, /etc/profile will help Kylin find Hive, HBase, and Spark. You may
> also need to export HCAT_HOME.
>
> 2017-11-08
> ------------------------------
> lk_kylin
> ------------------------------
>
> *From:* 崔苗 <cuim...@danale.com>
> *Sent:* 2017-11-08 13:50
> *Subject:* hadoop environment:
> *To:* "user"<user@kylin.apache.org>
> *Cc:*
>
> we have some problems with our hadoop environment:
> 1. How does Kylin find the Hadoop environment? If we export, for example,
> HIVE_HOME=/root/hive_2.10-0.10.0.0 in /etc/profile, will the profile
> file help Kylin find the Hive and HBase environments?
> 2. We have already installed Spark 2 in the cluster; how can we use this
> Spark 2 instead of the Spark bundled with Kylin?
>
>
