Kylin detects the Hadoop deployment by running shell commands such as `hadoop`,
`hive`, and `hbase`. Make sure the versions you want are on the PATH.

Set SPARK_HOME to make Kylin use a specific Spark installation instead of the
shipped one.
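
For example, the environment could be set up like this before starting Kylin. The paths below are placeholders for a typical layout, not your actual install locations:

```shell
# Put the desired hadoop/hive/hbase client binaries on the PATH
# (example paths -- adjust to your installation).
export PATH=/opt/hadoop/bin:/opt/hive/bin:/opt/hbase/bin:$PATH

# Point Kylin at an existing Spark instead of the bundled one.
export SPARK_HOME=/opt/spark2

# Then start Kylin as usual.
$KYLIN_HOME/bin/kylin.sh start
```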

On Wed, Nov 8, 2017 at 11:59 AM, 崔苗 <[email protected]> wrote:

> we have some problems in hadoop environment:
> we have some problems with the hadoop environment:
> 1. how does kylin find the hadoop environment? if we export e.g.
> HIVE_HOME=/root/hive_2.10-0.10.0.0 in /etc/profile, will that help kylin
> find the hive or hbase environment?
> 2. we have already installed spark2 on the cluster; how can we use this
> spark2 instead of the spark bundled with kylin?