Re: configure Zeppelin to use existing external hive metastore

2017-08-03 Thread Richard Xin
It works, Thanks Jeff!

On Wednesday, August 2, 2017, 6:24:34 PM PDT, Jeff Zhang wrote:


I suppose the %sql means %spark.sql; in that case you need to modify the
hive-site.xml under SPARK_CONF_DIR.

Richard Xin wrote on Thursday, August 3, 2017 at 9:21 AM:

on AWS EMR
I am trying to bind Zeppelin to an existing Hive metastore. I modified
hive-site.xml under /etc/hive/conf to point to the existing metastore, and I
tested with command-line Hive; it works as expected. But under Zeppelin,
%sql is still connected to the previous Hive metastore, so I added the line
export HIVE_CONF_DIR=/etc/hive/conf
inside zeppelin-env.sh

And /etc/spark/conf's spark-env.sh has the following line by default:
export HIVE_CONF_DIR=${HIVE_CONF_DIR:-/etc/hive/conf}

What did I miss?

Thanks


Re: configure Zeppelin to use existing external hive metastore

2017-08-02 Thread Jeff Zhang
I suppose the %sql means %spark.sql; in that case you need to modify the
hive-site.xml under SPARK_CONF_DIR.
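A minimal sketch of that fix, assuming the EMR default layout described in the question (/etc/hive/conf and /etc/spark/conf); the restart command is an assumption that varies by EMR release, so adjust for your AMI:

```shell
# %spark.sql reads Hive settings from Spark's conf dir (SPARK_CONF_DIR),
# not from HIVE_CONF_DIR, so place the updated hive-site.xml there:
sudo cp /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml

# Then restart the Spark interpreter from the Zeppelin UI, or restart
# Zeppelin itself (command shown is the upstart form; adjust as needed):
sudo stop zeppelin && sudo start zeppelin
```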


Richard Xin wrote on Thursday, August 3, 2017 at 9:21 AM:

> on AWS EMR
> I am trying to bind Zeppelin to an existing Hive metastore. I modified
> hive-site.xml under /etc/hive/conf to point to the existing metastore, and
> I tested with command-line Hive; it works as expected.
> But under Zeppelin, %sql is still connected to the previous Hive metastore,
> so I added the line
> export HIVE_CONF_DIR=/etc/hive/conf
> inside zeppelin-env.sh
>
>
> And /etc/spark/conf's spark-env.sh has the following line by default:
> export HIVE_CONF_DIR=${HIVE_CONF_DIR:-/etc/hive/conf}
>
> What did I miss?
>
> Thanks
>


configure Zeppelin to use existing external hive metastore

2017-08-02 Thread Richard Xin
on AWS EMR
I am trying to bind Zeppelin to an existing Hive metastore. I modified
hive-site.xml under /etc/hive/conf to point to the existing metastore, and I
tested with command-line Hive; it works as expected. But under Zeppelin,
%sql is still connected to the previous Hive metastore, so I added the line
export HIVE_CONF_DIR=/etc/hive/conf
inside zeppelin-env.sh

And /etc/spark/conf's spark-env.sh has the following line by default:
export HIVE_CONF_DIR=${HIVE_CONF_DIR:-/etc/hive/conf}

What did I miss?

Thanks
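For reference, pointing a hive-site.xml at an external metastore typically means setting hive.metastore.uris; the host below is a placeholder, not a value from this thread (9083 is the conventional metastore Thrift port):

```xml
<!-- hive-site.xml fragment (sketch; replace the host with your metastore's) -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://metastore-host.example.com:9083</value>
</property>
```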