Hello,

I have the following services configured and installed successfully:

Hadoop 2.7.x
Spark 2.0.x
HBase 1.2.4
Hive 1.2.1

*Installation Directories:*

/usr/local/hadoop
/usr/local/spark
/usr/local/hbase

*Hive Environment variables:*

#HIVE VARIABLES START
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
#HIVE VARIABLES END

Since the environment variables are configured, I can access Hive from
anywhere. However, spark-shell and Hive only share the Hive metastore when I
start both of them from /usr/local/hive; if I start spark-shell from any
other directory, Spark creates its own metastore in that directory.

For context: I am reading from HBase and writing to Hive using Spark. I
don't understand why this strange behaviour occurs.
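For reference, my understanding is that Spark picks up metastore settings
from a hive-site.xml on its classpath (e.g. in /usr/local/spark/conf), and
without one the embedded Derby metastore is created in whatever directory
spark-shell is launched from. A minimal sketch of such a file, assuming an
embedded Derby metastore pinned to an absolute path (the paths and values
below are assumptions based on my install layout, not my actual config):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Pin the embedded Derby metastore to an absolute path so it no
       longer depends on the directory spark-shell is started from.
       (Path is an assumption based on the install directories above.) -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=/usr/local/hive/metastore_db;create=true</value>
  </property>
  <!-- Default warehouse location; standard Hive property. -->
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
</configuration>
```

As I understand it, copying this file into /usr/local/spark/conf/ should make
spark-shell use the same metastore regardless of the working directory, but I
have not confirmed this.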

Thanks.
