I ran this before, and hive-site.xml actually works this way for me (the 
tricky part happens in the new HiveConf(classOf[SessionState])). Can you 
double check whether hive-site.xml can be loaded from the class path? It is 
supposed to appear at the root of the class path.
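
For reference, a quick way to verify is to look the file up through the 
context class loader (a minimal sketch, assuming you run it in spark-shell or 
in your driver code):

    // If this prints null, hive-site.xml is not visible on the class path,
    // and HiveConf will silently fall back to its defaults (local warehouse).
    val url = Thread.currentThread().getContextClassLoader
      .getResource("hive-site.xml")
    println(url) // expect a file: or jar: URL pointing at your conf directory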

-----Original Message-----
From: nikroy16 [mailto:nikro...@gmail.com] 
Sent: Tuesday, July 29, 2014 12:51 PM
To: u...@spark.incubator.apache.org
Subject: HiveContext is creating metastore warehouse locally instead of in hdfs

Hi,

Even though hive.metastore.warehouse.dir in hive-site.xml is set to the default 
/user/hive/warehouse and the permissions are correct in hdfs, HiveContext seems 
to be creating the metastore warehouse locally instead of in hdfs. After looking 
into the Spark code, I found the following in HiveContext.scala:

  /**
   * SQLConf and HiveConf contracts: when the hive session is first initialized,
   * params in HiveConf will get picked up by the SQLConf. Additionally, any
   * properties set by set() or a SET command inside hql() or sql() will be set
   * in the SQLConf *as well as* in the HiveConf.
   */
  @transient protected[hive] lazy val hiveconf = new HiveConf(classOf[SessionState])

  @transient protected[hive] lazy val sessionState = {
    val ss = new SessionState(hiveconf)
    set(hiveconf.getAllProperties) // Have SQLConf pick up the initial set of HiveConf.
    ss
  }


It seems as though when a HiveContext is created, it is launched without any 
configuration and hive-site.xml is not used to set its properties. It looks like 
I can set properties after creation by using the hql() method, but what I am 
looking for is for the HiveContext to be initialized according to the 
configuration in hive-site.xml at the time it is created. Any help would be 
greatly appreciated!
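
(For what it's worth, the workaround I mentioned looks roughly like this; the 
HDFS URI below is just a placeholder for an actual namenode address:

    import org.apache.spark.sql.hive.HiveContext

    val hiveContext = new HiveContext(sc) // sc is the existing SparkContext
    hiveContext.hql(
      "SET hive.metastore.warehouse.dir=hdfs://namenode:8020/user/hive/warehouse")

but I would rather not have to re-set every property by hand after creation.)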

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/HiveContext-is-creating-metastore-warehouse-locally-instead-of-in-hdfs-tp10838.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
