Hi,

I am trying to connect to a Hive metastore deployed in an Oracle DB. I
have the Hive configuration specified in hive-site.xml, which I put
under $SPARK_HOME/conf. If I run spark-shell, everything works fine: I
can create Hive databases and tables and query the tables.
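
For reference, the metastore connection section of my hive-site.xml
looks roughly like this (the host, service name and credentials below
are placeholders rather than the real values):

<configuration>
  <!-- Oracle thin JDBC URL; host, port and service name are placeholders -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:oracle:thin:@//dbhost:1521/ORCL</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>oracle.jdbc.OracleDriver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepass</value>
  </property>
</configuration>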

However, when I try to do the same in a Spark application running in
local mode, i.e. with
sparkConf.setMaster("local[*]").setSparkHome(<my spark home installation>),
it does not seem to pick up the hive-site.xml. It still uses the local
Derby Hive metastore instead of the Oracle metastore that I defined in
hive-site.xml. If I add the hive-site.xml explicitly to the classpath,
I get the following error:

Caused by: org.datanucleus.api.jdo.exceptions.TransactionNotActiveException: Transaction is not active. You either need to define a transaction around this, or run your PersistenceManagerFactory with 'NontransactionalRead' and 'NontransactionalWrite' set to 'true'
FailedObject:org.datanucleus.exceptions.TransactionNotActiveException: Transaction is not active. You either need to define a transaction around this, or run your PersistenceManagerFactory with 'NontransactionalRead' and 'NontransactionalWrite' set to 'true'
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:396)
    at org.datanucleus.api.jdo.JDOTransaction.rollback(JDOTransaction.java:186)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.runTestQuery(MetaStoreDirectSql.java:204)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
 

This happens when I try to create a new HiveContext in my code.
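
The relevant part of the application is essentially the sketch below
(the object name, app name and Spark home path are placeholders; the
SparkConf and HiveContext calls are the ones I actually use):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveMetastoreTest {
  def main(args: Array[String]): Unit = {
    // Local mode, pointing at my Spark installation
    val sparkConf = new SparkConf()
      .setAppName("HiveMetastoreTest")
      .setMaster("local[*]")
      .setSparkHome("/opt/spark") // stands in for <my spark home installation>

    val sc = new SparkContext(sparkConf)

    // The TransactionNotActiveException above is thrown on this line,
    // while the HiveContext initializes its metastore client
    val hiveContext = new HiveContext(sc)

    hiveContext.sql("SHOW DATABASES").show()
  }
}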

How do I get my Spark application to pick up the hive-site.xml in the
$SPARK_HOME/conf directory?

Thanks very much. Any pointers would be much appreciated.

Regards,

Antonio.


