Re: create hive context in spark application

2016-03-15 Thread Antonio Si
Thanks Akhil.

Yes, spark-shell works fine.

In my app, I have a RESTful service, and from that service I am
calling the Spark API to run some HiveQL. That's why
I am not using spark-submit.
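
Roughly, the service does something like this (a minimal sketch assuming
Spark 1.x APIs; the object, method, and app names are illustrative, not my
actual code):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveQlService {
  val sparkConf = new SparkConf()
    .setAppName("rest-hiveql-service") // illustrative name
    .setMaster("local[*]")
  val sc = new SparkContext(sparkConf)
  val hiveContext = new HiveContext(sc) // fails as described below

  // Invoked from the RESTful endpoint with the HiveQL to execute.
  def runHiveQl(query: String): Array[org.apache.spark.sql.Row] =
    hiveContext.sql(query).collect()
}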

Thanks.

Antonio.



Re: create hive context in spark application

2016-03-15 Thread Akhil Das
Did you try submitting your application with spark-submit
<http://spark.apache.org/docs/latest/submitting-applications.html>? You can
also try opening a spark-shell and see if it picks up your hive-site.xml.

Thanks
Best Regards



create hive context in spark application

2016-03-15 Thread antoniosi
Hi,

I am trying to connect to a Hive metastore deployed in an Oracle DB. I have
the Hive configuration specified in hive-site.xml, which I put under
$SPARK_HOME/conf. If I run spark-shell, everything works fine: I can create
Hive databases and tables and query the tables.
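
For example, a session like this works against the Oracle-backed metastore
(in a Hive-enabled spark-shell, sqlContext is a HiveContext; the database and
table names below are just illustrative):

scala> sqlContext.sql("CREATE DATABASE IF NOT EXISTS testdb")
scala> sqlContext.sql("CREATE TABLE testdb.people (id INT, name STRING)")
scala> sqlContext.sql("SELECT * FROM testdb.people").show()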

However, when I try to do that in a Spark application running in local
mode, i.e., with
sparkConf.setMaster("local[*]").setSparkHome(<spark installation>),
it does not seem to pick up the hive-site.xml. It still uses the local
Derby Hive metastore instead of the Oracle metastore that I defined in
hive-site.xml. If I add the hive-site.xml explicitly to the classpath, I
get the following error:

Caused by: org.datanucleus.api.jdo.exceptions.TransactionNotActiveException:
Transaction is not active. You either need to define a transaction around
this, or run your PersistenceManagerFactory with 'NontransactionalRead' and
'NontransactionalWrite' set to 'true'
FailedObject:org.datanucleus.exceptions.TransactionNotActiveException:
Transaction is not active. You either need to define a transaction around
this, or run your PersistenceManagerFactory with 'NontransactionalRead' and
'NontransactionalWrite' set to 'true'
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:396)
    at org.datanucleus.api.jdo.JDOTransaction.rollback(JDOTransaction.java:186)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.runTestQuery(MetaStoreDirectSql.java:204)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)

This happens when I try to instantiate a new HiveContext in my code.

How do I ask Spark to look at the hive-site.xml in the $SPARK_HOME/conf
directory in my Spark application?
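
For reference, one programmatic alternative is a sketch like the following,
assuming Spark 1.x and an existing SparkContext sc (the JDBC URL, driver
class, and credentials are placeholders, not my real Oracle settings);
whether it takes effect depends on when the metastore connection is first
established:

import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
// Placeholder Oracle metastore settings; substitute real values.
hiveContext.setConf("javax.jdo.option.ConnectionURL",
  "jdbc:oracle:thin:@//db-host:1521/service")
hiveContext.setConf("javax.jdo.option.ConnectionDriverName",
  "oracle.jdbc.OracleDriver")
hiveContext.setConf("javax.jdo.option.ConnectionUserName", "hiveuser")
hiveContext.setConf("javax.jdo.option.ConnectionPassword", "hivepass")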

Thanks very much. Any pointers will be much appreciated.

Regards,

Antonio.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/create-hive-context-in-spark-application-tp26496.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org