Re: spark 1.6.0 connect to hive metastore

2016-03-23 Thread Koert Kuipers
can someone provide the correct settings for spark 1.6.1 to work with cdh 5
(hive 1.1.0)?

in particular the settings for:
spark.sql.hive.version
spark.sql.hive.metastore.jars

also it would be helpful to know if your spark jar includes hadoop
dependencies or not.

i realize it works (or at least seems to work) if you simply set the
spark.sql.hive.version to 1.2.1 and spark.sql.hive.metastore.jars to
builtin, but i find it somewhat unsatisfactory to rely on that happy
coincidence.
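
[Editor's note: a hedged sketch of what such a spark-defaults.conf might look like. In the Spark 1.6 docs the settable property is spark.sql.hive.metastore.version (spark.sql.hive.version reports the builtin version); the CDH jar path below is an assumption for a parcel layout and will vary by install:]

```properties
# sketch only -- property names per the Spark 1.6 SQL guide; the jar
# path is an assumed CDH parcel location, not verified
spark.sql.hive.metastore.version   1.1.0
spark.sql.hive.metastore.jars      /opt/cloudera/parcels/CDH/lib/hive/lib/*
```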


On Sat, Mar 12, 2016 at 7:09 PM, Timur Shenkao  wrote:

> I had similar issue with CDH 5.5.3.
> Not only with Spark 1.6 but with beeline as well.
> I resolved it by installing and running a hiveserver2 role instance on the
> same server where the metastore is.
>
> On Tue, Feb 9, 2016 at 10:58 PM, Koert Kuipers  wrote:
>
>> has anyone successfully connected to hive metastore using spark 1.6.0? i
>> am having no luck. worked fine with spark 1.5.1 for me. i am on cdh 5.5 and
>> launching spark with yarn.
>>
>> this is what i see in logs:
>> 16/02/09 14:49:12 INFO hive.metastore: Trying to connect to metastore
>> with URI thrift://metastore.mycompany.com:9083
>> 16/02/09 14:49:12 INFO hive.metastore: Connected to metastore.
>>
>> and then a little later:
>>
>> 16/02/09 14:49:34 INFO hive.HiveContext: Initializing execution hive,
>> version 1.2.1
>> 16/02/09 14:49:34 INFO client.ClientWrapper: Inspected Hadoop version:
>> 2.6.0-cdh5.4.4
>> 16/02/09 14:49:34 INFO client.ClientWrapper: Loaded
>> org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.4.4
>> 16/02/09 14:49:34 WARN conf.HiveConf: HiveConf of name
>> hive.server2.enable.impersonation does not exist
>> 16/02/09 14:49:35 INFO metastore.HiveMetaStore: 0: Opening raw store with
>> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
>> 16/02/09 14:49:35 INFO metastore.ObjectStore: ObjectStore, initialize
>> called
>> 16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
>> hive.metastore.integral.jdo.pushdown unknown - will be ignored
>> 16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
>> datanucleus.cache.level2 unknown - will be ignored
>> 16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
>> present in CLASSPATH (or one of dependencies)
>> 16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
>> present in CLASSPATH (or one of dependencies)
>> 16/02/09 14:49:37 WARN conf.HiveConf: HiveConf of name
>> hive.server2.enable.impersonation does not exist
>> 16/02/09 14:49:37 INFO metastore.ObjectStore: Setting MetaStore object
>> pin classes with
>> hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
>> 16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:40 INFO metastore.MetaStoreDirectSql: Using direct SQL,
>> underlying DB is DERBY
>> 16/02/09 14:49:40 INFO metastore.ObjectStore: Initialized ObjectStore
>> java.lang.RuntimeException: java.lang.RuntimeException: Unable to
>> instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>>   at
>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>>   at
>> org.apache.spark.sql.hive.client.ClientWrapper.(ClientWrapper.scala:194)
>>   at
>> org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
>>   at
>> org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
>>   at
>> org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
>>   at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:440)
>>   at
>> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
>>   at
>> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
>>   at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>>   at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>>   at org.apache.spark.sql.SQLContext.(SQLContext.scala:271)
>>   at org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:97)
>>   at 

Re: spark 1.6.0 connect to hive metastore

2016-03-12 Thread Timur Shenkao
I had similar issue with CDH 5.5.3.
Not only with Spark 1.6 but with beeline as well.
I resolved it by installing and running a hiveserver2 role instance on the
same server where the metastore is.

On Tue, Feb 9, 2016 at 10:58 PM, Koert Kuipers  wrote:

> has anyone successfully connected to hive metastore using spark 1.6.0? i
> am having no luck. worked fine with spark 1.5.1 for me. i am on cdh 5.5 and
> launching spark with yarn.
>
> this is what i see in logs:
> 16/02/09 14:49:12 INFO hive.metastore: Trying to connect to metastore with
> URI thrift://metastore.mycompany.com:9083
> 16/02/09 14:49:12 INFO hive.metastore: Connected to metastore.
>
> and then a little later:
>
> 16/02/09 14:49:34 INFO hive.HiveContext: Initializing execution hive,
> version 1.2.1
> 16/02/09 14:49:34 INFO client.ClientWrapper: Inspected Hadoop version:
> 2.6.0-cdh5.4.4
> 16/02/09 14:49:34 INFO client.ClientWrapper: Loaded
> org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.4.4
> 16/02/09 14:49:34 WARN conf.HiveConf: HiveConf of name
> hive.server2.enable.impersonation does not exist
> 16/02/09 14:49:35 INFO metastore.HiveMetaStore: 0: Opening raw store with
> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 16/02/09 14:49:35 INFO metastore.ObjectStore: ObjectStore, initialize
> called
> 16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
> hive.metastore.integral.jdo.pushdown unknown - will be ignored
> 16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
> datanucleus.cache.level2 unknown - will be ignored
> 16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
> present in CLASSPATH (or one of dependencies)
> 16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
> present in CLASSPATH (or one of dependencies)
> 16/02/09 14:49:37 WARN conf.HiveConf: HiveConf of name
> hive.server2.enable.impersonation does not exist
> 16/02/09 14:49:37 INFO metastore.ObjectStore: Setting MetaStore object pin
> classes with
> hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
> 16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/02/09 14:49:40 INFO metastore.MetaStoreDirectSql: Using direct SQL,
> underlying DB is DERBY
> 16/02/09 14:49:40 INFO metastore.ObjectStore: Initialized ObjectStore
> java.lang.RuntimeException: java.lang.RuntimeException: Unable to
> instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>   at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>   at
> org.apache.spark.sql.hive.client.ClientWrapper.(ClientWrapper.scala:194)
>   at
> org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
>   at
> org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
>   at
> org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
>   at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:440)
>   at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
>   at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
>   at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>   at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>   at org.apache.spark.sql.SQLContext.(SQLContext.scala:271)
>   at org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:97)
>   at org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:101)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>   at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>   at org.apache.spark.repl.Main$.createSQLContext(Main.scala:89)
>   ... 47 elided
> Caused by: java.lang.RuntimeException: Unable to instantiate
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>   at
> 

Re: spark 1.6.0 connect to hive metastore

2016-03-09 Thread Suniti Singh
spark 1.6.0 in embedded mode doesn't connect to the metastore --
https://issues.apache.org/jira/browse/SPARK-9686

https://forums.databricks.com/questions/6512/spark-160-not-able-to-connect-to-hive-metastore.html


On Wed, Mar 9, 2016 at 10:48 AM, Suniti Singh 
wrote:

> Hi,
>
> I am able to reproduce this error only when using spark 1.6.0 and hive
> 1.6.0. The hive-site.xml is on the classpath, but somehow spark ignores it
> and starts using the default Derby metastore.
>
> 16/03/09 10:37:52 INFO MetaStoreDirectSql: Using direct SQL, underlying DB
> is DERBY
>
> 16/03/09 10:37:52 INFO ObjectStore: Initialized ObjectStore
>
> 16/03/09 10:37:52 WARN Hive: Failed to access metastore. This class should
> not accessed in runtime.
>
> org.apache.hadoop.hive.ql.metadata.HiveException:
> java.lang.RuntimeException: Unable to instantiate
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>
> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
>
> at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
>
> at org.apache.hadoop.hive.ql.metadata.Hive.(Hive.java:166)
>
> at org.apache.hadoop.hive.ql.session.SessionState.start(
> SessionState.java:503)
> at org.apache.spark.sql.hive.client.ClientWrapper.(
> ClientWrapper.scala:194)
>
> On Wed, Mar 9, 2016 at 9:00 AM, Dave Maughan 
> wrote:
>
>> Hi,
>>
>> We're having a similar issue. We have a standalone cluster running 1.5.2
>> with Hive working fine having dropped hive-site.xml into the conf folder.
>> We've just updated to 1.6.0, using the same configuration. Now when
>> starting a spark-shell we get the following:
>>
>> java.lang.RuntimeException: java.lang.RuntimeException: Unable to
>> instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>> at
>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>> at
>> org.apache.spark.sql.hive.client.ClientWrapper.(ClientWrapper.scala:194)
>> at
>> org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
>> at
>> org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
>> at
>> org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
>> at
>> org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:462)
>> at
>> org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:461)
>> at
>> org.apache.spark.sql.UDFRegistration.(UDFRegistration.scala:40)
>> at org.apache.spark.sql.SQLContext.(SQLContext.scala:330)
>> at
>> org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:90)
>> at
>> org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:101)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>> at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>> at
>> org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
>> at $iwC$$iwC.(:15)
>> at $iwC.(:24)
>> at (:26)
>>
>> On stepping through the code and enabling debug it shows that
>> hive.metastore.uris is not set:
>>
>> DEBUG ClientWrapper: Hive Config: hive.metastore.uris=
>>
>> So it looks like it's not finding hive-site.xml? Weirdly, if I remove
>> hive-site.xml the exception does not occur which implies that it WAS on the
>> classpath...
>>
>> Dave
>>
>>
>>
>> On Tue, 9 Feb 2016 at 22:26 Koert Kuipers  wrote:
>>
>>> i do not have phoenix, but i wonder if its something related. will check
>>> my classpaths
>>>
>>> On Tue, Feb 9, 2016 at 5:00 PM, Benjamin Kim  wrote:
>>>
 I got the same problem when I added the Phoenix plugin jar in the
 driver and executor extra classpaths. Do you have those set too?

>>>
 On Feb 9, 2016, at 1:12 PM, Koert Kuipers  wrote:

 yes its not using derby i think: i can see the tables in my actual hive
 metastore.

 i was using a symlink to /etc/hive/conf/hive-site.xml for my
 hive-site.xml which has a lot more stuff than just hive.metastore.uris

 let me try your approach



 On Tue, Feb 9, 2016 at 3:57 PM, Alexandr Dzhagriev 
 wrote:

> I'm using spark 1.6.0, hive 1.2.1 and there is just one property in
> the hive-site.xml hive.metastore.uris Works for me. Can you check in
> the logs, that when the HiveContext is created it connects to the correct
> uri and doesn't use derby.
>
> Cheers, Alex.
>
> On Tue, Feb 9, 2016 at 9:39 PM, Koert Kuipers 

Re: spark 1.6.0 connect to hive metastore

2016-03-09 Thread Suniti Singh
Hi,

I am able to reproduce this error only when using spark 1.6.0 and hive
1.6.0. The hive-site.xml is on the classpath, but somehow spark ignores it
and starts using the default Derby metastore.

16/03/09 10:37:52 INFO MetaStoreDirectSql: Using direct SQL, underlying DB
is DERBY

16/03/09 10:37:52 INFO ObjectStore: Initialized ObjectStore

16/03/09 10:37:52 WARN Hive: Failed to access metastore. This class should
not accessed in runtime.

org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException:
Unable to instantiate
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)

at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)

at org.apache.hadoop.hive.ql.metadata.Hive.(Hive.java:166)

at org.apache.hadoop.hive.ql.session.SessionState.start(
SessionState.java:503)
at org.apache.spark.sql.hive.client.ClientWrapper.(
ClientWrapper.scala:194)
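
[Editor's note: when the classpath question comes up, one quick sanity check is to parse the hive-site.xml you believe Spark is loading and confirm hive.metastore.uris is actually set. A minimal standalone sketch using only the Python standard library; the inline sample XML is hypothetical, so point it at your real /etc/hive/conf/hive-site.xml instead:]

```python
import xml.etree.ElementTree as ET

# hypothetical hive-site.xml content; in practice read the real file,
# e.g. open("/etc/hive/conf/hive-site.xml").read()
SAMPLE = """<?xml version="1.0"?>
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore.mycompany.com:9083</value>
  </property>
</configuration>"""

def metastore_uris(xml_text):
    """Return the hive.metastore.uris value, or None if it is not set."""
    root = ET.fromstring(xml_text)
    for prop in root.findall("property"):
        if prop.findtext("name") == "hive.metastore.uris":
            return prop.findtext("value")
    return None

print(metastore_uris(SAMPLE))  # thrift://metastore.mycompany.com:9083
```

If this prints None (or the file is not where Spark looks for it), the embedded Derby fallback described in this thread is the expected symptom.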

On Wed, Mar 9, 2016 at 9:00 AM, Dave Maughan 
wrote:

> Hi,
>
> We're having a similar issue. We have a standalone cluster running 1.5.2
> with Hive working fine having dropped hive-site.xml into the conf folder.
> We've just updated to 1.6.0, using the same configuration. Now when
> starting a spark-shell we get the following:
>
> java.lang.RuntimeException: java.lang.RuntimeException: Unable to
> instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
> at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
> at
> org.apache.spark.sql.hive.client.ClientWrapper.(ClientWrapper.scala:194)
> at
> org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
> at
> org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
> at
> org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
> at
> org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:462)
> at
> org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:461)
> at
> org.apache.spark.sql.UDFRegistration.(UDFRegistration.scala:40)
> at org.apache.spark.sql.SQLContext.(SQLContext.scala:330)
> at
> org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:90)
> at
> org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:101)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> at
> org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
> at $iwC$$iwC.(:15)
> at $iwC.(:24)
> at (:26)
>
> On stepping through the code and enabling debug it shows that
> hive.metastore.uris is not set:
>
> DEBUG ClientWrapper: Hive Config: hive.metastore.uris=
>
> So it looks like it's not finding hive-site.xml? Weirdly, if I remove
> hive-site.xml the exception does not occur which implies that it WAS on the
> classpath...
>
> Dave
>
>
>
> On Tue, 9 Feb 2016 at 22:26 Koert Kuipers  wrote:
>
>> i do not have phoenix, but i wonder if its something related. will check
>> my classpaths
>>
>> On Tue, Feb 9, 2016 at 5:00 PM, Benjamin Kim  wrote:
>>
>>> I got the same problem when I added the Phoenix plugin jar in the driver
>>> and executor extra classpaths. Do you have those set too?
>>>
>>
>>> On Feb 9, 2016, at 1:12 PM, Koert Kuipers  wrote:
>>>
>>> yes its not using derby i think: i can see the tables in my actual hive
>>> metastore.
>>>
>>> i was using a symlink to /etc/hive/conf/hive-site.xml for my
>>> hive-site.xml which has a lot more stuff than just hive.metastore.uris
>>>
>>> let me try your approach
>>>
>>>
>>>
>>> On Tue, Feb 9, 2016 at 3:57 PM, Alexandr Dzhagriev 
>>> wrote:
>>>
 I'm using spark 1.6.0, hive 1.2.1 and there is just one property in the
 hive-site.xml hive.metastore.uris Works for me. Can you check in the
 logs, that when the HiveContext is created it connects to the correct uri
 and doesn't use derby.

 Cheers, Alex.

 On Tue, Feb 9, 2016 at 9:39 PM, Koert Kuipers 
 wrote:

> hey thanks. hive-site is on classpath in conf directory
>
> i currently got it to work by changing this hive setting in
> hive-site.xml:
> hive.metastore.schema.verification=true
> to
> hive.metastore.schema.verification=false
>
> this feels like a hack, because schema verification is a good thing i
> would assume?
>
> On Tue, Feb 9, 2016 at 3:25 

Re: spark 1.6.0 connect to hive metastore

2016-03-09 Thread Dave Maughan
Hi,

We're having a similar issue. We have a standalone cluster running 1.5.2
with Hive working fine having dropped hive-site.xml into the conf folder.
We've just updated to 1.6.0, using the same configuration. Now when
starting a spark-shell we get the following:

java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at
org.apache.spark.sql.hive.client.ClientWrapper.(ClientWrapper.scala:194)
at
org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
at
org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
at
org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
at
org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:462)
at
org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:461)
at
org.apache.spark.sql.UDFRegistration.(UDFRegistration.scala:40)
at org.apache.spark.sql.SQLContext.(SQLContext.scala:330)
at
org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:90)
at
org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:101)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at
org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.(:15)
at $iwC.(:24)
at (:26)

On stepping through the code and enabling debug it shows that
hive.metastore.uris is not set:

DEBUG ClientWrapper: Hive Config: hive.metastore.uris=

So it looks like it's not finding hive-site.xml? Weirdly, if I remove
hive-site.xml the exception does not occur which implies that it WAS on the
classpath...

Dave
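
[Editor's note: for context, the minimal hive-site.xml described elsewhere in this thread (a single hive.metastore.uris property) looks like the sketch below; the host and port are placeholders:]

```xml
<?xml version="1.0"?>
<!-- minimal hive-site.xml sketch; host/port are placeholders -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore.mycompany.com:9083</value>
  </property>
</configuration>
```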



On Tue, 9 Feb 2016 at 22:26 Koert Kuipers  wrote:

> i do not have phoenix, but i wonder if its something related. will check
> my classpaths
>
> On Tue, Feb 9, 2016 at 5:00 PM, Benjamin Kim  wrote:
>
>> I got the same problem when I added the Phoenix plugin jar in the driver
>> and executor extra classpaths. Do you have those set too?
>>
>
>> On Feb 9, 2016, at 1:12 PM, Koert Kuipers  wrote:
>>
>> yes its not using derby i think: i can see the tables in my actual hive
>> metastore.
>>
>> i was using a symlink to /etc/hive/conf/hive-site.xml for my
>> hive-site.xml which has a lot more stuff than just hive.metastore.uris
>>
>> let me try your approach
>>
>>
>>
>> On Tue, Feb 9, 2016 at 3:57 PM, Alexandr Dzhagriev 
>> wrote:
>>
>>> I'm using spark 1.6.0, hive 1.2.1 and there is just one property in the
>>> hive-site.xml hive.metastore.uris Works for me. Can you check in the
>>> logs, that when the HiveContext is created it connects to the correct uri
>>> and doesn't use derby.
>>>
>>> Cheers, Alex.
>>>
>>> On Tue, Feb 9, 2016 at 9:39 PM, Koert Kuipers  wrote:
>>>
 hey thanks. hive-site is on classpath in conf directory

 i currently got it to work by changing this hive setting in
 hive-site.xml:
 hive.metastore.schema.verification=true
 to
 hive.metastore.schema.verification=false

 this feels like a hack, because schema verification is a good thing i
 would assume?

 On Tue, Feb 9, 2016 at 3:25 PM, Alexandr Dzhagriev 
 wrote:

> Hi Koert,
>
> As far as I can see you are using derby:
>
>  Using direct SQL, underlying DB is DERBY
>
> not mysql, which is used for the metastore. That means, spark couldn't
> find hive-site.xml on your classpath. Can you check that, please?
>
> Thanks, Alex.
>
> On Tue, Feb 9, 2016 at 8:58 PM, Koert Kuipers 
> wrote:
>
>> has anyone successfully connected to hive metastore using spark
>> 1.6.0? i am having no luck. worked fine with spark 1.5.1 for me. i am on
>> cdh 5.5 and launching spark with yarn.
>>
>> this is what i see in logs:
>> 16/02/09 14:49:12 INFO hive.metastore: Trying to connect to metastore
>> with URI thrift://metastore.mycompany.com:9083
>> 16/02/09 14:49:12 INFO hive.metastore: Connected to metastore.
>>
>> and then a little later:
>>
>> 16/02/09 14:49:34 INFO hive.HiveContext: Initializing execution hive,
>> version 1.2.1
>> 16/02/09 14:49:34 INFO client.ClientWrapper: Inspected Hadoop
>> version: 2.6.0-cdh5.4.4
>> 16/02/09 14:49:34 INFO client.ClientWrapper: Loaded
>> 

spark 1.6.0 connect to hive metastore

2016-02-09 Thread Koert Kuipers
has anyone successfully connected to hive metastore using spark 1.6.0? i am
having no luck. worked fine with spark 1.5.1 for me. i am on cdh 5.5 and
launching spark with yarn.

this is what i see in logs:
16/02/09 14:49:12 INFO hive.metastore: Trying to connect to metastore with
URI thrift://metastore.mycompany.com:9083
16/02/09 14:49:12 INFO hive.metastore: Connected to metastore.

and then a little later:

16/02/09 14:49:34 INFO hive.HiveContext: Initializing execution hive,
version 1.2.1
16/02/09 14:49:34 INFO client.ClientWrapper: Inspected Hadoop version:
2.6.0-cdh5.4.4
16/02/09 14:49:34 INFO client.ClientWrapper: Loaded
org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.4.4
16/02/09 14:49:34 WARN conf.HiveConf: HiveConf of name
hive.server2.enable.impersonation does not exist
16/02/09 14:49:35 INFO metastore.HiveMetaStore: 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
16/02/09 14:49:35 INFO metastore.ObjectStore: ObjectStore, initialize called
16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
hive.metastore.integral.jdo.pushdown unknown - will be ignored
16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
datanucleus.cache.level2 unknown - will be ignored
16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
present in CLASSPATH (or one of dependencies)
16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
present in CLASSPATH (or one of dependencies)
16/02/09 14:49:37 WARN conf.HiveConf: HiveConf of name
hive.server2.enable.impersonation does not exist
16/02/09 14:49:37 INFO metastore.ObjectStore: Setting MetaStore object pin
classes with
hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
"embedded-only" so does not have its own datastore table.
16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
"embedded-only" so does not have its own datastore table.
16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
"embedded-only" so does not have its own datastore table.
16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
"embedded-only" so does not have its own datastore table.
16/02/09 14:49:40 INFO metastore.MetaStoreDirectSql: Using direct SQL,
underlying DB is DERBY
16/02/09 14:49:40 INFO metastore.ObjectStore: Initialized ObjectStore
java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
  at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
  at
org.apache.spark.sql.hive.client.ClientWrapper.(ClientWrapper.scala:194)
  at
org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
  at
org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
  at
org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
  at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:440)
  at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
  at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
  at scala.collection.Iterator$class.foreach(Iterator.scala:742)
  at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
  at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
  at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
  at org.apache.spark.sql.SQLContext.(SQLContext.scala:271)
  at org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:97)
  at org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:101)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at org.apache.spark.repl.Main$.createSQLContext(Main.scala:89)
  ... 47 elided
Caused by: java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
  at
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
  at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.(RetryingMetaStoreClient.java:86)
  at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
  at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
  at
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
  at 

Re: spark 1.6.0 connect to hive metastore

2016-02-09 Thread Koert Kuipers
hey thanks. hive-site is on classpath in conf directory

i currently got it to work by changing this hive setting in hive-site.xml:
hive.metastore.schema.verification=true
to
hive.metastore.schema.verification=false

this feels like a hack, because schema verification is a good thing i would
assume?
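
[Editor's note: the workaround described above corresponds to this hive-site.xml fragment. It is a sketch of a workaround, not a fix: disabling verification silences the metastore schema version check rather than resolving the underlying client/metastore version mismatch:]

```xml
<!-- workaround sketch: silences the metastore schema version check -->
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>
```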

On Tue, Feb 9, 2016 at 3:25 PM, Alexandr Dzhagriev  wrote:

> Hi Koert,
>
> As far as I can see you are using derby:
>
>  Using direct SQL, underlying DB is DERBY
>
> not mysql, which is used for the metastore. That means, spark couldn't
> find hive-site.xml on your classpath. Can you check that, please?
>
> Thanks, Alex.
>
> On Tue, Feb 9, 2016 at 8:58 PM, Koert Kuipers  wrote:
>
>> has anyone successfully connected to hive metastore using spark 1.6.0? i
>> am having no luck. worked fine with spark 1.5.1 for me. i am on cdh 5.5 and
>> launching spark with yarn.
>>
>> this is what i see in logs:
>> 16/02/09 14:49:12 INFO hive.metastore: Trying to connect to metastore
>> with URI thrift://metastore.mycompany.com:9083
>> 16/02/09 14:49:12 INFO hive.metastore: Connected to metastore.
>>
>> and then a little later:
>>
>> 16/02/09 14:49:34 INFO hive.HiveContext: Initializing execution hive,
>> version 1.2.1
>> 16/02/09 14:49:34 INFO client.ClientWrapper: Inspected Hadoop version:
>> 2.6.0-cdh5.4.4
>> 16/02/09 14:49:34 INFO client.ClientWrapper: Loaded
>> org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.4.4
>> 16/02/09 14:49:34 WARN conf.HiveConf: HiveConf of name
>> hive.server2.enable.impersonation does not exist
>> 16/02/09 14:49:35 INFO metastore.HiveMetaStore: 0: Opening raw store with
>> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
>> 16/02/09 14:49:35 INFO metastore.ObjectStore: ObjectStore, initialize
>> called
>> 16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
>> hive.metastore.integral.jdo.pushdown unknown - will be ignored
>> 16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
>> datanucleus.cache.level2 unknown - will be ignored
>> 16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
>> present in CLASSPATH (or one of dependencies)
>> 16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
>> present in CLASSPATH (or one of dependencies)
>> 16/02/09 14:49:37 WARN conf.HiveConf: HiveConf of name
>> hive.server2.enable.impersonation does not exist
>> 16/02/09 14:49:37 INFO metastore.ObjectStore: Setting MetaStore object
>> pin classes with
>> hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
>> 16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:40 INFO metastore.MetaStoreDirectSql: Using direct SQL,
>> underlying DB is DERBY
>> 16/02/09 14:49:40 INFO metastore.ObjectStore: Initialized ObjectStore
>> java.lang.RuntimeException: java.lang.RuntimeException: Unable to
>> instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>>   at
>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>>   at
>> org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
>>   at
>> org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
>>   at
>> org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
>>   at
>> org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
>>   at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:440)
>>   at
>> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
>>   at
>> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
>>   at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>>   at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>>   at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
>>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:97)
>>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
>>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>   at
>> 

Re: spark 1.6.0 connect to hive metastore

2016-02-09 Thread Alexandr Dzhagriev
Hi Koert,

As far as I can see you are using derby:

 Using direct SQL, underlying DB is DERBY

not mysql, which is what the metastore normally uses. That means spark couldn't
find hive-site.xml on your classpath. Can you check that, please?
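
One quick way to check, sketched for a typical CDH layout (the paths below are
assumptions, not taken from this thread):

```shell
# Sketch only: conf locations are assumptions for a typical CDH install.

# Spark picks up hive-site.xml from its conf directory; check it is there:
ls -l "$SPARK_HOME/conf/hive-site.xml"

# If it is missing, copy the cluster's Hive client config over:
cp /etc/hive/conf/hive-site.xml "$SPARK_HOME/conf/"

# And confirm it points at the remote metastore rather than local Derby:
grep -A1 hive.metastore.uris "$SPARK_HOME/conf/hive-site.xml"
```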

Thanks, Alex.

On Tue, Feb 9, 2016 at 8:58 PM, Koert Kuipers  wrote:

> has anyone successfully connected to hive metastore using spark 1.6.0? i
> am having no luck. worked fine with spark 1.5.1 for me. i am on cdh 5.5 and
> launching spark with yarn.

Re: spark 1.6.0 connect to hive metastore

2016-02-09 Thread Alexandr Dzhagriev
I'm using spark 1.6.0 and hive 1.2.1, and there is just one property in my
hive-site.xml: hive.metastore.uris. Works for me. Can you check in the logs
that when the HiveContext is created it connects to the correct URI and
doesn't use derby?
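
For concreteness, a minimal hive-site.xml along those lines might look as
follows (the host and port are placeholders):

```xml
<?xml version="1.0"?>
<!-- Minimal sketch: only the metastore URI is set; the value is a placeholder. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore.mycompany.com:9083</value>
  </property>
</configuration>
```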

Cheers, Alex.

On Tue, Feb 9, 2016 at 9:39 PM, Koert Kuipers  wrote:

> hey thanks. hive-site is on classpath in conf directory
>
> i currently got it to work by changing this hive setting in hive-site.xml:
> hive.metastore.schema.verification=true
> to
> hive.metastore.schema.verification=false
>
> this feels like a hack, because schema verification is a good thing i
> would assume?

Re: spark 1.6.0 connect to hive metastore

2016-02-09 Thread Benjamin Kim
I got the same problem when I added the Phoenix plugin jar in the driver and 
executor extra classpaths. Do you have those set too?

> On Feb 9, 2016, at 1:12 PM, Koert Kuipers  wrote:
> 
> yes its not using derby i think: i can see the tables in my actual hive 
> metastore.
> 
> i was using a symlink to /etc/hive/conf/hive-site.xml for my hive-site.xml 
> which has a lot more stuff than just hive.metastore.uris
> 
> let me try your approach

Re: spark 1.6.0 connect to hive metastore

2016-02-09 Thread Koert Kuipers
yes it's not using derby i think: i can see the tables in my actual hive
metastore.

i was using a symlink to /etc/hive/conf/hive-site.xml for my hive-site.xml
which has a lot more stuff than just hive.metastore.uris

let me try your approach
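
For reference, the symlink approach amounts to something like this (the Hive
conf path is the usual CDH location; adjust as needed):

```shell
# Sketch: expose the full Hive client config to Spark via a symlink.
ln -s /etc/hive/conf/hive-site.xml "$SPARK_HOME/conf/hive-site.xml"
```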



On Tue, Feb 9, 2016 at 3:57 PM, Alexandr Dzhagriev  wrote:

> I'm using spark 1.6.0, hive 1.2.1 and there is just one property in the
> hive-site.xml hive.metastore.uris Works for me. Can you check in the
> logs, that when the HiveContext is created it connects to the correct uri
> and doesn't use derby.
>
> Cheers, Alex.

Re: spark 1.6.0 connect to hive metastore

2016-02-09 Thread Jagat Singh
Hi,

I got it working by telling Spark which Hive version we are using. This is
done by setting the following properties:

spark.sql.hive.version
spark.sql.hive.metastore.jars
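
As a sketch, these can be passed on the spark-submit command line; the version,
jar locations, and application jar below are illustrative assumptions for a
CDH 5 cluster, not settings confirmed in this thread:

```shell
# Illustrative only: version, jar paths, and application jar are placeholders.
spark-submit \
  --master yarn \
  --conf spark.sql.hive.version=1.1.0 \
  --conf "spark.sql.hive.metastore.jars=/opt/cloudera/parcels/CDH/lib/hive/lib/*:/opt/cloudera/parcels/CDH/lib/hadoop/client/*" \
  my-app.jar
```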

Thanks


On Wed, Feb 10, 2016 at 7:39 AM, Koert Kuipers  wrote:

> hey thanks. hive-site is on classpath in conf directory
>
> i currently got it to work by changing this hive setting in hive-site.xml:
> hive.metastore.schema.verification=true
> to
> hive.metastore.schema.verification=false
>
> this feels like a hack, because schema verification is a good thing i
> would assume?

Re: spark 1.6.0 connect to hive metastore

2016-02-09 Thread Koert Kuipers
i do not have phoenix, but i wonder if it's something related. will check my
classpaths
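
One way to see whether such extra classpath entries are set (the file location
is an assumption for a typical CDH setup):

```shell
# Sketch: look for driver/executor extraClassPath settings that could pull
# conflicting jars (e.g. a Phoenix plugin) onto the classpath.
grep -i extraClassPath /etc/spark/conf/spark-defaults.conf
```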

On Tue, Feb 9, 2016 at 5:00 PM, Benjamin Kim  wrote:

> I got the same problem when I added the Phoenix plugin jar in the driver
> and executor extra classpaths. Do you have those set too?