[ https://issues.apache.org/jira/browse/HIVE-22141?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16916024#comment-16916024 ]

Rajkumar Singh commented on HIVE-22141:
---------------------------------------

It seems that you are using an embedded metastore with the Spark context, and it failed because the Derby driver is not available on the classpath:
{code}
java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.derby.jdbc.AutoloadedDriver40
{code}
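
If that is the cause, one possible workaround (a sketch only -- the jar path, Derby version, and assembly jar name below are placeholders, not taken from your environment) is to make the Derby jar visible to the driver when submitting the job:

{code}
# Illustrative only: adjust the Derby jar path/version to your Hive installation.
spark-submit \
  --jars /path/to/derby-10.10.2.0.jar \
  --driver-class-path /path/to/derby-10.10.2.0.jar \
  --class com.sky.strategic_data.skyq.JobRunner \
  your-etl-assembly.jar
{code}

Alternatively, configure the job to use a remote metastore (set hive.metastore.uris in hive-site.xml) so the embedded Derby metastore is not used at all.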

> Change to an existing code has resulted in some errors
> ------------------------------------------------------
>
>                 Key: HIVE-22141
>                 URL: https://issues.apache.org/jira/browse/HIVE-22141
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Priya
>            Priority: Minor
>
> I have made a minor modification to Scala code that was previously running 
> fine, but after this change I am running into multiple errors. I feel it 
> could be something related to the Hive metastore. Please advise.
>  
> 19/08/21 13:44:32 WARN HiveConf: HiveConf of name hive.metastore.dml.events does not exist
> 19/08/21 13:44:32 WARN HiveConf: HiveConf of name hive.metastore.dml.events does not exist
> 19/08/21 13:44:32 WARN HiveConf: HiveConf of name hive.log.every.n.records does not exist
> 19/08/21 13:44:32 WARN HiveConf: HiveConf of name hive.explain.user does not exist
> 19/08/21 13:44:32 WARN HiveConf: HiveConf of name hive.cbo.costmodel.hdfs.write does not exist
> 19/08/21 13:44:32 WARN HiveConf: HiveConf of name hive.cbo.costmodel.local.fs.write does not exist
> 19/08/21 13:44:32 WARN HiveMetaStore: Retrying creating default database after error: Unexpected exception caught.
> javax.jdo.JDOFatalInternalException: Unexpected exception caught.
>   at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
>   at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>   at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>   at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:406)
>   at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:435)
>   at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:330)
>   at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:286)
>   at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>   at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>   at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56)
>   at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:596)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:574)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:623)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:464)
>   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
>   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5781)
>   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:197)
>   at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1490)
>   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
>   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
>   at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2915)
>   at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2934)
>   at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3159)
>   at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:206)
>   at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:193)
>   at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:303)
>   at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:264)
>   at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:239)
>   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:493)
>   at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
>   at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:327)
>   at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
>   at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
>   at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
>   at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
>   at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>   at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>   at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
>   at com.sky.strategic_data.skyq.conf.ETLConf$.<init>(SparkConf.scala:27)
>   at com.sky.strategic_data.skyq.conf.ETLConf$.<clinit>(SparkConf.scala)
>   at com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.getViewingCards(Mesh_Base_Viewing_Card.scala:22)
>   at com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.$init$(Mesh_Base_Viewing_Card.scala:12)
>   at com.sky.strategic_data.skyq.source.mesh_api.Mesh_Device_Dim.<init>(Mesh_Device_Dim.scala:37)
>   at com.sky.strategic_data.skyq.JobRunner$.matchMeshAPITable(JobRunner.scala:120)
>   at com.sky.strategic_data.skyq.JobRunner$.main(JobRunner.scala:51)
>   at com.sky.strategic_data.skyq.JobRunner.main(JobRunner.scala)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>   at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> NestedThrowablesStackTrace:
> java.lang.reflect.InvocationTargetException
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
>   at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>   at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>   at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>   at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:406)
>   at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:435)
>   at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:330)
>   at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:286)
>   at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>   at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>   at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56)
>   at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:596)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:574)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:623)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:464)
>   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
>   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5781)
>   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:197)
>   at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1490)
>   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
>   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
>   at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2915)
>   at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2934)
>   at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3159)
>   at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:206)
>   at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:193)
>   at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:303)
>   at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:264)
>   at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:239)
>   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:493)
>   at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
>   at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:327)
>   at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
>   at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
>   at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
>   at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
>   at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>   at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>   at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
>   at com.sky.strategic_data.skyq.conf.ETLConf$.<init>(SparkConf.scala:27)
>   at com.sky.strategic_data.skyq.conf.ETLConf$.<clinit>(SparkConf.scala)
>   at com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.getViewingCards(Mesh_Base_Viewing_Card.scala:22)
>   at com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.$init$(Mesh_Base_Viewing_Card.scala:12)
>   at com.sky.strategic_data.skyq.source.mesh_api.Mesh_Device_Dim.<init>(Mesh_Device_Dim.scala:37)
>   at com.sky.strategic_data.skyq.JobRunner$.matchMeshAPITable(JobRunner.scala:120)
>   at com.sky.strategic_data.skyq.JobRunner$.main(JobRunner.scala:51)
>   at com.sky.strategic_data.skyq.JobRunner.main(JobRunner.scala)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>   at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.AutoloadedDriver40
>   at java.lang.Class.forName0(Native Method)
>   at java.lang.Class.forName(Class.java:348)
>   at java.sql.DriverManager.isDriverAllowed(DriverManager.java:556)
>   at java.sql.DriverManager.getConnection(DriverManager.java:661)
>   at java.sql.DriverManager.getConnection(DriverManager.java:208)
>   at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:349)
>   at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
>   at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
>   at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
>   at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>   at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>   at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
>   at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
>   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
>   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
>   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
>   ... 78 more
> 19/08/21 13:44:32 WARN Hive: Failed to register all functions.
> java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>   at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1492)
>   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
>   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
>   at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2915)
>   at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2934)
>   at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3159)
>   at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:206)
>   at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:193)
>   at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:303)
>   at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:264)
>   at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:239)
>   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:493)
>   at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
>   at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:327)
>   at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
>   at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
>   at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
>   at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
>   at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>   at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>   at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
>   at com.sky.strategic_data.skyq.conf.ETLConf$.<init>(SparkConf.scala:27)
>   at com.sky.strategic_data.skyq.conf.ETLConf$.<clinit>(SparkConf.scala)
>   at com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.getViewingCards(Mesh_Base_Viewing_Card.scala:22)
>   at com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.$init$(Mesh_Base_Viewing_Card.scala:12)
>   at com.sky.strategic_data.skyq.source.mesh_api.Mesh_Device_Dim.<init>(Mesh_Device_Dim.scala:37)
>   at com.sky.strategic_data.skyq.JobRunner$.matchMeshAPITable(JobRunner.scala:120)
>   at com.sky.strategic_data.skyq.JobRunner$.main(JobRunner.scala:51)
>   at com.sky.strategic_data.skyq.JobRunner.main(JobRunner.scala)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>   at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.reflect.InvocationTargetException
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1490)
>   ... 46 more
> Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
>   at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
>   at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>   at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>   at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:406)
>   at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:435)
>   at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:330)
>   at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:286)
>   at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>   at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>   at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56)
>   at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:596)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:574)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:627)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:464)
>   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
>   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5781)
>   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:197)
>   at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
>   ... 51 more
> Caused by: java.lang.reflect.InvocationTargetException
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
>   at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>   ... 70 more
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.AutoloadedDriver40
>   at java.lang.Class.forName0(Native Method)
>   at java.lang.Class.forName(Class.java:348)
>   at java.sql.DriverManager.isDriverAllowed(DriverManager.java:556)
>   at java.sql.DriverManager.getConnection(DriverManager.java:661)
>   at java.sql.DriverManager.getConnection(DriverManager.java:208)
>   at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:349)
>   at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
>   at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
>   at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
>   at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>   at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>   at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
>   at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
>   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
>   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
>   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
>   ... 78 more
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1490)
>  at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
>  at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
>  at 
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2915) 
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2934) at 
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:493) 
> at 
> org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at 
> org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
>  at 
> org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:327)
>  at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237) 
> at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441) at 
> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272) at 
> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271) at 
> scala.collection.Iterator$class.foreach(Iterator.scala:727) at 
> scala.collection.AbstractIterator.foreach(Iterator.scala:1157) at 
> scala.collection.IterableLike$class.foreach(IterableLike.scala:72) at 
> scala.collection.AbstractIterable.foreach(Iterable.scala:54) at 
> org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271) at 
> org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90) at 
> org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101) at 
> com.sky.strategic_data.skyq.conf.ETLConf$.<init>(SparkConf.scala:27) at 
> com.sky.strategic_data.skyq.conf.ETLConf$.<clinit>(SparkConf.scala) at 
> com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.getViewingCards(Mesh_Base_Viewing_Card.scala:22)
>  at 
> com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.$init$(Mesh_Base_Viewing_Card.scala:12)
>  at 
> com.sky.strategic_data.skyq.source.mesh_api.Mesh_Device_Dim.<init>(Mesh_Device_Dim.scala:37)
>  at 
> com.sky.strategic_data.skyq.JobRunner$.matchMeshAPITable(JobRunner.scala:120) 
> at com.sky.strategic_data.skyq.JobRunner$.main(JobRunner.scala:51) at 
> com.sky.strategic_data.skyq.JobRunner.main(JobRunner.scala) at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498) at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) 
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) at 
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) at 
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)Caused by: 
> java.lang.NoClassDefFoundError: Could not initialize class 
> org.apache.derby.jdbc.AutoloadedDriver40 at java.lang.Class.forName0(Native 
> Method) at java.lang.Class.forName(Class.java:348) at 
> java.sql.DriverManager.isDriverAllowed(DriverManager.java:556) at 
> java.sql.DriverManager.getConnection(DriverManager.java:661) at 
> java.sql.DriverManager.getConnection(DriverManager.java:208) at 
> com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:349) at 
> com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416) at 
> com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) 
> at 
> org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
>  at 
> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at 
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>  at 
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>  at 
> org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
>  at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356) at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
>  at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
>  at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
>  ... 72 moreException in thread "main" java.lang.ExceptionInInitializerError 
> at 
> com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.getViewingCards(Mesh_Base_Viewing_Card.scala:22)
>  at 
> com.sky.strategic_data.skyq.source.mesh_api.Mesh_Base_Viewing_Card$class.$init$(Mesh_Base_Viewing_Card.scala:12)
>  at 
> com.sky.strategic_data.skyq.source.mesh_api.Mesh_Device_Dim.<init>(Mesh_Device_Dim.scala:37)
>  at 
> com.sky.strategic_data.skyq.JobRunner$.matchMeshAPITable(JobRunner.scala:120) 
> at com.sky.strategic_data.skyq.JobRunner$.main(JobRunner.scala:51) at 
> com.sky.strategic_data.skyq.JobRunner.main(JobRunner.scala) at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498) at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) 
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) at 
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) at 
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)Caused by: 
> java.lang.reflect.InvocationTargetException at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at 
> org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
>  at 
> org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:327)
>  at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237) 
> at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441) at 
> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272) at 
> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271) at 
> scala.collection.Iterator$class.foreach(Iterator.scala:727) at 
> scala.collection.AbstractIterator.foreach(Iterator.scala:1157) at 
> scala.collection.IterableLike$class.foreach(IterableLike.scala:72) at 
> scala.collection.AbstractIterable.foreach(Iterable.scala:54) at 
> org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271) at 
> org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90) at 
> org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101) at 
> com.sky.strategic_data.skyq.conf.ETLConf$.<init>(SparkConf.scala:27) at 
> com.sky.strategic_data.skyq.conf.ETLConf$.<clinit>(SparkConf.scala) ... 15 
> moreCaused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable 
> to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient 
> at 
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:512) 
> at 
> org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
>  ... 34 moreCaused by: java.lang.RuntimeException: Unable to instantiate 
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient at 
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1492)
>  at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
>  at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
>  at 
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2915) 
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2934) at 
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:493) 
> ... 35 moreCaused by: java.lang.reflect.InvocationTargetException at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at 
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1490)
>  ... 40 moreCaused by: javax.jdo.JDOFatalInternalException: Unexpected 
> exception caught.NestedThrowables:java.lang.reflect.InvocationTargetException 
> at 
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
>  at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808) at 
> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701) at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:406) at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:435)
>  at 
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:330) 
> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:286) 
> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73) at 
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133) 
> at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56) 
> at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
>  at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:596)
>  at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:574)
>  at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:627)
>  at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:464)
>  at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
>  at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
>  at 
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5781)
>  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:197)
>  at 
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
>  ... 45 moreCaused by: java.lang.reflect.InvocationTargetException at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498) at 
> javax.jdo.JDOHelper$16.run(JDOHelper.java:1965) at 
> java.security.AccessController.doPrivileged(Native Method) at 
> javax.jdo.JDOHelper.invoke(JDOHelper.java:1960) at 
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>  ... 64 moreCaused by: java.lang.NoClassDefFoundError: Could not initialize 
> class org.apache.derby.jdbc.AutoloadedDriver40 at 
> java.lang.Class.forName0(Native Method) at 
> java.lang.Class.forName(Class.java:348) at 
> java.sql.DriverManager.isDriverAllowed(DriverManager.java:556) at 
> java.sql.DriverManager.getConnection(DriverManager.java:661) at 
> java.sql.DriverManager.getConnection(DriverManager.java:208) at 
> com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:349) at 
> com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416) at 
> com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) 
> at 
> org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
>  at 
> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at 
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>  at 
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>  at 
> org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
>  at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356) at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
>  at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
>  at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
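
The bottom-most cause is the same in both nested traces: the Derby JDBC driver class cannot initialize, so DataNucleus never gets a connection and the embedded metastore never comes up. As a sketch of one possible workaround (the jar path and application jar name below are assumptions for illustration, not taken from this report), the Derby jar can be put on the driver classpath explicitly at submit time:

```shell
# Hedged sketch: make the Derby driver visible to the Spark driver,
# which hosts the embedded Hive metastore. Adjust paths to your distribution;
# /path/to/derby.jar and etl-assembly.jar are placeholders, not real paths.
spark-submit \
  --class com.sky.strategic_data.skyq.JobRunner \
  --jars /path/to/derby.jar \
  --driver-class-path /path/to/derby.jar \
  etl-assembly.jar
```

Alternatively, configuring javax.jdo.option.ConnectionURL and javax.jdo.option.ConnectionDriverName in hive-site.xml to point at an external metastore database avoids embedded Derby altogether.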



--
This message was sent by Atlassian Jira
(v8.3.2#803003)