"No suitable driver found" error when creating a table in Hive from Spark SQL

I am trying to execute the following example from the Spark Git repository:
spark/examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala

My setup: Hadoop 1.6, Spark 1.2, Hive 1.0, and a MySQL server (installed via
yum install mysql55w mysql55w-server).

I can create and query tables in Hive from the Hive command prompt:

hive> select * from person_parquet;
OK
Barack  Obama   M
Bill    Clinton M
Hillary Clinton F
Time taken: 1.945 seconds, Fetched: 3 row(s)

I am starting the Spark shell with the following command:

./spark-1.2.0-bin-hadoop2.4/bin/spark-shell --master spark://sparkmaster.company.com:7077 --jars /data/mysql-connector-java-5.1.14-bin.jar
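
Should the connector jar also be on the driver classpath? My (possibly wrong)
understanding is that the metastore connection is opened in the driver JVM,
and that java.sql.DriverManager does not pick up jars added only via --jars,
so I am wondering whether a variant like the following would behave
differently (I have not verified that it does):

./spark-1.2.0-bin-hadoop2.4/bin/spark-shell --master spark://sparkmaster.company.com:7077 --driver-class-path /data/mysql-connector-java-5.1.14-bin.jar --jars /data/mysql-connector-java-5.1.14-bin.jar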

The MySQL driver class loads fine inside the shell:

scala> Class.forName("com.mysql.jdbc.Driver")
res0: Class[_] = class com.mysql.jdbc.Driver

scala> Class.forName("com.mysql.jdbc.Driver").newInstance
res1: Any = com.mysql.jdbc.Driver@2dec8e27
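
Is there a better way to check what DriverManager itself can see? I assume
(untested) that enumerating the registered drivers from the shell would look
like this, though I understand getDrivers only lists drivers visible to the
caller's classloader:

scala> val drivers = java.sql.DriverManager.getDrivers
scala> while (drivers.hasMoreElements) println(drivers.nextElement.getClass.getName)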

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext =
org.apache.spark.sql.hive.HiveContext@32ecf100

scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
15/02/18 22:23:01 INFO parse.ParseDriver: Parsing command: CREATE TABLE IF NOT EXISTS src (key INT, value STRING)
15/02/18 22:23:02 INFO parse.ParseDriver: Parse Completed
15/02/18 22:23:02 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/02/18 22:23:02 INFO metastore.ObjectStore: ObjectStore, initialize called
15/02/18 22:23:02 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/02/18 22:23:02 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/02/18 22:23:02 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/02/18 22:23:02 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/02/18 22:23:02 ERROR Datastore.Schema: Failed initialising database.
No suitable driver found for jdbc:mysql://sparkmaster.company.com:3306/hive
org.datanucleus.exceptions.NucleusDataStoreException: No suitable driver found for jdbc:mysql://sparkmaster.company.com:3306/hive
        at
org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:516)
        at
org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
        at
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
        at
org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
        at
org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
        at
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
        at
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at
org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
        at
org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
        at
org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
        at
org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
        at
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at
org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
        at
org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
        at
org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
        at
org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
        at
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
        at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
        at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
        at
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
        at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
        at
org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235)
        at
org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231)
        at scala.Option.orElse(Option.scala:257)
        at
org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231)
        at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229)
        at
org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229)
        at
org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229)
        at
org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:54)
        at
org.apache.spark.sql.hive.HiveContext$$anon$2.<init>(HiveContext.scala:253)
        at
org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:253)
        at
org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:253)
        at
org.apache.spark.sql.hive.HiveContext$$anon$4.<init>(HiveContext.scala:263)
        at
org.apache.spark.sql.hive.HiveContext.analyzer$lzycompute(HiveContext.scala:263)
        at
org.apache.spark.sql.hive.HiveContext.analyzer(HiveContext.scala:262)
        at
org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:411)
        at
org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:411)
        at
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
        at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
        at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
        at $line11.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
        at $line11.$read$$iwC$$iwC$$iwC.<init>(<console>:20)
        at $line11.$read$$iwC$$iwC.<init>(<console>:22)
        at $line11.$read$$iwC.<init>(<console>:24)
        at $line11.$read.<init>(<console>:26)
        at $line11.$read$.<init>(<console>:30)
        at $line11.$read$.<clinit>(<console>)
        at $line11.$eval$.<init>(<console>:7)
        at $line11.$eval$.<clinit>(<console>)
        at $line11.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
        at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
        at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
        at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
        at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
        at
org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
        at
org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
        at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
        at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.sql.SQLException: No suitable driver found for jdbc:mysql://sparkmaster.company.com:3306/hive
        at java.sql.DriverManager.getConnection(DriverManager.java:596)
        at java.sql.DriverManager.getConnection(DriverManager.java:187)
        at
org.datanucleus.store.rdbms.datasource.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:78)
        at
org.datanucleus.store.rdbms.datasource.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
        at
org.datanucleus.store.rdbms.datasource.dbcp.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1158)
        at
org.datanucleus.store.rdbms.datasource.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:108)
        at
org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
        ... 108 more
Nested Throwables StackTrace:
java.sql.SQLException: No suitable driver found for jdbc:mysql://sparkmaster.company.com:3306/hive
        [identical stack trace to the one above, elided]
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
        at
org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235)
        at
org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231)
        at scala.Option.orElse(Option.scala:257)
        at
org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231)
        at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229)
        at
org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229)
        at
org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229)
        at
org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:54)
        at
org.apache.spark.sql.hive.HiveContext$$anon$2.<init>(HiveContext.scala:253)
        at
org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:253)
        at
org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:253)
        at
org.apache.spark.sql.hive.HiveContext$$anon$4.<init>(HiveContext.scala:263)
        at
org.apache.spark.sql.hive.HiveContext.analyzer$lzycompute(HiveContext.scala:263)
        at
org.apache.spark.sql.hive.HiveContext.analyzer(HiveContext.scala:262)
        at
org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:411)
        at
org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:411)
        at
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
        at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
        at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
        at $iwC$$iwC$$iwC.<init>(<console>:20)
        at $iwC$$iwC.<init>(<console>:22)
        at $iwC.<init>(<console>:24)
        at <init>(<console>:26)
        at .<init>(<console>:30)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
        at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
        at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
        at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
        at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
        at
org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
        at
org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
        at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
        at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
        at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
        at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
        at
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
        at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
        ... 59 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
        ... 64 more
Caused by: javax.jdo.JDOFatalDataStoreException: No suitable driver found for jdbc:mysql://sparkmaster.company.com:3306/hive
NestedThrowables:
java.sql.SQLException: No suitable driver found for jdbc:mysql://sparkmaster.company.com:3306/hive
        at
org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:436)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
        at
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
        at
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at
org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
        at
org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
        at
org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
        at
org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
        at
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at
org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
        at
org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
        at
org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
        at
org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
        at
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
        ... 69 more
Caused by: java.sql.SQLException: No suitable driver found for jdbc:mysql://sparkmaster.company.com:3306/hive
        at java.sql.DriverManager.getConnection(DriverManager.java:596)
        at java.sql.DriverManager.getConnection(DriverManager.java:187)
        at
org.datanucleus.store.rdbms.datasource.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:78)
        at
org.datanucleus.store.rdbms.datasource.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
        at
org.datanucleus.store.rdbms.datasource.dbcp.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1158)
        at
org.datanucleus.store.rdbms.datasource.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:108)
        at
org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
        at
org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
        at
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
        at
org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
        at
org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
        ... 98 more
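
For reference, the JDBC URL in the error matches what I have configured for
the metastore in hive-site.xml, along these lines (credentials omitted):

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://sparkmaster.company.com:3306/hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>

Is --jars enough to make the connector visible to whatever reads this config
in the driver, or do I need something like spark.driver.extraClassPath or a
copy of the jar in Spark's lib directory?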



