[ https://issues.apache.org/jira/browse/SPARK-22660?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16276383#comment-16276383 ]

liyunzhang commented on SPARK-22660:
------------------------------------

When running Spark SQL on the package built above, the following exception is thrown:
{code}
[root@bdpe41 spark-2.3.0-SNAPSHOT-bin-2.7.3]# ./bin/spark-shell --driver-memory 1G
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/zly/spark-2.3.0-SNAPSHOT-bin-2.7.3/jars/hadoop-auth-2.7.3.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2017-12-05 03:03:23,511 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0-SNAPSHOT
      /_/
         
Using Scala version 2.12.4 (Java HotSpot(TM) 64-Bit Server VM, Java 9.0.1)
Type in expressions to have them evaluated.
Type :help for more information.

Spark context Web UI available at http://bdpe41:4040
Spark context available as 'sc' (master = local[*], app id = local-1512414208378).
Spark session available as 'spark'.

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
warning: there was one deprecation warning (since 2.0.0); for details, enable `:setting -deprecation' or `:replay -deprecation'
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@8da0e54

scala> import sqlContext.implicits._
import sqlContext.implicits._

scala> case class Customer(customer_id: Int, name: String, city: String, state: String, zip_code: String)
defined class Customer

scala> val dfCustomers = sc.textFile("/home/zly/spark-2.3.0-SNAPSHOT-bin-2.7.3/customers.txt").map(_.split(",")).map(p => Customer(p(0).trim.toInt, p(1), p(2), p(3), p(4))).toDF()

2017-12-05 03:04:02,647 WARN util.ClosureCleaner: Expected a closure; got org.apache.spark.SparkContext$$Lambda$2237/371823738
2017-12-05 03:04:02,649 WARN util.ClosureCleaner: Expected a closure; got org.apache.spark.SparkContext$$Lambda$2242/539107678
2017-12-05 03:04:02,651 WARN util.ClosureCleaner: Expected a closure; got $line20.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$Lambda$2245/345086812
2017-12-05 03:04:02,654 WARN util.ClosureCleaner: Expected a closure; got $line20.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$Lambda$2246/1829622584
2017-12-05 03:04:03,861 WARN metadata.Hive: Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
        at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
        at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
        at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:488)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:383)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
        at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
        at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
        at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:195)
        at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:12)
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
        at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:195)
        at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:100)
        at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:88)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
        at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$build$2(BaseSessionStateBuilder.scala:293)
        at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
        at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
        at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:70)
        at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:68)
        at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:51)
        at org.apache.spark.sql.Dataset.<init>(Dataset.scala:168)
        at org.apache.spark.sql.Dataset.<init>(Dataset.scala:174)
        at org.apache.spark.sql.Dataset$.apply(Dataset.scala:65)
        at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:485)
        at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:392)
        at org.apache.spark.sql.SQLImplicits.rddToDatasetHolder(SQLImplicits.scala:220)
        at $line20.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:31)
        at $line20.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:35)
        at $line20.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:37)
        at $line20.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:39)
        at $line20.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:41)
        at $line20.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:43)
        at $line20.$read$$iw$$iw$$iw$$iw.<init>(<console>:45)
        at $line20.$read$$iw$$iw$$iw.<init>(<console>:47)
        at $line20.$read$$iw$$iw.<init>(<console>:49)
        at $line20.$read$$iw.<init>(<console>:51)
        at $line20.$read.<init>(<console>:53)
        at $line20.$read$.<init>(<console>:57)
        at $line20.$read$.<clinit>(<console>)
        at $line20.$eval$.$print$lzycompute(<console>:7)
        at $line20.$eval$.$print(<console>:6)
        at $line20.$eval.$print(<console>)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:564)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:733)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:997)
        at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:565)
        at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:34)
        at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:30)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:33)
        at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:564)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:591)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:561)
        at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:869)
        at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:737)
        at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:455)
        at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:476)
        at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:1052)
        at org.apache.spark.repl.Main$.doMain(Main.scala:76)
        at org.apache.spark.repl.Main$.main(Main.scala:56)
        at org.apache.spark.repl.Main.main(Main.scala)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:564)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:843)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:188)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:218)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:127)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
        ... 84 more
Caused by: java.lang.reflect.InvocationTargetException
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:488)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
        ... 90 more
Caused by: java.lang.NoClassDefFoundError: java/sql/SQLException
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<clinit>(JDOPersistenceManagerFactory.java:108)
        at java.base/java.lang.Class.forName0(Native Method)
        at java.base/java.lang.Class.forName(Class.java:375)
        at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
        at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        ... 95 more
Caused by: java.lang.ClassNotFoundException: java.sql.SQLException
        at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:466)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:563)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:221)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:210)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:496)
        ... 122 more
org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:107)
  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:195)
  at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:100)
  at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:88)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$build$2(BaseSessionStateBuilder.scala:293)
  at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
  at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
  at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:70)
  at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:68)
  at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:51)
  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:168)
  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:174)
  at org.apache.spark.sql.Dataset$.apply(Dataset.scala:65)
  at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:485)
  at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:392)
  at org.apache.spark.sql.SQLImplicits.rddToDatasetHolder(SQLImplicits.scala:220)
  ... 47 elided
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
  at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
  at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:488)
  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:383)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:195)
  at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:12)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
  ... 67 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
  at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
  at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
  at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
  ... 81 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.NoClassDefFoundError: Could not initialize class org.datanucleus.api.jdo.JDOPersistenceManagerFactory
  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:488)
  at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
  ... 87 more
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.datanucleus.api.jdo.JDOPersistenceManagerFactory
  at java.base/java.lang.Class.forName0(Native Method)
  at java.base/java.lang.Class.forName(Class.java:375)
  at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
  at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
  at java.base/java.security.AccessController.doPrivileged(Native Method)
  at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
  at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
  at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
  at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
  at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
  at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
  at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
  at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
  at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
  at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
  at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
  at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
  at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
  at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
  at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
  at org.apach

{code}
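
The innermost cause is java.lang.ClassNotFoundException: java.sql.SQLException, thrown from inside IsolatedClientLoader. A plausible reading (an assumption on my part, not verified against the Spark source): the isolated Hive classloader delegates JDK classes to the bootstrap loader, but in JDK 9 the java.sql module is defined to the platform classloader rather than the boot layer, so that delegation no longer finds it. A minimal sketch of the difference, using only standard JDK 9 APIs:
{code}
import java.net.{URL, URLClassLoader}

// A loader with a null parent delegates to the bootstrap loader only,
// which in JDK 9 serves java.base but not the java.sql module.
val bootOnly = new URLClassLoader(Array.empty[URL], null)
bootOnly.loadClass("java.lang.String")          // ok: java.base is in the boot layer
// bootOnly.loadClass("java.sql.SQLException")  // ClassNotFoundException on JDK 9

// The platform classloader (new in JDK 9) does see the java.sql module:
ClassLoader.getPlatformClassLoader().loadClass("java.sql.SQLException")  // ok
{code}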

> Compile with scala-2.12 and JDK9
> --------------------------------
>
>                 Key: SPARK-22660
>                 URL: https://issues.apache.org/jira/browse/SPARK-22660
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 2.2.0
>            Reporter: liyunzhang
>            Priority: Minor
>
> Build with scala-2.12 using the following steps:
> 1. Switch the poms to scala-2.12:
> ./dev/change-scala-version.sh 2.12
> 2. Build with -Pscala-2.12:
> ./dev/make-distribution.sh --tgz -Pscala-2.12 -Phadoop-2.7 -Pyarn -Pparquet-provided -Dhadoop.version=2.7.3
> The build fails with the following errors.
> #Error1
> {code}
> /common/unsafe/src/main/java/org/apache/spark/unsafe/Platform.java:172: error: cannot find symbol
>     Cleaner cleaner = Cleaner.create(buffer, () -> freeMemory(memory));
> {code}
> This is because sun.misc.Cleaner was moved to a new location in JDK 9 (jdk.internal.ref.Cleaner; java.lang.ref.Cleaner is the new public API). HADOOP-12760 will be the long-term fix.
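> As a stopgap, a minimal sketch (an assumed workaround, not the actual patch) is to resolve the Cleaner factory reflectively so one source tree works on JDK 8 (sun.misc.Cleaner) and JDK 9 (jdk.internal.ref.Cleaner; invoking it may also need --add-exports java.base/jdk.internal.ref=ALL-UNNAMED):
> {code}
> import java.lang.reflect.Method
>
> // Probe the JDK 9 location first, then fall back to the JDK 8 one.
> val cleanerCreate: Method =
>   Seq("jdk.internal.ref.Cleaner", "sun.misc.Cleaner")
>     .flatMap { name =>
>       try Some(Class.forName(name).getMethod("create", classOf[Object], classOf[Runnable]))
>       catch { case _: ReflectiveOperationException => None }
>     }
>     .headOption
>     .getOrElse(sys.error("no usable Cleaner on this JDK"))
>
> // cleanerCreate.invoke(null, buffer, runnable) then registers the cleanup
> // action that frees the off-heap memory when the buffer becomes unreachable.
> {code}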
> #Error2
> {code}
> spark_source/core/src/main/scala/org/apache/spark/executor/Executor.scala:455: error: ambiguous reference to overloaded definition,
> method limit in class ByteBuffer of type (x$1: Int)java.nio.ByteBuffer
> method limit in class Buffer of type ()Int
> match expected type ?
>      val resultSize = serializedDirectResult.limit
> {code}
> This is because JDK 9 adds a covariant override of the limit(int) setter in ByteBuffer, so the setter and the parameterless limit() getter inherited from Buffer now overload the same name; the bare reference serializedDirectResult.limit is ambiguous to scalac and has to be called with an explicit empty argument list. The same applies to the position method.
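> A minimal sketch of the fix (self-contained; nothing beyond java.nio assumed): write the explicit empty parameter list, which selects the parameterless getter on both JDK 8 and JDK 9:
> {code}
> import java.nio.ByteBuffer
>
> val serializedDirectResult = ByteBuffer.allocate(64)  // stand-in for the real buffer
> // val resultSize = serializedDirectResult.limit      // ambiguous when compiled against JDK 9
> val resultSize: Int = serializedDirectResult.limit()  // compiles on JDK 8 and JDK 9
> {code}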
> #Error3
> {code}
> [error] /home/zly/prj/oss/jdk9_HOS_SOURCE/spark_source/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformationExec.scala:415: ambiguous reference to overloaded definition,
> [error] both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
> [error] and  method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
> [error] match argument types (java.util.Map[String,String])
> [error]     properties.putAll(propsMap.asJava)
> [error]                ^
> [error] /home/zly/prj/oss/jdk9_HOS_SOURCE/spark_source/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformationExec.scala:427: ambiguous reference to overloaded definition,
> [error] both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
> [error] and  method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
> [error] match argument types (java.util.Map[String,String])
> [error]       props.putAll(outputSerdeProps.toMap.asJava)
> [error]             ^
> {code}
> This is because Properties extends Hashtable[Object, Object], so its keys and values are typed as Object rather than String (which is unsafe), and JDK 9 declares an additional putAll override in Properties itself; scalac then sees the two putAll signatures as ambiguous overloads for a java.util.Map[String,String] argument.
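> A minimal sketch of a workaround (propsMap is a hypothetical stand-in for the real map): avoid the ambiguous putAll overloads by copying the entries through setProperty, which has a single String-typed signature:
> {code}
> import java.util.Properties
>
> val propsMap = Map("serialization.format" -> "1")  // hypothetical stand-in
> val properties = new Properties()
> // properties.putAll(propsMap.asJava)              // ambiguous under scala-2.12 on JDK 9
> propsMap.foreach { case (k, v) => properties.setProperty(k, v) }
> {code}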
> After fixing these three errors, the build compiles successfully.


