[ https://issues.apache.org/jira/browse/SPARK-26839?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16777185#comment-16777185 ]
Sean Owen commented on SPARK-26839:
-----------------------------------
Is this the same error? I'm seeing this while running Hive tests on Java 11:
{code}
[ERROR] saveExternalTableWithSchemaAndQueryIt(org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite)  Time elapsed: 0.021 s <<< ERROR!
java.lang.IllegalArgumentException: Unable to locate hive jars to connect to metastore. Please set spark.sql.hive.metastore.jars.
	at org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.tearDown(JavaMetastoreDataSourcesSuite.java:92)
[INFO] Running org.apache.spark.sql.hive.JavaDataFrameSuite
15:55:50.365 WARN org.apache.spark.sql.execution.command.DropTableCommand: java.lang.IllegalArgumentException: Unable to locate hive jars to connect to metastore. Please set spark.sql.hive.metastore.jars.
java.lang.IllegalArgumentException: Unable to locate hive jars to connect to metastore. Please set spark.sql.hive.metastore.jars.
	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:335)
	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:295)
	at org.apache.spark.sql.hive.test.TestHiveExternalCatalog.$anonfun$client$1(TestHive.scala:85)
	at scala.Option.getOrElse(Option.scala:138)
	at org.apache.spark.sql.hive.test.TestHiveExternalCatalog.client$lzycompute(TestHive.scala:85)
	at org.apache.spark.sql.hive.test.TestHiveExternalCatalog.client(TestHive.scala:83)
	at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:217)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:99)
	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:217)
	at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.databaseExists(ExternalCatalogWithListener.scala:69)
	...
{code}
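For reference, the property the error message points at, {{spark.sql.hive.metastore.jars}}, is normally set together with the matching metastore version, e.g. in spark-defaults.conf. The path and version below are illustrative only, not taken from this report:

{code}
# Point Spark at an explicit set of Hive client jars instead of relying on
# the built-in jar discovery, which is what fails in the log above.
spark.sql.hive.metastore.version   2.3.7
spark.sql.hive.metastore.jars      /path/to/hive/lib/*
{code}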
> on JDK11, IsolatedClientLoader must be able to load java.sql classes
> --------------------------------------------------------------------
>
> Key: SPARK-26839
> URL: https://issues.apache.org/jira/browse/SPARK-26839
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Imran Rashid
> Priority: Major
>
> This might be very specific to my fork and the somewhat unusual system setup
> I'm working on; I haven't completely confirmed that yet, but I wanted to
> report it anyway in case anybody else sees this.
> When I try to do anything that touches the metastore on Java 11, I
> immediately get errors from IsolatedClientLoader saying it can't load
> anything in java.sql, e.g.:
> {noformat}
> scala> spark.sql("show tables").show()
> java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: java/sql/SQLTransientException when creating Hive client using classpath: file:/home/systest/jdk-11.0.2/, ...
> ...
> Caused by: java.lang.ClassNotFoundException: java.sql.SQLTransientException
> 	at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
> 	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:230)
> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:219)
> 	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
> {noformat}
> After a bit of debugging, I also discovered that the {{rootClassLoader}} is
> {{null}} in {{IsolatedClientLoader}}. I think this would work if either
> {{rootClassLoader}} could load those classes, or if {{isShared()}} were
> changed to allow any class starting with "java." (I'm not sure why it
> currently only allows "java.lang" and "java.net".)
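The isShared() relaxation suggested above can be sketched as follows. This is a simplified, hypothetical illustration, not the actual predicate in IsolatedClientLoader (the real one also shares Spark, Hadoop, and logging classes):

```scala
// On JDK 11, java.sql classes live in the java.sql module rather than in
// rt.jar, so an isolated URLClassLoader cannot resolve them on its own.
// Sharing the whole "java." namespace with the parent loader avoids the
// ClassNotFoundException, instead of whitelisting only java.lang and java.net.
def isSharedClass(name: String): Boolean =
  name.startsWith("java.") ||   // was: "java.lang." and "java.net." only
  name.startsWith("scala.")
```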
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)