squito commented on a change in pull request #24057: [SPARK-26839][WIP][SQL] Work around classloader changes in Java 9 for Hive isolation
URL: https://github.com/apache/spark/pull/24057#discussion_r264309241
##########
File path: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
##########
@@ -329,10 +326,17 @@ private[spark] object HiveUtils extends Logging {
val classLoader = Utils.getContextOrSparkClassLoader
val jars = allJars(classLoader)
- if (jars.length == 0) {
- throw new IllegalArgumentException(
- "Unable to locate hive jars to connect to metastore. " +
- s"Please set ${HIVE_METASTORE_JARS.key}.")
+ if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)) {
+ // Do nothing. The system classloader is no longer a URLClassLoader in Java 9,
+ // so it won't match the case in allJars above. It no longer exposes URLs of
+ // the system classpath
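(A standalone sketch of the behaviour the new comment describes -- not code from the patch, and it assumes nothing beyond the JDK itself:)
```
import java.net.URLClassLoader

// On Java 8 the application classloader is a URLClassLoader, so a pattern match
// on it returns the classpath URLs. On Java 9+ the built-in loaders no longer
// extend URLClassLoader, so the match falls through and yields nothing -- which
// is why allJars() comes back empty there.
val loader = Thread.currentThread().getContextClassLoader
val jarUrls = loader match {
  case ucl: URLClassLoader => ucl.getURLs.toSeq
  case _                   => Seq.empty // what allJars ends up with on Java 9+
}
```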
Review comment:
I have something slightly different here -- admittedly I wasn't really sure whether this was the right thing to do; it was just a hack to let me get a little further, not principled at all.
```
// For java 11, allJars() does *not* get jars on the classpath added, so we add
// them here
val classPathJars = sys.props("java.class.path").split(":").map(new File(_).toURI().toURL())
logWarning(s"Classpath jars = ${classPathJars.mkString(",")}")
val jars = allJars(classLoader) ++ classPathJars
```
I think this will solve your HiveClientImpl issue ... but that doesn't mean it's the right thing to do :P
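For what it's worth (purely a sketch, not tested against this branch): splitting on a literal ":" is Unix-only, so a slightly more portable form of the same hack could use `File.pathSeparator`:
```
import java.io.File

// Same hack, but split on the platform path separator and skip empty entries,
// so it also behaves on Windows. Slots in where the split(":") line was above.
val classPathJars = sys.props("java.class.path")
  .split(File.pathSeparator)
  .filter(_.nonEmpty)
  .map(p => new File(p).toURI.toURL)
// val jars = allJars(classLoader) ++ classPathJars   (unchanged from above)
```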