Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/15666#discussion_r143017806
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1845,6 +1859,21 @@ class SparkContext(config: SparkConf) extends Logging {
logInfo(s"Added JAR $path at $key with timestamp $timestamp")
postEnvironmentUpdate()
}
+
+    if (addToCurrentClassLoader) {
+      val currentCL = Utils.getContextOrSparkClassLoader
+      currentCL match {
+        case cl: MutableURLClassLoader =>
+          val uri = if (path.contains("\\")) {
+            // For local paths with backslashes on Windows, URI throws an exception
+            new File(path).toURI
+          } else {
+            new URI(path)
+          }
+          cl.addURL(uri.toURL)
+        case _ => logWarning(s"Unsupported cl $currentCL will not update jars thread cl")
--- End diff ---
I'd say `class loader` instead of `cl`.
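
For example, the fallback case could spell it out as below (just a wording sketch against this diff; `currentCL` comes from the code above, and the exact phrasing is of course up to you):

    case _ =>
      logWarning(s"Unsupported class loader $currentCL will not update jars in the thread class loader")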
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]