AngersZhuuuu commented on a change in pull request #24909: [SPARK-28106][SQL]
When Spark SQL use "add jar" , before add to SparkContext, check jar path
exist first.
URL: https://github.com/apache/spark/pull/24909#discussion_r302997487
##########
File path: core/src/main/scala/org/apache/spark/SparkContext.scala
##########
@@ -1792,12 +1792,36 @@ class SparkContext(config: SparkConf) extends Logging {
}
}
+  def addRemoteJarFile(path: String): String = {
+    val hadoopPath = new Path(path)
+    val scheme = new URI(path).getScheme
+    if (!Array("http", "https", "ftp").contains(scheme)) {
+      try {
+        val fs = hadoopPath.getFileSystem(hadoopConfiguration)
+        if (!fs.exists(hadoopPath)) {
+          throw new FileNotFoundException(s"Jar ${path} not found")
+        }
+        if (fs.isDirectory(hadoopPath)) {
+          throw new IllegalArgumentException(
+            s"Directory ${path} is not allowed for addJar")
+        }
+        path
+      } catch {
+        case NonFatal(e) =>
+          logError(s"Failed to add $path to Spark environment", e)
+          null
Review comment:
I am confused too. For a local file, it just continues with a warning and returns null, so the wrong path is never added to the jar path collection. That seems reasonable, so handling a DFS file the same way seems reasonable as well.
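For illustration, a minimal sketch of the pattern being described, assuming a hypothetical `addLocalJarFile` helper and a simplified `addJar` caller; this is not the actual SparkContext code, only the "log an error, return null, and skip registration" behaviour the comment refers to:

```scala
// Sketch only: hypothetical names, not the real SparkContext implementation.
import java.io.{File, FileNotFoundException}
import scala.collection.mutable
import scala.util.control.NonFatal

object AddJarSketch {
  // Local-file branch: validate the path, log and return null on failure.
  def addLocalJarFile(path: String): String = {
    try {
      val file = new File(path)
      if (!file.exists()) {
        throw new FileNotFoundException(s"Jar $path not found")
      }
      if (file.isDirectory) {
        throw new IllegalArgumentException(s"Directory $path is not allowed for addJar")
      }
      // Spark would return a file server URI here; the absolute path stands in for it.
      file.getAbsolutePath
    } catch {
      case NonFatal(e) =>
        // The behaviour discussed above: warn and return null instead of failing.
        println(s"Failed to add $path to Spark environment: ${e.getMessage}")
        null
    }
  }

  // Caller: only a non-null key ever reaches the jar collection.
  def addJar(path: String, addedJars: mutable.Set[String]): Unit = {
    val key = addLocalJarFile(path)
    if (key != null) {
      addedJars += key
    }
  }
}
```

Under that reading, returning null for a missing DFS jar keeps both branches consistent: the caller's null check is the single place that decides whether anything gets registered.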