AngersZhuuuu commented on a change in pull request #24909: [SPARK-28106] check add jar path exist jar
URL: https://github.com/apache/spark/pull/24909#discussion_r295585522
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/SparkContext.scala
 ##########
 @@ -1799,6 +1799,20 @@ class SparkContext(config: SparkConf) extends Logging {
         // For local paths with backslashes on Windows, URI throws an exception
         addJarFile(new File(path))
       } else {
+        // Check that the jar path exists.
+        val uriPath = new Path(path).toUri
+        val schemeCorrectedPath = uriPath.getScheme match {
+          case null => new File(path).getCanonicalFile.toURI.toString
+          case "local" => "file:" + uriPath.getPath
+          case _ => path
+        }
+        val hadoopPath = new Path(schemeCorrectedPath)
+        val fs = hadoopPath.getFileSystem(hadoopConfiguration)
+        if (!fs.exists(hadoopPath)) {
+          throw new FileNotFoundException(s"Jar $schemeCorrectedPath not found")
+        }
 
 Review comment:
   > If anything, why not check this below?
   When we use the "ADD JAR" SQL command, it calls SessionResourceBuilder's addJar method, which in turn calls SparkContext's addJar method. It does happen that when we add a jar path with an HDFS scheme, the path is not checked. Maybe we could add this check in SessionResourceBuilder instead?
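   As a rough illustration of that suggestion, here is a minimal sketch that pulls the same existence check into a standalone helper which a resource loader could call before handing the path to SparkContext.addJar. The `JarPathCheck.assertJarExists` name and its placement are hypothetical, not the actual SessionResourceBuilder API.

```scala
// Sketch only: the same existence check as in the diff, factored into a helper
// that could be called before SparkContext.addJar. Names here are hypothetical.
import java.io.{File, FileNotFoundException}

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path

object JarPathCheck {
  def assertJarExists(path: String, hadoopConf: Configuration): Unit = {
    val uri = new Path(path).toUri
    // Mirror the scheme handling in the diff: no scheme means a local file,
    // "local" is rewritten to a file: URI, anything else (e.g. hdfs) is used as-is.
    val schemeCorrectedPath = uri.getScheme match {
      case null => new File(path).getCanonicalFile.toURI.toString
      case "local" => "file:" + uri.getPath
      case _ => path
    }
    val hadoopPath = new Path(schemeCorrectedPath)
    val fs = hadoopPath.getFileSystem(hadoopConf)
    if (!fs.exists(hadoopPath)) {
      throw new FileNotFoundException(s"Jar $schemeCorrectedPath not found")
    }
  }
}
```

   A caller would invoke it with the session's Hadoop configuration, e.g. `JarPathCheck.assertJarExists(path, sparkContext.hadoopConfiguration)`, so an invalid HDFS path fails fast when ADD JAR is executed.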
   
