srowen commented on a change in pull request #24909: [SPARK-28106][SQL] When Spark SQL use "add jar" , before add to SparkContext, check jar path exist first.
URL: https://github.com/apache/spark/pull/24909#discussion_r303040492
##########
File path: core/src/test/scala/org/apache/spark/SparkContextSuite.scala
##########
@@ -165,6 +165,17 @@ class SparkContextSuite extends SparkFunSuite with LocalSparkContext with Eventu
}
}
+ test("add hdfs jar files not exists") {
+ try {
+ val jarPath = "hdfs:///no/path/to/TestUDTF.jar"
+ sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local"))
+ sc.addJar(jarPath)
+ assert(sc.listJars().filter(_.contains("TestUDTF.jar")).size == 0)
Review comment:
Nit: how about `.forall(j => !j.contains("TestUDTF.jar"))`? Or just check
`.filter(...).isEmpty` instead of `.size == 0`.
I guess this is about the best that can be done for a test without an FS to
test against. So the behavior change here is that the bad path isn't added.
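The three assertion styles under discussion are equivalent here; a minimal sketch (using a hypothetical `jars` sequence standing in for `sc.listJars()`, since the real test needs a SparkContext) shows all of them agreeing:

```scala
// Hypothetical stand-in for sc.listJars(): a jar list that does NOT
// contain the bad path, which is what the test asserts after addJar fails.
object JarCheckSketch extends App {
  val jars = Seq("spark://host:1234/jars/other.jar")

  // Original assertion in the diff: filter then compare size to zero.
  val viaSize   = jars.filter(_.contains("TestUDTF.jar")).size == 0
  // Suggested alternative 1: filter then isEmpty.
  val viaEmpty  = jars.filter(_.contains("TestUDTF.jar")).isEmpty
  // Suggested alternative 2: forall with a negated predicate (no intermediate collection).
  val viaForall = jars.forall(j => !j.contains("TestUDTF.jar"))

  println(viaSize && viaEmpty && viaForall)
}
```

`forall` avoids building the intermediate filtered collection, while `isEmpty` reads more directly than `size == 0`; either addresses the nit.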
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]