07ARB commented on a change in pull request #26773:
[SPARK-30126][CORE] sparkContext.addFile and sparkContext.addJar fails when file path contains spaces
URL: https://github.com/apache/spark/pull/26773#discussion_r355591065
##########
File path: core/src/test/scala/org/apache/spark/SparkContextSuite.scala
##########
@@ -294,6 +322,20 @@ class SparkContextSuite extends SparkFunSuite with LocalSparkContext with Eventu
}
}
+ test("add jar when path contains spaces") {
+ withTempDir { dir =>
+ val sep = File.separator
+ val tmpDir = Utils.createTempDir(dir.getAbsolutePath + sep + "test
space")
+ val tmpJar = File.createTempFile("test", ".jar", tmpDir)
Review comment:
Same thing for addJar and listJars:
1. Previous addJar and listJars behaviour:
```
addJar   : sc.addJar("/Users/testdir/test3.jar")
listJars : sc.listJars
output   : res1: Seq[String] = Vector(file:/Users/testdir/test3.jar)
```
2. addJar and listJars behaviour after this PR:
```
addJar   : sc.addJar("/Users/test dir/test3.jar") - here the path contains a space
listJars : sc.listJars
output   : res1: Seq[String] = Vector(file:/Users/test%20dir/test3.jar) - the output will look like this (the same output format as the previous behaviour above)
```
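For context, a minimal sketch of how the truncated test above might continue. This is only an illustration, not the PR's actual code: it assumes it sits inside SparkContextSuite, where `sc` is the suite-managed SparkContext from LocalSparkContext, `withTempDir` comes from the test helpers, and the final assertion is an assumed check.
```
import java.io.File

import org.apache.spark.util.Utils

// Sketch only: assumes the surrounding SparkContextSuite provides `sc` and withTempDir.
test("add jar when path contains spaces") {
  withTempDir { dir =>
    val sep = File.separator
    // Nested temp dir whose name contains a space, e.g. ".../test space"
    val tmpDir = Utils.createTempDir(dir.getAbsolutePath + sep + "test space")
    val tmpJar = File.createTempFile("test", ".jar", tmpDir)

    sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local"))
    sc.addJar(tmpJar.getAbsolutePath)

    // The jar should be registered even though its path contains a space.
    assert(sc.listJars().size === 1)
  }
}
```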