[GitHub] [spark] cloud-fan commented on a change in pull request #26773: [SPARK-30126][CORE] sparkContext.addFile and sparkContext.addJar fails when file path contains spaces

2019-12-10 Thread GitBox
cloud-fan commented on a change in pull request #26773: [SPARK-30126][CORE] 
sparkContext.addFile and sparkContext.addJar fails when file path contains 
spaces
URL: https://github.com/apache/spark/pull/26773#discussion_r356439923
 
 

 ##
 File path: core/src/test/scala/org/apache/spark/SparkContextSuite.scala
 ##
 @@ -233,6 +233,42 @@ class SparkContextSuite extends SparkFunSuite with LocalSparkContext with Eventu
      }
    }
 
 +  test("SPARK-30126: addFile when file path contains spaces with recursive works") {
 +    withTempDir { dir =>
 +      try {
 +        val sep = File.separator
 +        val tmpDir = Utils.createTempDir(dir.getAbsolutePath + sep + "test space")
 +        val tmpConfFile1 = File.createTempFile("test", ".conf", tmpDir)
 
 Review comment:
  can we also add a space in the file name to prove it works?
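
A minimal sketch of the variation being asked for here, assuming the `withTempDir` and `Utils.createTempDir` helpers already used in this suite and an `sc` created earlier in the test body; the point is that the file name itself, not only the directory, carries a space:

    // Hypothetical fixture tweak (not code from the PR): put a space in the
    // file name as well as in the directory name.
    import java.io.File
    import org.apache.spark.util.Utils

    withTempDir { dir =>
      val sep = File.separator
      val tmpDir = Utils.createTempDir(dir.getAbsolutePath + sep + "test space")
      // A "test space" prefix yields a name such as "test space1234567890.conf".
      val tmpConfFile1 = File.createTempFile("test space", ".conf", tmpDir)
      sc.addFile(tmpDir.getAbsolutePath, true)
      // The test's existing assertions would then exercise a file name containing a space.
    }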


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] cloud-fan commented on a change in pull request #26773: [SPARK-30126][CORE]sparkContext.addFile and sparkContext.addJar fails when file path contains spaces

2019-12-10 Thread GitBox
cloud-fan commented on a change in pull request #26773: 
[SPARK-30126][CORE]sparkContext.addFile and sparkContext.addJar fails when file 
path contains spaces
URL: https://github.com/apache/spark/pull/26773#discussion_r356129876
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/SparkContext.scala
 ##
 @@ -1870,21 +1870,21 @@ class SparkContext(config: SparkConf) extends Logging {
        }
      }
 
 -    if (path == null) {
 -      logWarning("null specified as parameter to addJar")
 +    if (path == null || path.isEmpty) {
 +      logWarning("null or empty path specified as parameter to addJar")
      } else {
        val key = if (path.contains("\\")) {
          // For local paths with backslashes on Windows, URI throws an exception
          addLocalJarFile(new File(path))
        } else {
 -        val uri = new URI(path)
 +        val uri = new Path(path).toUri
          // SPARK-17650: Make sure this is a valid URL before adding it to the list of dependencies
          Utils.validateURL(uri)
          uri.getScheme match {
            // A JAR file which exists only on the driver node
            case null =>
              // SPARK-22585 path without schema is not url encoded
 -            addLocalJarFile(new File(uri.getRawPath))
 +            addLocalJarFile(new File(uri.getPath))
 
 Review comment:
   this change looks a little random to me: what's wrong with `getRawPath`?
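
For context on the question, a small illustration (not from the PR) of how the two accessors differ once the URI comes from a Hadoop `Path`, which percent-encodes characters such as spaces; the values in the comments are the expected results, and if they hold, `getPath` is what keeps `new File(...)` pointing at the decoded on-disk path:

    import java.net.URI
    import org.apache.hadoop.fs.Path

    // Hadoop's Path encodes the space, so the raw (still-encoded) path no
    // longer names a real local file, while the decoded path does.
    val uri: URI = new Path("/tmp/some dir/my.jar").toUri
    val raw = uri.getRawPath   // expected: "/tmp/some%20dir/my.jar" -- not an existing file
    val decoded = uri.getPath  // expected: "/tmp/some dir/my.jar"   -- usable with new File(...)

    // The previous code built the URI directly, which simply rejects a space:
    // new URI("/tmp/some dir/my.jar") throws URISyntaxException.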





[GitHub] [spark] cloud-fan commented on a change in pull request #26773: [SPARK-30126][CORE]sparkContext.addFile and sparkContext.addJar fails when file path contains spaces

2019-12-10 Thread GitBox
cloud-fan commented on a change in pull request #26773: 
[SPARK-30126][CORE]sparkContext.addFile and sparkContext.addJar fails when file 
path contains spaces
URL: https://github.com/apache/spark/pull/26773#discussion_r356129600
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/SparkContext.scala
 ##
 @@ -1870,21 +1870,21 @@ class SparkContext(config: SparkConf) extends Logging {
        }
      }
 
 -    if (path == null) {
 -      logWarning("null specified as parameter to addJar")
 +    if (path == null || path.isEmpty) {
 +      logWarning("null or empty path specified as parameter to addJar")
      } else {
        val key = if (path.contains("\\")) {
          // For local paths with backslashes on Windows, URI throws an exception
          addLocalJarFile(new File(path))
        } else {
 -        val uri = new URI(path)
 +        val uri = new Path(path).toUri
 
 Review comment:
   nit: `hadoopPath.toUri`
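
A sketch of one reading of the nit, where `hadoopPath` is the reviewer's suggested name rather than a value that exists in the PR:

    // Bind the Hadoop Path once and derive the URI from it.
    val hadoopPath = new Path(path)
    val uri = hadoopPath.toUri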





[GitHub] [spark] cloud-fan commented on a change in pull request #26773: [SPARK-30126][CORE]sparkContext.addFile and sparkContext.addJar fails when file path contains spaces

2019-12-10 Thread GitBox
cloud-fan commented on a change in pull request #26773: 
[SPARK-30126][CORE]sparkContext.addFile and sparkContext.addJar fails when file 
path contains spaces
URL: https://github.com/apache/spark/pull/26773#discussion_r356128107
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/SparkContext.scala
 ##
 @@ -1526,7 +1526,7 @@ class SparkContext(config: SparkConf) extends Logging {
    def addFile(path: String, recursive: Boolean): Unit = {
      val uri = new Path(path).toUri
      val schemeCorrectedPath = uri.getScheme match {
 
 Review comment:
  This path string is only used to create a Hadoop Path. Let me confirm: can we create a Hadoop Path from a URI directly?
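
For reference, Hadoop's `Path` exposes a constructor that takes a `java.net.URI` alongside the `String` one, so a sketch of the direct construction being asked about would be:

    import java.net.URI
    import org.apache.hadoop.fs.Path

    // Construct the Hadoop Path straight from the URI, skipping the
    // intermediate path string.
    val uri = new URI("file:///tmp/some%20dir")
    val hadoopPath = new Path(uri)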




