Github user xuanyuanking commented on a diff in the pull request:
https://github.com/apache/spark/pull/21533#discussion_r195132870
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1517,9 +1517,12 @@ class SparkContext(config: SparkConf) extends Logging {
* only supported for Hadoop-supported filesystems.
*/
def addFile(path: String, recursive: Boolean): Unit = {
- val uri = new Path(path).toUri
+ var uri = new Path(path).toUri
val schemeCorrectedPath = uri.getScheme match {
- case null | "local" => new File(path).getCanonicalFile.toURI.toString
+ case null | "local" =>
+   // SPARK-24195: Local is not a valid scheme for FileSystem, we should only keep path here.
+   uri = new Path(uri.getPath).toUri
--- End diff ---
Yes, as @felixcheung said, this is because we use `uri` later in
https://github.com/apache/spark/pull/21533/files/f922fd8c995164cada4a8b72e92c369a827def16#diff-364713d7776956cb8b0a771e9b62f82dR1557;
if the URI carries a `local` scheme, we get an exception there because `local` is
not a valid scheme for `FileSystem`.
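To illustrate the idea behind the fix, here is a minimal sketch using `java.net.URI` from the JDK instead of Hadoop's `Path` (an approximation, since Hadoop is not assumed on the classpath here; the file name is hypothetical). Rebuilding the URI from its path alone drops the `local` scheme that `FileSystem` would reject:

```scala
import java.net.URI

// Hypothetical input as it might be passed to SparkContext.addFile.
val original = new URI("local:/tmp/jars/dep.jar")

// The gist of SPARK-24195: reconstruct the URI from its path component only,
// discarding the "local" scheme that Hadoop's FileSystem does not recognize.
val stripped = new URI(null, null, original.getPath, null)

println(original.getScheme) // "local"
println(stripped.getScheme) // null (no scheme)
println(stripped.getPath)   // "/tmp/jars/dep.jar"
```

The actual patch does the equivalent with `new Path(uri.getPath).toUri`, so only the path survives the round trip.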
---