Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5708#discussion_r32719742
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -926,7 +926,9 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
     // The call to new NewHadoopJob automatically adds security credentials to conf,
     // so we don't need to explicitly add them ourselves
     val job = new NewHadoopJob(conf)
-    NewFileInputFormat.addInputPath(job, new Path(path))
+    // Use addInputPaths so that newAPIHadoopFile aligns with hadoopFile in taking
+    // comma separated files as input. (see SPARK-7155)
+    NewFileInputFormat.addInputPaths(job, path)
--- End diff --
The problem is that the rest of the API already used `setInputPaths`, so one
or the other behavior really needed to change in order to fix that. I think the
logic was that nobody _should_ have been relying on anything but the method arg
to set the path. I personally think it's less confusing not to have two ways to
specify a path. At this point, though, I think it would need a very good reason
to change the behavior again, since it's no longer a question of fixing an
inconsistency.
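
For illustration, a minimal, hypothetical sketch of what the change means at the
call site (the local master, app name, object name, and the
`data/a.txt,data/b.txt` paths below are assumptions, not part of the PR): after
the switch to `addInputPaths`, `newAPIHadoopFile` treats a comma-separated path
string as multiple inputs, matching what `hadoopFile` already did.

```scala
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapred.TextInputFormat
import org.apache.hadoop.mapreduce.lib.input.{TextInputFormat => NewTextInputFormat}
import org.apache.spark.{SparkConf, SparkContext}

object CommaSeparatedInputExample {
  def main(args: Array[String]): Unit = {
    // Local SparkContext purely for illustration.
    val sc = new SparkContext(new SparkConf().setAppName("comma-paths").setMaster("local[*]"))

    // Hypothetical input files; the point is that "a.txt,b.txt" means two inputs, not one literal path.
    val paths = "data/a.txt,data/b.txt"

    // Old Hadoop API entry point: has always split on commas via setInputPaths.
    val oldApi = sc.hadoopFile[LongWritable, Text, TextInputFormat](paths)

    // New Hadoop API entry point: with addInputPaths (SPARK-7155) it now splits the same way.
    val newApi = sc.newAPIHadoopFile[LongWritable, Text, NewTextInputFormat](paths)

    println(s"old API records: ${oldApi.count()}, new API records: ${newApi.count()}")
    sc.stop()
  }
}
```

Both calls should then report the combined record count of the two files rather
than failing to resolve a single path literally named `data/a.txt,data/b.txt`.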