Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/19464#discussion_r144319503
--- Diff:
core/src/main/scala/org/apache/spark/internal/config/package.scala ---
@@ -270,6 +270,15 @@ package object config {
.longConf
.createWithDefault(4 * 1024 * 1024)
+ private [spark] val FILTER_OUT_EMPTY_SPLIT =
ConfigBuilder("spark.files.filterOutEmptySplit")
--- End diff ---
Nit: no space after private
This doc is much too verbose for a flag. Just say, "If true, methods that use
HadoopRDD and NewHadoopRDD, such as SparkContext.textFile, will not create a
partition for input splits that are empty."
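A minimal sketch of the definition with both suggestions applied (no space after `private`, concise doc). This assumes Spark's internal `ConfigBuilder` API; the flag type and default value are guesses for illustration, not taken from the PR:

```scala
// Sketch only: flag type (.booleanConf) and default (false) are assumptions.
private[spark] val FILTER_OUT_EMPTY_SPLIT =
  ConfigBuilder("spark.files.filterOutEmptySplit")
    .doc("If true, methods that use HadoopRDD and NewHadoopRDD, such as " +
      "SparkContext.textFile, will not create a partition for input splits " +
      "that are empty.")
    .booleanConf
    .createWithDefault(false)
```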
---