dbtsai commented on a change in pull request #28366:
URL: https://github.com/apache/spark/pull/28366#discussion_r418766235



##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -2063,16 +2063,17 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
-  val NESTED_PREDICATE_PUSHDOWN_ENABLED =
-    buildConf("spark.sql.optimizer.nestedPredicatePushdown.enabled")
-      .internal()
-      .doc("When true, Spark tries to push down predicates for nested columns and or names " +
-        "containing `dots` to data sources. Currently, Parquet implements both optimizations " +
-        "while ORC only supports predicates for names containing `dots`. The other data sources" +
-        "don't support this feature yet.")
+  val NESTED_PREDICATE_PUSHDOWN_V1_SOURCE_LIST =
+    buildConf("spark.sql.optimizer.nestedPredicatePushdown.v1sourceList")
+      .internal()
+      .doc("A comma-separated list of data source short names or fully qualified data source " +
+        "implementation class names for which Spark tries to push down predicates for nested " +
+        "columns and or names containing `dots` to data sources. Currently, Parquet implements " +
+        "both optimizations while ORC only supports predicates for names containing `dots`. The " +
+        "other data sources don't support this feature yet.")
       .version("3.0.0")
-      .booleanConf
-      .createWithDefault(true)
+      .stringConf

Review comment:
       In v1, we don't have any use case for supporting it in custom data sources. I'm good with it.
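
For illustration only (not the PR's actual implementation), a minimal sketch of how a comma-separated v1 source list like this conf might be matched against a data source name; `NestedPushdownHelper` and `pushdownEnabledFor` are hypothetical names:

```scala
// Hypothetical sketch: checking whether a data source appears in a
// comma-separated conf value such as
// spark.sql.optimizer.nestedPredicatePushdown.v1sourceList.
object NestedPushdownHelper {
  // `confValue` is the comma-separated list from the conf;
  // `source` is a short name ("parquet") or a fully qualified class name.
  // Matching is case-insensitive and tolerant of surrounding whitespace.
  def pushdownEnabledFor(confValue: String, source: String): Boolean =
    confValue.split(",")
      .map(_.trim.toLowerCase)
      .contains(source.trim.toLowerCase)
}
```

With a conf value of "parquet,orc", `pushdownEnabledFor("parquet,orc", "Parquet")` returns true, while a custom v1 source not in the list would be skipped.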



