Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20851#discussion_r176762766
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
    @@ -353,6 +353,13 @@ object SQLConf {
         .booleanConf
         .createWithDefault(true)
    
    +  val PARQUET_FILTER_PUSHDOWN_DATE_ENABLED = buildConf("spark.sql.parquet.filterPushdown.date")
    +    .doc("If true, enables Parquet filter push-down optimization for Date. " +
    +      "This configuration only has an effect when 'spark.sql.parquet.filterPushdown' is enabled.")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
    --- End diff --
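    
    A side note on how an entry like this is typically consumed: per the `.doc()` text, the date-specific flag only takes effect when the parent flag is on. A minimal sketch of that gating, with a hypothetical helper name (this is not code from the PR, and `parquetFilterPushDown` is assumed to be the existing accessor for the parent flag):
    
    ```scala
    import org.apache.spark.sql.internal.SQLConf
    
    // Hypothetical helper: date pushdown is only considered when the parent
    // `spark.sql.parquet.filterPushdown` flag is also enabled.
    def shouldPushDownDateFilters(conf: SQLConf): Boolean =
      conf.parquetFilterPushDown && conf.getConf(SQLConf.PARQUET_FILTER_PUSHDOWN_DATE_ENABLED)
    ```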
    
    @yucai. The reason is that `spark.sql.orc.filterPushdown` is still `false` in Spark 2.3 while `spark.sql.parquet.filterPushdown` is `true`. We don't know whether this is safe or not.
    
    Anyway, we have six or more months until Apache Spark 2.4. We can enable this in the `master` branch temporarily, for testing purposes only, and disable it at the last moment of the 2.4 release, as we did with the ORC conf, if any issue shows up.
    
    BTW, have you used this heavily in production at your company?
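    
    For anyone trying this out on `master`, a minimal spark-shell sketch of toggling both flags (the conf key is taken from the diff above; the path and column name are placeholders):
    
    ```scala
    import java.sql.Date
    
    // Both flags must be on: the new date pushdown is gated behind the
    // general Parquet pushdown switch.
    spark.conf.set("spark.sql.parquet.filterPushdown", "true")
    spark.conf.set("spark.sql.parquet.filterPushdown.date", "true")
    
    // Placeholder path and column; with both flags enabled, the date predicate
    // should show up under PushedFilters in the physical plan.
    val df = spark.read.parquet("/tmp/dates.parquet")
    df.filter(df("d") === Date.valueOf("2018-03-23")).explain()
    ```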

