Github user cloud-fan commented on the issue:

    https://github.com/apache/spark/pull/22597
  
    > In ParquetFilter, the way we test whether a predicate pushdown works is by 
removing that predicate from the Spark SQL physical plan and relying only on the 
reader to do the filtering.
    
    I haven't looked into it yet, but Parquet record-level filtering is disabled 
by default, so if we remove predicates from the Spark side, the result can be wrong 
even if the predicates are pushed down to Parquet.
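
    A minimal sketch of what this implies for such a test (config names taken from 
Spark's SQLConf; double-check them against the Spark version under test, this is 
not the PR's code):

    ```scala
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().master("local[1]").getOrCreate()

    // Pushdown alone only lets Parquet skip row groups via statistics; to make
    // the reader drop individual non-matching rows, record-level filtering has
    // to be turned on explicitly (it is off by default), and it only applies to
    // the non-vectorized (parquet-mr) read path.
    spark.conf.set("spark.sql.parquet.filterPushdown", "true")
    spark.conf.set("spark.sql.parquet.recordLevelFilter.enabled", "true")
    spark.conf.set("spark.sql.parquet.enableVectorizedReader", "false")
    ```

    Only with settings like these does stripping the predicate from the physical 
plan and checking the returned rows actually exercise the pushed-down filter; with 
record-level filtering off, the reader can return rows that fail the predicate, so 
the comparison may pass or fail for the wrong reason.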

