Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/22313#discussion_r214765115
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcFilters.scala ---
@@ -71,12 +71,24 @@ private[orc] object OrcFilters {
     for {
       // Combines all convertible filters using `And` to produce a single conjunction
-      conjunction <- convertibleFilters.reduceOption(org.apache.spark.sql.sources.And)
+      conjunction <- buildTree(convertibleFilters)
--- End diff ---
In Parquet, this is done as:
```
filters
  .flatMap(ParquetFilters.createFilter(requiredSchema, _))
  .reduceOption(FilterApi.and)
```
Can we follow the same approach here?
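For illustration only, here is a small self-contained sketch (using only the public `org.apache.spark.sql.sources.Filter` API) of the two combination strategies under discussion: the left-leaning `reduceOption(And)` fold that the Parquet path uses, and a balanced alternative along the lines of what a `buildTree`-style helper might produce. The `buildBalancedTree` helper and the object name are hypothetical and are not the code in this PR:
```
import org.apache.spark.sql.sources.{And, EqualTo, Filter, GreaterThan}

object FilterCombineSketch {
  // Left-leaning fold, mirroring the Parquet pattern above:
  // produces And(And(f1, f2), f3) for three filters.
  def combineWithReduce(filters: Seq[Filter]): Option[Filter] =
    filters.reduceOption(And)

  // Balanced alternative, roughly what a `buildTree`-style helper could do:
  // splits the sequence in half recursively so the resulting predicate tree
  // has depth O(log n) instead of O(n). Hypothetical illustration only.
  def buildBalancedTree(filters: Seq[Filter]): Option[Filter] = filters match {
    case Seq()  => None
    case Seq(f) => Some(f)
    case _ =>
      val (left, right) = filters.splitAt(filters.length / 2)
      for {
        l <- buildBalancedTree(left)
        r <- buildBalancedTree(right)
      } yield And(l, r)
  }

  def main(args: Array[String]): Unit = {
    val filters: Seq[Filter] = Seq(EqualTo("a", 1), GreaterThan("b", 2), EqualTo("c", 3))
    // And(And(EqualTo(a,1),GreaterThan(b,2)),EqualTo(c,3))
    println(combineWithReduce(filters))
    // And(EqualTo(a,1),And(GreaterThan(b,2),EqualTo(c,3)))
    println(buildBalancedTree(filters))
  }
}
```
The trade-off is tree depth: the fold nests the conjunction n-1 levels deep on one side, while the balanced version stays at roughly log2(n) levels, which matters when a very long list of pushed-down filters is converted.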
---